Decision trees MCQs

1. Which of the following is the primary criterion for splitting nodes in a decision tree?
Explanation: Entropy is commonly used to decide how to split nodes in classification trees. It measures the impurity (disorder) of the class labels at a node; splits are chosen so that entropy drops as much as possible, i.e. so that information gain is maximized.
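
For illustration, a minimal Python sketch of the entropy calculation (the function name and the toy labels below are just examples):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H = sum(-p_i * log2(p_i)) over the class proportions."""
    total = len(labels)
    return sum(-c / total * log2(c / total) for c in Counter(labels).values())

print(entropy(["yes", "yes", "yes"]))       # 0.0 -- a pure node
print(entropy(["yes", "no", "yes", "no"]))  # 1.0 -- a 50/50 node
```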






2. In a decision tree, what does a “leaf node” represent?
Explanation: A leaf (terminal) node has no further splits. It stores the final prediction returned for every example that reaches it: a class label in classification, or a numeric value in regression.
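
To make this concrete, here is a minimal, hypothetical node structure; the field names (`feature`, `threshold`, `prediction`) are illustrative and do not come from any particular library:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    # Internal nodes carry a split; leaf nodes carry only a prediction.
    feature: Optional[int] = None      # index of the feature to test (internal nodes)
    threshold: Optional[float] = None  # split threshold (internal nodes)
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prediction: Optional[str] = None   # class label stored at a leaf

def predict(node: Node, x):
    """Walk from the root down to a leaf and return the label stored there."""
    while node.prediction is None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.prediction

# Tiny hand-built tree: "is feature 0 <= 2.5?"
tree = Node(feature=0, threshold=2.5,
            left=Node(prediction="class_A"),
            right=Node(prediction="class_B"))
print(predict(tree, [1.0]))  # class_A
print(predict(tree, [4.2]))  # class_B
```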






3. Which algorithm is primarily used for creating decision trees in machine learning?
Explanation: ID3 (Iterative Dichotomiser 3) is a classic decision tree algorithm that uses entropy and information gain to split nodes, choosing at each node the feature whose split yields the highest information gain.
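
A rough sketch of the information-gain computation that ID3 performs at each node (this is only the gain calculation, not the full recursive algorithm; the toy weather data is invented for illustration):

```python
from collections import Counter, defaultdict
from math import log2

def entropy(labels):
    total = len(labels)
    return sum(-c / total * log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """IG = H(parent) - sum_v (|S_v| / |S|) * H(S_v) for a categorical feature."""
    parent = entropy(labels)
    groups = defaultdict(list)
    for row, label in zip(rows, labels):
        groups[row[feature]].append(label)
    weighted = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return parent - weighted

# ID3 picks the feature with the highest gain at each node.
rows = [{"outlook": "sunny"}, {"outlook": "sunny"}, {"outlook": "rain"}, {"outlook": "rain"}]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, labels, "outlook"))  # 1.0 -- a perfect split
```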






4. Which of the following is true about the Gini index used in decision trees?
Explanation: The Gini index measures how often a randomly chosen element would be misclassified if it were labeled according to the class distribution at the node. A Gini index of 0 means the node is pure, i.e. all elements belong to a single class.
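
The usual formula is Gini = 1 - sum(p_i^2) over the class proportions p_i. A small sketch:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: G = 1 - sum(p_i ** 2).  0 means the node is pure."""
    total = len(labels)
    return 1.0 - sum((c / total) ** 2 for c in Counter(labels).values())

print(gini(["a", "a", "a", "a"]))  # 0.0 -- all one class
print(gini(["a", "a", "b", "b"]))  # 0.5 -- maximally mixed for two classes
```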






5. Which of the following is a disadvantage of decision trees?
Explanation: Decision trees can easily overfit the data by learning overly specific patterns, which can be mitigated by techniques such as pruning.
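
A quick way to see this in practice, assuming scikit-learn is available (the quiz itself does not name a library):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=0)  # deliberately noisy labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)  # grown until leaves are pure
print("train:", deep.score(X_tr, y_tr), "test:", deep.score(X_te, y_te))
# Typically near-perfect training accuracy but a noticeably lower test score,
# which is the overfitting the question refers to.
```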






6. What is “pruning” in the context of decision trees?
Explanation: Pruning is a technique used to reduce the complexity of the tree by removing branches that provide little predictive power.
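
One concrete form of post-pruning is scikit-learn's cost-complexity pruning via the `ccp_alpha` parameter; the value 0.01 below is only an illustrative choice:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(X_tr, y_tr)

# Pruning removes low-value branches, so the tree shrinks; test accuracy often improves.
print("leaves:", unpruned.get_n_leaves(), "->", pruned.get_n_leaves())
print("test accuracy:", unpruned.score(X_te, y_te), "->", pruned.score(X_te, y_te))
```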






7. In a decision tree, which of the following is used to evaluate the quality of a split?
Explanation: Both entropy (used in ID3) and Gini index (used in CART) are common measures to evaluate the quality of a split in a decision tree.
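
In scikit-learn both measures are exposed through the `criterion` parameter, so the two can be compared directly (a sketch, using the iris dataset as a stand-in):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, random_state=0)
    print(criterion, cross_val_score(tree, X, y, cv=5).mean())
# The two criteria usually produce very similar trees on well-behaved data.
```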






8. Which of the following is an advantage of decision trees?
Explanation: Decision trees can work with both numerical and categorical features, unlike many algorithms that require purely numerical input.
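
Note that the decision-tree model handles categorical splits conceptually (as ID3 and C4.5 do); scikit-learn's implementation expects numeric input, so categorical columns are usually encoded first. A sketch with an invented toy DataFrame:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

df = pd.DataFrame({
    "age": [22, 35, 47, 52],           # numerical feature
    "city": ["NY", "LA", "NY", "SF"],  # categorical feature
    "bought": [0, 1, 1, 0],
})
X = pd.get_dummies(df[["age", "city"]])  # one-hot encode the categorical column
y = df["bought"]

clf = DecisionTreeClassifier(random_state=0).fit(X, y)
print(clf.predict(X))
```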






9. Which of the following methods can help prevent overfitting in decision trees?
Explanation: Pruning helps in reducing overfitting by removing branches that capture noise in the training data.
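
Besides post-pruning, pre-pruning hyperparameters such as `max_depth` and `min_samples_leaf` limit tree growth up front; the values below are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, flip_y=0.2, random_state=0)

full = DecisionTreeClassifier(random_state=0)                       # no growth limits
limited = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10,  # constrained tree
                                 random_state=0)
for name, model in [("full", full), ("limited", limited)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
# The constrained tree usually cross-validates better on noisy data.
```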






10. What is the main purpose of using the Random Forest algorithm rather than a single decision tree?
Explanation: Random Forest is an ensemble method that uses multiple decision trees to reduce the variance and improve model performance compared to a single decision tree.
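
A minimal comparison sketch, assuming scikit-learn; the dataset and `n_estimators=200` are illustrative choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, flip_y=0.2, random_state=0)

single = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0)  # 200 bootstrapped trees

print("single tree  :", cross_val_score(single, X, y, cv=5).mean())
print("random forest:", cross_val_score(forest, X, y, cv=5).mean())
# Averaging many de-correlated trees reduces variance, which is the point of the ensemble.
```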





