Decision trees MCQs

1. Which of the following is the primary criterion for splitting nodes in a decision tree?
(A) Variance
(B) Standard deviation
(C) Gini index
(D) Entropy
Answer: (D). Explanation: Entropy is commonly used to determine how to split nodes, especially in classification tasks. It measures the impurity or disorder of a dataset (see the code sketch after question 9).

2. In a decision tree, what does a “leaf node” represent?
(A) The final prediction or output
(B) The splitting criterion
(C) A decision-making point
(D) A branching point
Answer: (A). Explanation: A leaf node contains the final class label or prediction for a given set of features.

3. Which algorithm is primarily used for creating decision trees in machine learning?
(A) K-Nearest Neighbors
(B) Random Forest
(C) Support Vector Machines
(D) ID3
Answer: (D). Explanation: ID3 (Iterative Dichotomiser 3) is a decision tree algorithm that uses entropy and information gain to split nodes.

4. Which of the following is true about the Gini index used in decision trees?
(A) It always results in an unbalanced tree.
(B) It measures the variance in a dataset.
(C) It is used only for regression tasks.
(D) It ranges from 0 to 1, with 0 indicating perfect purity.
Answer: (D). Explanation: The Gini index measures how often a randomly chosen element would be incorrectly classified. A Gini index of 0 indicates that all elements belong to a single class.

5. Which of the following is a disadvantage of decision trees?
(A) They are interpretable and easy to visualize.
(B) They can only be used for classification tasks.
(C) They tend to overfit the data if not properly pruned.
(D) They require a lot of computational resources.
Answer: (C). Explanation: Decision trees can easily overfit the data by learning overly specific patterns, which can be mitigated by techniques such as pruning.

6. What is “pruning” in the context of decision trees?
(A) Removing branches that do not provide significant information
(B) Adding more branches to the tree
(C) Combining the leaf nodes into a single node
(D) Increasing the depth of the tree
Answer: (A). Explanation: Pruning is a technique used to reduce the complexity of the tree by removing branches that provide little predictive power.

7. In a decision tree, which of the following is used to evaluate the quality of a split?
(A) Cost function
(B) Cross-validation score
(C) Mean squared error
(D) Entropy or Gini index
Answer: (D). Explanation: Both entropy (used in ID3) and the Gini index (used in CART) are common measures for evaluating the quality of a split in a decision tree (illustrated in the sketch after question 9).

8. Which of the following is an advantage of decision trees?
(A) They require a large amount of data to train effectively.
(B) They are difficult to interpret.
(C) They perform poorly with missing data.
(D) They can handle both numerical and categorical data.
Answer: (D). Explanation: Decision trees can work with both types of data, unlike many other algorithms that only support numerical features.

9. Which of the following methods can help prevent overfitting in decision trees?
(A) Increasing the depth of the tree
(B) Reducing the number of features
(C) Using pruning
(D) Increasing the number of leaf nodes
Answer: (C). Explanation: Pruning helps reduce overfitting by removing branches that capture noise in the training data.
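Questions 1, 3, 4, and 7 above all turn on the impurity measures used to score candidate splits. As a quick illustration, here is a minimal NumPy sketch of entropy, Gini impurity, and the information gain that ID3-style algorithms maximize; the helper names and the toy label arrays are hypothetical, written only for this example.

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label array: -sum(p * log2(p))."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def gini(y):
    """Gini impurity: 1 - sum(p^2); 0 means the node is perfectly pure."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - np.sum(p ** 2))

def information_gain(y, mask):
    """Entropy reduction achieved by splitting labels y with a boolean mask."""
    n = len(y)
    left, right = y[mask], y[~mask]
    children = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(y) - children

# A 50/50 node has maximal impurity: entropy 1.0 bit, Gini 0.5.
y = np.array([0, 0, 0, 1, 1, 1])
print(entropy(y), gini(y))

# A split that mostly separates the classes yields positive information gain.
mask = np.array([True, True, True, True, False, False])
print(information_gain(y, mask))
```

Scikit-learn exposes the same choice through the criterion parameter of DecisionTreeClassifier ('gini' or 'entropy'), which is the distinction question 7 points at.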
10. What is the main purpose of using a Random Forest algorithm in decision trees?
(A) To increase interpretability
(B) To create deeper trees
(C) To handle only numerical data
(D) To reduce the variance of decision trees
Answer: (D). Explanation: Random Forest is an ensemble method that uses multiple decision trees to reduce the variance and improve model performance compared to a single decision tree (see the sketch below).
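To make questions 5, 6, 9, and 10 concrete, here is a minimal scikit-learn sketch comparing an unpruned tree, a cost-complexity-pruned tree, and a Random Forest by cross-validated accuracy. The dataset is synthetic, and the hyperparameter values (ccp_alpha=0.01, n_estimators=100) are illustrative rather than tuned.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data stands in for a real dataset.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    # An unconstrained tree grows until its leaves are pure and tends to overfit.
    "unpruned tree": DecisionTreeClassifier(random_state=0),
    # ccp_alpha > 0 turns on cost-complexity pruning, trimming weak branches.
    "pruned tree": DecisionTreeClassifier(ccp_alpha=0.01, random_state=0),
    # Averaging many randomized trees reduces variance relative to one tree.
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

On typical runs the forest scores highest and the unpruned tree lowest, which is the variance-reduction effect question 10 describes; pruning usually lands the single tree somewhere in between.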