10 Hot Decision Tree MCQs: Gain Ratio, Continuous Attributes & Tie-Breaking
✔ Scroll down and test yourself: each question is followed by its answer and a short explanation.
1. The root node in a decision tree is selected based on:
2. If a dataset has 100% identical attribute values for all samples but mixed labels, the information gain of any attribute will be:
3. In a two-class problem, Gini Index = 0.5 represents:
4. A pruned decision tree generally has:
5. In manual decision tree construction, if an attribute gives 0 information gain, what should you do?
6. In a decision tree, if a node contains only one sample, what is its entropy?
A) 0
B) 0.5
C) 1
D) Cannot be calculated
Answer: A
Explanation: A single sample belongs to a single class → node is perfectly pure → entropy = 0.
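A quick sketch (my own illustration, not part of the original question) makes the arithmetic concrete: with a single sample the only class probability is 1, and -1 · log2(1) = 0.

```python
# Illustrative sketch: Shannon entropy of a node from its class labels.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a collection of class labels."""
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

print(entropy(["yes"]))        # 0.0 -> single sample, perfectly pure node
print(entropy(["yes", "no"]))  # 1.0 -> 50/50 node, maximum impurity
```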
7. Which splitting criterion can be used for multi-class problems besides binary classification?
A) Gini Index
B) Entropy / Information Gain
C) Gain Ratio
D) All of the above
Answer: D
Explanation: All three criteria generalize beyond binary classification, since each is computed from the per-class probabilities.
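As a small sketch (again my own illustration, with made-up class counts), a three-class node drops straight into the same formulas:

```python
# Illustrative sketch: Gini and entropy on a 3-class node.
# Both are sums over per-class probabilities, so nothing changes beyond binary.
from collections import Counter
from math import log2

def class_probs(labels):
    n = len(labels)
    return [c / n for c in Counter(labels).values()]

def gini(labels):
    return 1 - sum(p ** 2 for p in class_probs(labels))

def entropy(labels):
    return sum(-p * log2(p) for p in class_probs(labels))

node = ["a"] * 4 + ["b"] * 3 + ["c"] * 3   # class counts 4/3/3
print(round(gini(node), 3))      # 0.66
print(round(entropy(node), 3))   # 1.571
```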
8. Which of the following is most likely to cause overfitting in a decision tree?
A) Shallow tree
B) Large minimum samples per leaf
C) Very deep tree with small leaves
D) Using pruning
Answer: C
Explanation: Deep trees with tiny leaves memorize training data → overfit → poor generalization.
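A minimal scikit-learn sketch (assuming scikit-learn is installed; the synthetic dataset and parameter values are just for illustration) typically shows the symptom: near-perfect training accuracy but a weaker test score for the unconstrained tree.

```python
# Illustrative sketch (requires scikit-learn): compare an unconstrained, very deep tree
# with a depth/leaf-size constrained one on the same noisy data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(max_depth=None, min_samples_leaf=1).fit(X_train, y_train)
pruned = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10).fit(X_train, y_train)

for name, tree in [("deep", deep), ("constrained", pruned)]:
    print(f"{name:>11}: train={tree.score(X_train, y_train):.2f}  test={tree.score(X_test, y_test):.2f}")
```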
9. In manual construction of a decision tree, what is the first step?
A) Calculate child node entropy
B) Select root attribute based on information gain
C) Split dataset randomly
D) Prune unnecessary branches
Answer: B
Explanation: The root is chosen to maximize information gain, which reduces the initial uncertainty the most.
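Here is a sketch of that first step on a toy table (the attributes and labels below are invented for illustration): compute the information gain of every candidate attribute and take the maximum as the root.

```python
# Illustrative sketch: choose the root attribute by maximum information gain.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Parent entropy minus the weighted entropy of the children created by splitting on attr."""
    n = len(labels)
    weighted = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        weighted += len(subset) / n * entropy(subset)
    return entropy(labels) - weighted

rows = [
    {"outlook": "sunny", "windy": "no"},
    {"outlook": "sunny", "windy": "yes"},
    {"outlook": "rain",  "windy": "no"},
    {"outlook": "rain",  "windy": "yes"},
]
labels = ["no", "no", "yes", "yes"]   # toy target

gains = {attr: info_gain(rows, labels, attr) for attr in ("outlook", "windy")}
print(gains, "-> root:", max(gains, key=gains.get))
```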
10. If a node’s children after a split all have entropy = 0.3 and the parent has entropy = 0.3, what does it indicate?
A) Maximum information gain
B) Node is pure
C) Overfitting
D) No information gain
Answer: D
Explanation: Information gain = parent entropy − weighted average child entropy = 0.3 − 0.3 = 0 → the split did not improve purity.
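Plugging the numbers into the formula (the 50/50 split proportions below are assumed; any weighting gives the same result because every child has entropy 0.3):

```python
# Illustrative sketch: information gain for the split described in the question.
parent_entropy = 0.3
child_entropies = [0.3, 0.3]
child_weights = [0.5, 0.5]   # assumed proportions; any weights summing to 1 give the same answer here

weighted_children = sum(w * h for w, h in zip(child_weights, child_entropies))
print(parent_entropy - weighted_children)   # 0.0 -> no information gain
```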