1. What does Precision measure in a classification model?
A. The proportion of actual positive instances correctly predicted as positive
B. The proportion of actual negative instances correctly predicted as negative
C. The proportion of predicted positive instances that are actually positive
D. The proportion of predicted negative instances that are actually negative
Answer: C
2. Which of the following formulas represents Recall (Sensitivity)?
A. Recall = TP / (TP + FP)
B. Recall = TP / (TP + FN)
C. Recall = (TP + TN) / (TP + TN + FP + FN)
D. Recall = (TP + FN) / (FP + TN)
Answer: B
3. What is the F1-score in the context of classification evaluation?
A. The harmonic mean of accuracy and precision
B. The geometric mean of precision and recall
C. The average of precision and recall
D. The harmonic mean of precision and recall
Answer: D
4. Given the following confusion matrix, what is the Precision of the model?

| | Predicted Positive | Predicted Negative |
|---|---|---|
| Actual Positive | 70 | 30 |
| Actual Negative | 20 | 80 |
A. 0.7
B. 0.78
C. 0.85
D. 0.8
Answer: B
(Precision = TP / (TP + FP) = 70 / (70 + 20) = 70 / 90 ≈ 0.78)
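The arithmetic above can be checked in a few lines of plain Python (no ML library assumed; the variable names are ours):

```python
# Counts from the confusion matrix in the question
tp, fn = 70, 30   # actual positives split into predicted positive / negative
fp, tn = 20, 80   # actual negatives split into predicted positive / negative

precision = tp / (tp + fp)   # 70 / 90
recall = tp / (tp + fn)      # 70 / 100

print(round(precision, 2))   # 0.78
print(round(recall, 2))      # 0.7
```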
5. What is the primary goal of maximizing the Recall of a model?
A. To minimize the number of false positives
B. To minimize the number of false negatives
C. To maximize the model’s ability to identify positive cases
D. To improve the model’s accuracy
Answer: C
6. If a classification model has a Recall of 0.9 and Precision of 0.6, what would the F1-score be?
A. 0.72
B. 0.75
C. 0.80
D. 0.85
Answer: A
(F1 Score = 2 * (Precision * Recall) / (Precision + Recall) = 2 * (0.6 * 0.9) / (0.6 + 0.9) = 1.08 / 1.5 = 0.72)
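This calculation can be sketched as a small Python helper (the function name `f1_score` is ours, not a library API):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(0.6, 0.9), 2))   # 0.72
```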
7. Which of the following is true regarding Precision and Recall?
A. Precision and Recall are mutually exclusive; improving one worsens the other
B. Precision is related to the number of false positives, while Recall is related to the number of false negatives
C. Precision is based on the total number of correct predictions, and Recall is based on the total number of errors
D. Precision and Recall are always identical in a good classification model
Answer: B
8. A classification model predicts 80 positive instances, of which 60 are actually positive. What is the Precision of the model?
A. 0.5
B. 0.75
C. 0.8
D. 0.9
Answer: B
(Precision = TP / (TP + FP) = 60 / 80 = 0.75)
9. In the context of classification, which evaluation metric would you use if the cost of False Negatives is much higher than the cost of False Positives?
A. Accuracy
B. Recall
C. Precision
D. F1-score
Answer: B
(Recall is more important when you want to minimize False Negatives)
10. If a model has Precision = 0.8 and Recall = 0.4, which of the following statements is true?
A. The model is better at identifying positive instances than avoiding false positives
B. The model is performing poorly at both identifying positive instances and avoiding false positives
C. The model is doing well at avoiding false positives but struggling to identify all positive instances
D. The model is performing well on both precision and recall
Answer: C
11. What does an F1-score of 1 indicate?
A. The model has perfect precision but poor recall
B. The model has perfect recall but poor precision
C. The model has a perfect balance of precision and recall
D. The model is not able to classify any instances correctly
Answer: C
12. If the Recall of a model is 0.95, what does this mean in terms of False Negatives?
A. The model misses 5% of positive instances
B. The model correctly identifies 95% of negative instances
C. The model has 5% false positives
D. The model misses 95% of positive instances
Answer: A
(Recall = TP / (TP + FN), so a recall of 0.95 means that 5% of the positive instances are false negatives)
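A quick numeric illustration in Python (the count of 1000 actual positives is illustrative, not part of the question):

```python
# Illustrative count of actual positives
actual_positives = 1000
recall = 0.95

tp = round(recall * actual_positives)   # 950 correctly identified
fn = actual_positives - tp              # 50 missed: the false negatives

print(fn / actual_positives)            # 0.05, i.e. 5% of positives missed
```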
13. What happens to the F1-score if the Precision and Recall are equal?
A. The F1-score will be zero
B. The F1-score will be equal to Precision or Recall
C. The F1-score will be higher than Precision and Recall
D. The F1-score will depend on the number of false positives
Answer: B
(If Precision = Recall, then F1-score = Precision = Recall)
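This property of the harmonic mean is easy to verify numerically (the helper `f1_score` is our own name):

```python
def f1_score(p: float, r: float) -> float:
    return 2 * p * r / (p + r)

# When precision and recall are equal, F1 collapses to that shared value
for v in (0.3, 0.5, 0.9):
    print(round(f1_score(v, v), 2))   # 0.3, 0.5, 0.9
```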
14. In a situation where False Positives are less critical than False Negatives, which metric would be more important?
A. Precision
B. Recall
C. F1-score
D. Accuracy
Answer: B
(When False Positives are less critical, Recall is more important as it focuses on minimizing False Negatives)
15. Which of the following is the formula for F1-score?
A. (Precision + Recall) / 2
B. (Precision * Recall) / (Precision + Recall)
C. 2 * (Precision * Recall) / (Precision + Recall)
D. (Precision * Recall) / 2
Answer: C