Precision, Recall, and F1-score MCQs

1. What does Precision measure in a classification model?
(A) The proportion of predicted positive instances that are actually positive
(B) The proportion of actual negative instances correctly predicted as negative
(C) The proportion of actual positive instances correctly predicted as positive
(D) The proportion of predicted negative instances that are actually negative

2. Which of the following formulas represents Recall (Sensitivity)?
(A) Recall = TP / (TP + FP)
(B) Recall = TP / (TP + FN)
(C) Recall = (TP + TN) / (TP + TN + FP + FN)
(D) Recall = (TP + FN) / (FP + TN)

3. What is the F1-score in the context of classification evaluation?
(A) The harmonic mean of accuracy and precision
(B) The geometric mean of precision and recall
(C) The harmonic mean of precision and recall
(D) The average of precision and recall

4. What is the primary goal of maximizing the Recall of a model?
(A) To minimize the number of false positives
(B) To minimize the number of false negatives
(C) To maximize the model’s ability to identify positive cases
(D) To improve the model’s accuracy

5. If a classification model has Recall = 0.9 and Precision = 0.6, what is the F1-score?
(A) 0.75
(B) 0.72
(C) 0.80
(D) 0.85

6. Which of the following is TRUE regarding Precision and Recall?
(A) Precision and Recall are mutually exclusive
(B) Precision is based on total correct predictions
(C) Precision relates to false positives, while Recall relates to false negatives
(D) Precision and Recall are always identical

7. A classification model predicts 80 positive instances, of which 60 are actually positive. What is the Precision?
(A) 0.5
(B) 0.9
(C) 0.8
(D) 0.75

8. Which evaluation metric should be used when the cost of False Negatives is much higher than that of False Positives?
(A) Recall
(B) Accuracy
(C) Precision
(D) F1-score

9. If a model has Precision = 0.8 and Recall = 0.4, which statement is TRUE?
(A) The model is good at avoiding false positives but weak at identifying positives
(B) The model is poor at both tasks
(C) The model is better at identifying positives than at avoiding false positives
(D) The model is excellent at both precision and recall

10. What does an F1-score of 1 indicate?
(A) Perfect precision and recall (both equal to 1)
(B) Perfect recall only
(C) Perfect precision only
(D) No correct classifications

11. If the Recall of a model is 0.95, what does this mean?
(A) The model correctly identifies 95% of negatives
(B) The model misses 5% of positive instances
(C) The model has 5% false positives
(D) The model misses 95% of positives

12. What happens to the F1-score if Precision and Recall are equal?
(A) F1-score becomes zero
(B) F1-score equals Precision and Recall
(C) F1-score becomes higher
(D) F1-score depends on false positives

13. When False Positives are less critical than False Negatives, which metric is more important?
(A) Precision
(B) Recall
(C) F1-score
(D) Accuracy

14. Which of the following is the correct formula for the F1-score?
(A) (Precision + Recall) / 2
(B) (Precision × Recall) / (Precision + Recall)
(C) 2 × (Precision × Recall) / (Precision + Recall)
(D) (Precision × Recall) / 2
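
Not part of the original quiz: the sketch below writes out the formulas the questions refer to as plain Python, assuming the usual TP/FP/FN confusion-matrix counts. The helper names precision, recall, and f1 are illustrative, and the script double-checks the arithmetic behind questions 5, 7, and 12.

```python
# Minimal sketch of the metrics referenced in the quiz.
# Function names are illustrative, not taken from the original post.

def precision(tp: int, fp: int) -> float:
    """Fraction of predicted positives that are actually positive."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of actual positives the model correctly identifies."""
    return tp / (tp + fn)

def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Question 5: Precision = 0.6, Recall = 0.9
print(round(f1(0.6, 0.9), 2))        # 0.72 -> option (B)

# Question 7: 80 predicted positives, 60 of them truly positive
print(precision(tp=60, fp=20))       # 0.75 -> option (D)

# Question 12: when Precision == Recall, F1 equals both
print(round(f1(0.8, 0.8), 2))        # 0.8  -> option (B)
```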
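
For readers who prefer to verify answers against a library, here is a sketch using scikit-learn's precision_score, recall_score, and f1_score; the y_true and y_pred label vectors are toy data invented for illustration, not taken from the quiz.

```python
# Optional cross-check with scikit-learn (assumes scikit-learn is installed).
from sklearn.metrics import precision_score, recall_score, f1_score

# Toy labels: 3 true positives, 1 false negative, 1 false positive, 3 true negatives.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0]

print(precision_score(y_true, y_pred))  # TP / (TP + FP) = 3 / 4 = 0.75
print(recall_score(y_true, y_pred))     # TP / (TP + FN) = 3 / 4 = 0.75
print(f1_score(y_true, y_pred))         # harmonic mean of the two = 0.75
```

By default these three functions score the binary positive class labelled 1, which matches the TP/FP/FN framing used throughout the questions above.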