Confusion matrix MCQs

1. In a confusion matrix, what does a True Positive (TP) represent?
(A) The number of instances incorrectly predicted as negative
(B) The number of instances correctly predicted as positive
(C) The number of instances correctly predicted as negative
(D) The number of instances incorrectly predicted as positive

2. In a confusion matrix, what does a False Positive (FP) represent?
(A) The number of instances incorrectly predicted as negative
(B) The number of instances correctly predicted as positive
(C) The number of instances correctly predicted as negative
(D) The number of instances incorrectly predicted as positive

3. Which of the following is a metric derived from the confusion matrix used to evaluate classification performance?
(A) Mean Squared Error (MSE)
(B) F1 Score
(C) Entropy
(D) Precision

4. What does a False Negative (FN) indicate in a confusion matrix?
(A) Negative instances correctly classified as negative
(B) Negative instances incorrectly classified as positive
(C) Positive instances correctly classified as positive
(D) Positive instances incorrectly classified as negative

5. What is the formula for calculating Precision from a confusion matrix?
(A) TP / (TP + FN)
(B) TP / (TP + FP)
(C) (TP + TN) / (TP + TN + FP + FN)
(D) (TP + FP) / (TP + TN + FP + FN)

6. In a confusion matrix, what does the term "True Negative" (TN) represent?
(A) Instances correctly classified as negative
(B) Instances incorrectly classified as positive
(C) Instances correctly classified as positive
(D) Instances incorrectly classified as negative

7. Which metric evaluates the balance between Precision and Recall and is derived from the confusion matrix?
(A) Specificity
(B) Accuracy
(C) F1 Score
(D) ROC-AUC

8. What does a confusion matrix with a high True Negative (TN) count and a low False Positive (FP) count indicate?
(A) The model correctly identifies negative instances
(B) The model struggles to identify positives
(C) The model is overfitting
(D) The model is ineffective

9. What is the relationship between Precision and False Positives (FP)?
(A) Precision increases as FP increases
(B) Precision is unaffected by FP
(C) Precision decreases as FP increases
(D) Precision increases as FP decreases

10. What is the primary purpose of the confusion matrix?
(A) To calculate model speed
(B) To visualize data distribution
(C) To summarize correct and incorrect predictions
(D) To estimate memory usage

11. What does the False Positive Rate (FPR) represent?
(A) Proportion of negatives incorrectly classified as positive
(B) Proportion of positives correctly classified
(C) Proportion of negatives correctly classified
(D) Proportion of positives incorrectly classified

12. What is the formula for calculating Accuracy from a confusion matrix?
(A) (TP + FP) / (TP + TN + FP + FN)
(B) (TP + FN) / (TP + TN + FP + FN)
(C) (TP + TN) / (TP + TN + FP + FN)
(D) (TP + TN) / (FP + FN)
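The counts and formulas quizzed above can be checked with a short sketch. The snippet below (function names and example labels are illustrative, not taken from the quiz) tallies TP, FP, TN, and FN for a binary task and derives Precision, Recall, Accuracy, F1, and the False Positive Rate from them:

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Count TP, FP, TN, FN for binary labels."""
    tp = fp = tn = fn = 0
    for actual, predicted in zip(y_true, y_pred):
        if predicted == positive:
            if actual == positive:
                tp += 1  # correctly predicted positive
            else:
                fp += 1  # negative wrongly predicted as positive
        else:
            if actual == positive:
                fn += 1  # positive wrongly predicted as negative
            else:
                tn += 1  # correctly predicted negative
    return tp, fp, tn, fn


def derived_metrics(tp, fp, tn, fn):
    precision = tp / (tp + fp)                          # Q5
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + tn + fp + fn)          # Q12
    f1 = 2 * precision * recall / (precision + recall)  # Q7
    fpr = fp / (fp + tn)                                # Q11
    return {"precision": precision, "recall": recall,
            "accuracy": accuracy, "f1": f1, "fpr": fpr}


# Illustrative labels: 4 positives, 6 negatives
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

tp, fp, tn, fn = confusion_counts(y_true, y_pred)
print(tp, fp, tn, fn)                 # 3 1 5 1
print(derived_metrics(tp, fp, tn, fn))
```

Note how the example reflects question 9: with one false positive, Precision drops to 3 / (3 + 1) = 0.75, and it would rise back toward 1 as FP falls toward 0.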