Confusion matrix MCQs

1. In a confusion matrix, what does a True Positive (TP) represent?

A. The number of instances incorrectly predicted as negative
B. The number of instances incorrectly predicted as positive
C. The number of instances correctly predicted as negative
D. The number of instances correctly predicted as positive

Answer: D


2. In a confusion matrix, what does a False Positive (FP) represent?

A. The number of instances incorrectly predicted as negative
B. The number of instances correctly predicted as positive
C. The number of instances incorrectly predicted as positive
D. The number of instances correctly predicted as negative

Answer: C
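
To see these four counts in action, here is a minimal sketch in plain Python (the labels are made up purely for illustration) that tallies TP, FP, TN, and FN by comparing true and predicted labels:

```python
# Made-up binary labels (1 = positive, 0 = negative), purely for illustration.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # correctly predicted positive
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # incorrectly predicted positive
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # correctly predicted negative
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # incorrectly predicted negative

print(tp, fp, tn, fn)  # 3 1 3 1
```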


3. Which of the following is a metric derived from the confusion matrix that is used to evaluate classification model performance?

A. Mean Squared Error (MSE)
B. Precision
C. Entropy
D. R-squared (R²)

Answer: B


4. In a binary classification problem, you have the following confusion matrix:

|                 | Predicted Positive | Predicted Negative |
|-----------------|--------------------|--------------------|
| Actual Positive | 50                 | 10                 |
| Actual Negative | 5                  | 100                |

What is the accuracy of the model?

A. 0.91
B. 0.93
C. 0.85
D. 0.92

Answer: A
(Accuracy = (TP + TN) / (TP + TN + FP + FN) = (50 + 100) / (50 + 10 + 5 + 100) = 150 / 165 ≈ 0.91)
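
As a quick arithmetic check, here is a minimal Python sketch (with the counts read directly off the matrix above) that reproduces this accuracy:

```python
# Counts read off the confusion matrix in question 4.
tp, fn, fp, tn = 50, 10, 5, 100

accuracy = (tp + tn) / (tp + tn + fp + fn)
print(round(accuracy, 2))  # 0.91
```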


5. What does False Negative (FN) indicate in a confusion matrix?

A. The number of negative instances that are correctly classified as negative
B. The number of positive instances that are incorrectly classified as negative
C. The number of positive instances that are correctly classified as positive
D. The number of negative instances that are incorrectly classified as positive

Answer: B


6. What is the formula for calculating Precision from a confusion matrix?

A. TP / (TP + FP)
B. TP / (TP + FN)
C. (TP + TN) / (TP + TN + FP + FN)
D. (TP + FP) / (TP + TN + FP + FN)

Answer: A
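
To make the formula concrete, here is a minimal Python sketch that reuses the counts from question 4 (TP = 50, FP = 5) purely for illustration:

```python
# Counts reused from question 4, purely for illustration.
tp, fp = 50, 5

precision = tp / (tp + fp)  # fraction of positive predictions that are correct
print(round(precision, 2))  # 0.91
```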


7. A confusion matrix has the following values:

|                 | Predicted Positive | Predicted Negative |
|-----------------|--------------------|--------------------|
| Actual Positive | 30                 | 5                  |
| Actual Negative | 2                  | 50                 |

What is the Recall (Sensitivity) of the model?

A. 0.75
B. 0.86
C. 0.88
D. 0.93

Answer: B
(Recall = TP / (TP + FN) = 30 / (30 + 5) = 30 / 35 ≈ 0.86)
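
A minimal Python sketch confirming the arithmetic, with the counts taken from the matrix above:

```python
# Counts read off the confusion matrix in question 7.
tp, fn = 30, 5

recall = tp / (tp + fn)  # fraction of actual positives that are found
print(round(recall, 2))  # 0.86
```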


8. In a confusion matrix, what does the term “True Negative” (TN) represent?

A. The number of instances that were correctly classified as positive
B. The number of instances that were incorrectly classified as positive
C. The number of instances that were correctly classified as negative
D. The number of instances that were incorrectly classified as negative

Answer: C


9. What metric is commonly used to evaluate the balance between Precision and Recall, and is derived from the confusion matrix?

A. F1 Score
B. Accuracy
C. Specificity
D. ROC-AUC

Answer: A
(F1 Score = 2 * (Precision * Recall) / (Precision + Recall))
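
As a worked example, here is a minimal Python sketch that computes the F1 Score from the counts in question 7 (reused purely for illustration):

```python
# Counts reused from question 7, purely for illustration.
tp, fp, fn = 30, 2, 5

precision = tp / (tp + fp)  # 30 / 32 = 0.9375
recall = tp / (tp + fn)     # 30 / 35 ≈ 0.8571
f1 = 2 * (precision * recall) / (precision + recall)
print(round(f1, 2))  # 0.9
```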


10. Which of the following is a correct interpretation of a confusion matrix with high True Negative (TN) and low False Positive (FP)?

A. The model is correctly identifying negative instances (high specificity)
B. The model is having difficulty identifying positive instances
C. The model is overfitting and incorrectly classifying positive instances
D. The model is ineffective in distinguishing negative instances

Answer: A


11. A confusion matrix has the following values:

|                 | Predicted Positive | Predicted Negative |
|-----------------|--------------------|--------------------|
| Actual Positive | 100                | 10                 |
| Actual Negative | 20                 | 150                |

What is the Specificity (True Negative Rate) of the model?

A. 0.88
B. 0.93
C. 0.90
D. 0.85

Answer: A
(Specificity = TN / (TN + FP) = 150 / (150 + 20) = 150 / 170 ≈ 0.88)
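
A minimal Python sketch confirming the arithmetic, with the counts taken from the matrix above:

```python
# Counts read off the confusion matrix in question 11.
tn, fp = 150, 20

specificity = tn / (tn + fp)  # fraction of actual negatives correctly rejected
print(round(specificity, 2))  # 0.88
```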


12. What is the relationship between Precision and False Positive (FP)?

A. Precision increases as FP increases
B. Precision decreases as FP increases
C. Precision is unaffected by FP
D. Precision decreases as FP decreases

Answer: B
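
A small Python sketch (with a made-up, fixed TP count) makes this relationship visible: holding TP constant, precision falls as FP grows:

```python
tp = 50  # made-up true-positive count, held fixed

for fp in (0, 10, 50):
    precision = tp / (tp + fp)
    print(fp, round(precision, 2))  # 0 1.0 / 10 0.83 / 50 0.5
```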


13. What is the primary purpose of the confusion matrix in evaluating classification models?

A. To calculate the speed of the model
B. To visualize the data distribution
C. To summarize the classification performance by showing correct and incorrect predictions
D. To estimate the memory usage of the model

Answer: C


14. What does the “False Positive Rate” (FPR) represent in a confusion matrix?

A. The proportion of positive instances correctly classified as positive
B. The proportion of negative instances incorrectly classified as positive
C. The proportion of negative instances correctly classified as negative
D. The proportion of positive instances incorrectly classified as negative

Answer: B
(FPR = FP / (FP + TN))
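
To tie this back to question 11, here is a minimal Python sketch (counts reused purely for illustration) showing that the FPR is the complement of specificity:

```python
# Counts reused from question 11, purely for illustration.
fp, tn = 20, 150

fpr = fp / (fp + tn)
print(round(fpr, 2))                  # 0.12
print(round(1 - tn / (tn + fp), 2))   # 0.12, i.e. FPR = 1 - specificity
```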


15. What is the formula for calculating Accuracy from a confusion matrix?

A. (TP + TN) / (TP + TN + FP + FN)
B. (TP + FN) / (TP + TN + FP + FN)
C. (TP + FP) / (TP + TN + FP + FN)
D. (TP + TN) / (FP + FN)

Answer: A
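
In practice these metrics are rarely computed by hand. Assuming scikit-learn is available, its confusion_matrix and accuracy_score helpers can cross-check the formula; note that for binary labels 0/1, scikit-learn lays the matrix out as [[TN, FP], [FN, TP]]:

```python
from sklearn.metrics import accuracy_score, confusion_matrix

# Made-up binary labels, purely for illustration.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print((tp + tn) / (tp + tn + fp + fn))  # 0.75
print(accuracy_score(y_true, y_pred))   # 0.75, matches the formula
```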
