ROC curves and AUC MCQs
January 8, 2026 / November 19, 2024, by u930973931_answers

1. What does the ROC Curve (Receiver Operating Characteristic Curve) visualize?
(A) The trade-off between True Positive Rate (TPR) and False Positive Rate (FPR)
(B) The relationship between Precision and Recall
(C) The correlation between actual values and predicted values
(D) The distribution of the predicted probabilities

2. What is the True Positive Rate (TPR) also known as?
(A) Specificity
(B) Precision
(C) Recall or Sensitivity
(D) False Negative Rate (FNR)

3. What does the False Positive Rate (FPR) represent in a confusion matrix?
(A) The proportion of negative instances incorrectly classified as positive
(B) The proportion of positive instances correctly classified as positive
(C) The proportion of negative instances correctly classified as negative
(D) The proportion of false positives relative to actual positives

4. Which of the following describes a perfect classifier in the context of an ROC curve?
(A) A curve passing through the origin (0, 0)
(B) A curve hugging the top-left corner
(C) A diagonal straight line
(D) Equal True Positive and False Positive Rates

5. What is the range of the Area Under the ROC Curve (AUC)?
(A) [0, 0.5]
(B) [0, ∞)
(C) (-∞, ∞)
(D) [0, 1]

6. What does an AUC value of 0.5 imply about a model?
(A) The model is randomly guessing
(B) The model performs very well
(C) The model is perfectly classifying instances
(D) The model cannot identify positives

7. Which of the following is TRUE about an ROC curve with a steep initial rise?
(A) Low TPR and high FPR
(B) Poor class discrimination
(C) High accuracy in early predictions
(D) Balanced precision and recall only

8. When comparing two models using ROC curves, which model is considered better?
(A) Curve closest to the bottom-right corner
(B) Model with the least area
(C) Curve closest to the bottom-left corner
(D) Model with the highest AUC

9.
What does a decreasing ROC curve indicate?
(A) Overfitting
(B) Poor model performance
(C) Excellent classification
(D) Underfitting

10. How is AUC interpreted when comparing classification models?
(A) AUC above 0.7 indicates overfitting
(B) Higher AUC indicates worse performance
(C) AUC is unreliable
(D) Higher AUC indicates better performance

11. Which statement about the ROC curve is TRUE?
(A) It plots TPR vs FPR
(B) It shows Precision vs Recall
(C) It plots FNR vs TPR
(D) It is only used for binary classification

12. What is the main purpose of the ROC curve in model evaluation?
(A) To visualize class discrimination ability
(B) To find the optimal threshold
(C) To measure computational efficiency
(D) To evaluate model fit

13. In ROC analysis, what does Sensitivity represent?
(A) Correctly identifying negatives
(B) Overall accuracy
(C) Proportion of false positives
(D) Proportion of true positives

14. Which of the following is FALSE regarding AUC?
(A) AUC measures class discrimination
(B) AUC can be used for multi-class problems
(C) AUC below 0.5 indicates better-than-random performance
(D) AUC is threshold-independent

15. If a model has an AUC of 0.85, what does this mean?
(A) The model ranks a random positive higher than a random negative 85% of the time
(B) The model correctly classifies 85% of instances
(C) The model has a 15% error rate
(D) The model is perfect
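The ideas these questions test (sweeping a threshold to trace TPR vs FPR, and interpreting AUC as the probability that a random positive is scored above a random negative, as in question 15) can be checked by hand. Below is a minimal sketch in plain Python on a hypothetical toy dataset (`y_true`, `y_score` are made-up values, not from any real model); a library such as scikit-learn would normally do this for you.

```python
# Hedged sketch: hand-rolled ROC points and AUC on a toy dataset.
# y_true holds binary labels; y_score holds hypothetical predicted scores.

def roc_points(y_true, y_score):
    """Sweep the decision threshold over every distinct score and
    record one (FPR, TPR) point per threshold."""
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = [(0.0, 0.0)]  # the curve always starts at the origin
    for thr in sorted(set(y_score), reverse=True):
        tp = sum(1 for y, s in zip(y_true, y_score) if y == 1 and s >= thr)
        fp = sum(1 for y, s in zip(y_true, y_score) if y == 0 and s >= thr)
        points.append((fp / neg, tp / pos))  # (FPR, TPR)
    return points

def auc_rank(y_true, y_score):
    """AUC as P(random positive is scored above a random negative);
    ties count as half a win (question 15's interpretation)."""
    pos_scores = [s for y, s in zip(y_true, y_score) if y == 1]
    neg_scores = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

y_true  = [1, 1, 1, 0, 0, 0]
y_score = [0.9, 0.8, 0.4, 0.7, 0.3, 0.1]

print(roc_points(y_true, y_score))   # ends at (1.0, 1.0)
print(auc_rank(y_true, y_score))     # 8/9 ≈ 0.8889
```

A curve that hugs the top-left corner (question 4) shows up here as points with high TPR at low FPR; a model that guesses randomly would land its points near the diagonal, giving an AUC near 0.5 (question 6).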