Matthews Correlation Coefficient Calculator
Enter confusion-matrix counts (TP, TN, FP, FN) to compute MCC, accuracy, precision, recall, and specificity.
Example
With TP = 50, TN = 45, FP = 5, FN = 10 (110 samples total), the calculator reports:
- MCC: 0.7303
- Accuracy: 86.36%
- Precision: 90.91%
- Recall (Sensitivity): 83.33%
- Specificity: 90.00%
- Interpretation: strong positive agreement.
How to Use This Calculator
- Enter confusion-matrix counts for a binary classifier.
- Review MCC alongside accuracy, precision, recall, and specificity.
- Interpret MCC to understand overall agreement between predictions and ground truth.
- Use the metrics to compare classifiers or tune decision thresholds (a threshold-sweep sketch follows this list).
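As a sketch of the threshold-tuning step, here is a hedged pure-Python sweep. The `mcc_from_counts` and `best_threshold` helpers and the sample scores/labels are illustrative assumptions, not part of this calculator:

```python
import math

def mcc_from_counts(tp, tn, fp, fn):
    # Matthews correlation coefficient; 0.0 when undefined (see FAQ below).
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def best_threshold(scores, labels):
    """Sweep candidate thresholds and return the one maximizing MCC."""
    best = (0.0, -1.0)  # (threshold, mcc)
    for t in sorted(set(scores)):
        tp = sum(s >= t and y for s, y in zip(scores, labels))
        fp = sum(s >= t and not y for s, y in zip(scores, labels))
        fn = sum(s < t and y for s, y in zip(scores, labels))
        tn = sum(s < t and not y for s, y in zip(scores, labels))
        m = mcc_from_counts(tp, tn, fp, fn)
        if m > best[1]:
            best = (t, m)
    return best

# Made-up classifier scores and ground-truth labels, for illustration only:
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
print(best_threshold(scores, labels))  # (0.4, 0.7071...)
```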
Formula
MCC = (TP · TN − FP · FN) / √[(TP + FP)(TP + FN)(TN + FP)(TN + FN)]
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
Specificity = TN / (TN + FP)
Accuracy = (TP + TN) / (TP + TN + FP + FN)
MCC ranges from −1 (total disagreement) to +1 (perfect classification), with 0 representing random performance.
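A minimal Python sketch of these formulas (a hypothetical helper, not the calculator's actual implementation), including the zero-denominator convention discussed in the FAQ below:

```python
import math

def binary_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    """All five metrics from binary confusion-matrix counts."""
    total = tp + tn + fp + fn
    # Product under the square root; zero if any factor is zero.
    product = (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    return {
        "mcc": (tp * tn - fp * fn) / math.sqrt(product) if product else 0.0,
        "accuracy": (tp + tn) / total,
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
        "specificity": tn / (tn + fp) if tn + fp else 0.0,
    }

# Reproduces the example above:
m = binary_metrics(tp=50, tn=45, fp=5, fn=10)
print(f"MCC={m['mcc']:.4f}, accuracy={m['accuracy']:.2%}")
# MCC=0.7303, accuracy=86.36%
```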
Full Description
The Matthews correlation coefficient (MCC) is a balanced metric for binary classification that accounts for all four cells of the confusion matrix. It is especially useful under class imbalance, where accuracy can be misleading. Higher MCC values indicate stronger agreement between predictions and actual labels.
Unlike precision or recall alone, MCC reflects model performance for both positive and negative classes simultaneously.
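A small illustration of that point, with made-up counts for an imbalanced test set:

```python
import math

# Hypothetical imbalanced test set: 10 positives, 90 negatives.
# A weak classifier that finds only 1 of the 10 positives:
tp, fn, fp, tn = 1, 9, 1, 89

accuracy = (tp + tn) / (tp + tn + fp + fn)          # 0.90 -- looks strong
denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
mcc = (tp * tn - fp * fn) / denom                   # ~0.19 -- reveals weakness

print(f"accuracy={accuracy:.2f}, mcc={mcc:.2f}")    # accuracy=0.90, mcc=0.19
```

Accuracy rewards the easy majority class, while MCC exposes the near-total failure on the positive class.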
Frequently Asked Questions
Can MCC be used with imbalanced datasets?
Yes. MCC is particularly robust to imbalance because it uses all confusion-matrix cells.
Why might MCC be zero?
A value near zero indicates predictions are no better than random guessing.
What if the denominator is zero?
When any factor in the denominator equals zero, MCC is mathematically undefined; this calculator reports zero for stability and flags the insufficient variation in predictions or labels.
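A hypothetical degenerate case: a model that predicts the negative class for every sample leaves the TP + FP factor at zero, so MCC is reported as zero under this convention:

```python
import math

# Made-up counts: all 100 samples of a 10/90 set predicted negative,
# so TP = FP = 0 and the (TP + FP) factor vanishes.
tp, fn, fp, tn = 0, 10, 0, 90

product = (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)  # 0
mcc = (tp * tn - fp * fn) / math.sqrt(product) if product else 0.0

print(mcc)  # 0.0 by convention, even though accuracy is 90%
```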
Is MCC suitable for multi-class problems?
MCC can be generalized, but this calculator focuses on binary classification.