Reading a Confusion Matrix

Gabriel Warner
2 min read · Jul 15, 2021


The confusion matrix is a valuable way to see how well your model performs. It works by counting how many times the model predicts each category correctly and how many times it predicts incorrectly, then arranging those counts in a matrix so you can interpret them at a glance.

A confusion matrix is typically laid out with the predicted values on the x-axis and the actual values on the y-axis. It is important to note that this layout is not standardized, and the two axes may be switched in some presentations.

TN = True Negative: you predicted someone does not have covid-19 and they do not

FN = False Negative: you predicted that someone does not have covid-19 but they do

FP = False Positive: you predicted that someone has covid-19 but they do not

TP = True Positive: you predicted that someone has covid-19 and they do
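The four cells above can be tallied directly from a model's predictions. Here is a minimal sketch using made-up labels (1 = has covid-19, 0 = does not); the data is purely illustrative:

```python
# Hypothetical labels: 1 = has covid-19, 0 = does not.
actual    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

# Count each confusion-matrix cell by comparing actual vs. predicted.
tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # true positives
tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))  # true negatives
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # false positives
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))  # false negatives

print(tp, tn, fp, fn)  # 4 3 2 1
```

Every prediction falls into exactly one of the four cells, so the counts always sum to the total number of predictions.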

Calculations with the confusion matrix:

Recall:

Recall calculates, out of all the actual positive cases, how many we predicted correctly.
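In terms of the confusion-matrix cells, that is TP / (TP + FN). A quick sketch with hypothetical counts of 4 true positives and 1 false negative:

```python
# Recall = TP / (TP + FN): of all actual positives, the fraction we caught.
tp, fn = 4, 1  # hypothetical counts
recall = tp / (tp + fn)
print(recall)  # 0.8
```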

Precision:

Precision calculates, of all the cases we predicted as positive, how many are actually positive.
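In cell terms, that is TP / (TP + FP). A sketch with hypothetical counts of 4 true positives and 2 false positives:

```python
# Precision = TP / (TP + FP): of all predicted positives, the fraction
# that are truly positive.
tp, fp = 4, 2  # hypothetical counts
precision = tp / (tp + fp)
print(round(precision, 2))  # 0.67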

Accuracy:

Accuracy is, out of all cases, how many we predicted correctly, whether positive or negative.
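That corresponds to (TP + TN) / (TP + TN + FP + FN). A sketch with hypothetical counts:

```python
# Accuracy = (TP + TN) / total: the fraction of all predictions that
# were correct.
tp, tn, fp, fn = 4, 3, 2, 1  # hypothetical counts
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.7
```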

F1 Score:

The F1 score is the harmonic mean of precision and recall, combining both into a single number that is only high when both are high.
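The formula is F1 = 2 × (precision × recall) / (precision + recall). A sketch using hypothetical precision and recall values:

```python
# F1 = 2 * (precision * recall) / (precision + recall): the harmonic
# mean, which penalizes a large gap between the two metrics.
precision, recall = 4 / 6, 4 / 5  # hypothetical values
f1 = 2 * (precision * recall) / (precision + recall)
print(round(f1, 2))  # 0.73
```

Because it is a harmonic mean rather than an arithmetic one, a model with precision 1.0 but recall 0.1 still gets a low F1.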

Hopefully, after reading this blog you have a better understanding of what a confusion matrix is and what it can be useful for.

Next step: Understanding a three-class confusion matrix.
