confusion_matrix()
audplot.confusion_matrix(truth, prediction, *, labels=None, label_aliases=None, percentage=False, show_both=False, ax=None)

Confusion matrix between ground truth and prediction.
The confusion matrix is calculated by audmetric.confusion_matrix.

Parameters
- labels (Optional[Sequence]) – labels to be included in the confusion matrix
- label_aliases (Optional[Dict]) – mapping to alias names for labels to be presented in the plot
- percentage (bool) – if True, present the confusion matrix with percentage values instead of absolute numbers
- show_both (bool) – if True and percentage is True, it shows absolute numbers in brackets below percentage values. If True and percentage is False, it shows percentages in brackets below absolute numbers
- ax (Optional[Axes]) – pre-existing axes for the plot. Otherwise, calls matplotlib.pyplot.gca() internally
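To make the percentage option concrete, here is a minimal stdlib-only sketch of the underlying computation: absolute confusion counts plus the row-wise normalization that percentage=True presumably applies (per-truth-row normalization is an assumption here; the actual values are computed by audmetric.confusion_matrix, not by this sketch).

```python
from collections import Counter


def confusion_counts(truth, prediction, labels):
    """Absolute confusion counts: rows are truth labels, columns predictions."""
    pairs = Counter(zip(truth, prediction))
    return [[pairs[(t, p)] for p in labels] for t in labels]


def to_percentage(matrix):
    """Row-normalize counts to percentages, as percentage=True displays them.

    Assumption: normalization is per truth row, so each row sums to 100.
    """
    result = []
    for row in matrix:
        total = sum(row)
        result.append([100 * v / total if total else 0.0 for v in row])
    return result


# Same data as the Examples section below
truth = [0, 1, 1, 1, 2, 2, 2] * 1000
prediction = [0, 1, 2, 2, 0, 0, 2] * 1000
cm = confusion_counts(truth, prediction, labels=[0, 1, 2])
# cm == [[1000, 0, 0], [0, 1000, 2000], [2000, 0, 1000]]
```

With show_both=True the plot combines both views, e.g. a cell showing 66.7% with (2000) in brackets beneath it.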
Examples
>>> truth = [0, 1, 1, 1, 2, 2, 2] * 1000
>>> prediction = [0, 1, 2, 2, 0, 0, 2] * 1000
>>> confusion_matrix(truth, prediction)
>>> confusion_matrix(truth, prediction, percentage=True)
>>> confusion_matrix(truth, prediction, show_both=True)
>>> confusion_matrix(truth, prediction, percentage=True, show_both=True)
>>> confusion_matrix(truth, prediction, labels=[0, 1, 2, 3])
>>> confusion_matrix(truth, prediction, label_aliases={0: "A", 1: "B", 2: "C"})