confusion_matrix()

audplot.confusion_matrix(truth, prediction, *, labels=None, label_aliases=None, percentage=False, show_both=False, ax=None)[source]

Confusion matrix between ground truth and prediction.

The confusion matrix is calculated by audmetric.confusion_matrix.
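The underlying counts can be inspected directly. The snippet below is a minimal sketch, not part of the official examples; it assumes that audmetric.confusion_matrix(truth, prediction) returns the raw counts as a nested list with one row per truth label (check the audmetric documentation for the exact return format).

# Minimal sketch: inspect the raw counts that audplot.confusion_matrix() visualizes.
# Assumption: audmetric.confusion_matrix() returns nested lists, one row per truth label.
import audmetric

truth = [0, 1, 1, 1, 2, 2, 2] * 1000
prediction = [0, 1, 2, 2, 0, 0, 2] * 1000

counts = audmetric.confusion_matrix(truth, prediction)
print(counts)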

Parameters:
  • truth (Sequence | Series) – truth values

  • prediction (Sequence | Series) – predicted values

  • labels (Sequence) – labels to be included in the confusion matrix

  • label_aliases (dict) – mapping from labels to alias names shown in the plot

  • percentage (bool) – if True, show percentage values instead of absolute numbers

  • show_both (bool) – if True and percentage is True, show absolute numbers in brackets below the percentage values; if True and percentage is False, show percentages in brackets below the absolute numbers

  • ax (Axes) – pre-existing axes to draw the plot into; otherwise matplotlib.pyplot.gca() is called internally (see the sketch after the examples)

Examples

>>> truth = [0, 1, 1, 1, 2, 2, 2] * 1000
>>> prediction = [0, 1, 2, 2, 0, 0, 2] * 1000
>>> confusion_matrix(truth, prediction)
>>> confusion_matrix(truth, prediction, percentage=True)
>>> confusion_matrix(truth, prediction, show_both=True)
>>> confusion_matrix(truth, prediction, percentage=True, show_both=True)
>>> confusion_matrix(truth, prediction, labels=[0, 1, 2, 3])
>>> confusion_matrix(
...     truth, prediction, label_aliases={0: "A", 1: "B", 2: "C"}
... )
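
The sketch below, which is not part of the official examples, shows how the ax parameter could be used to draw two variants of the confusion matrix side by side; it assumes a standard matplotlib subplots layout.

# Minimal sketch: pass pre-existing axes via the ax argument to place
# an absolute-count and a percentage confusion matrix next to each other.
import matplotlib.pyplot as plt
import audplot

truth = [0, 1, 1, 1, 2, 2, 2] * 1000
prediction = [0, 1, 2, 2, 0, 0, 2] * 1000

fig, axs = plt.subplots(1, 2, figsize=(10, 4))
audplot.confusion_matrix(truth, prediction, ax=axs[0])
audplot.confusion_matrix(truth, prediction, percentage=True, ax=axs[1])
plt.tight_layout()
plt.show()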