rater_agreement()

audpsychometric.rater_agreement(ratings, *, axis=1)
Calculate rater agreements.
Calculate the agreement of a rater as the Pearson correlation of that rater's scores with the mean scores of all other raters.
This should not be confused with the agreement value that relates to a rated stimulus, e.g. audpsychometric.agreement_numerical().
- Parameters
  - ratings – ratings of several raters
  - axis – axis along which the raters are stacked
- Return type
  numpy.ndarray
- Returns
  rater agreements
Examples
>>> rater_agreement([[1, 1, 0], [2, 2, 1]])
array([1., 1., 1.])
>>> rater_agreement([[1, 1, 0], [2, 2, 1], [2, 2, 2]])
array([0.94491118, 0.94491118, 0.8660254 ])
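The computation behind these examples can be sketched with plain NumPy: for each rater, correlate that rater's scores with the mean score of all remaining raters. This is a hypothetical re-implementation for illustration (assuming stimuli in rows and raters in columns), not the library's actual code.

```python
import numpy as np


def rater_agreement_sketch(ratings):
    """Sketch of rater agreement: Pearson correlation of each rater
    with the mean of all other raters.

    Assumes stimuli in rows and raters in columns
    (hypothetical layout, not necessarily the library's).
    """
    ratings = np.atleast_2d(np.asarray(ratings, dtype=float))
    n_raters = ratings.shape[1]
    agreements = []
    for j in range(n_raters):
        # Mean score of all raters except rater j, per stimulus
        others_mean = np.delete(ratings, j, axis=1).mean(axis=1)
        # Pearson correlation between rater j and the others' mean
        agreements.append(np.corrcoef(ratings[:, j], others_mean)[0, 1])
    return np.array(agreements)


print(rater_agreement_sketch([[1, 1, 0], [2, 2, 1], [2, 2, 2]]))
```

For the second example above this reproduces agreements of about 0.945, 0.945, and 0.866: the third rater's scores track the others less closely, so their correlation with the mean of the other two is lower.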