cronbachs_alpha()
- audpsychometric.cronbachs_alpha(ratings, *, axis=1)
Calculate Cronbach’s alpha.
The Cronbach coefficient quantifies interrater agreement. Returns alpha as a float, together with additional information specific to this measure collated into a dictionary.
Cronbach’s alpha generalizes Cohen’s kappa and can handle three or more answers per variable, making it suitable for Likert-type scale answers. A blog post on congeneric reliability [cro] states that Cronbach’s alpha assumes essential tau-equivalence and thus underestimates reliability; a tau-equivalent measurement model is a special case of a congeneric measurement model in which all loadings are equal.
Hilsdorf [Hil] gives a simplified formula for the standardized alpha that relates the measure to the average inter-item correlation:
\[\alpha_{st} = \frac{N \times \bar{r}}{1 + (N - 1) \times \bar{r}}\]

where
- \(N\) is the number of items (labelled chunks)
- \(\bar{r}\) is the average correlation between the items
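As a quick numerical check of this formula, the sketch below computes the average inter-item correlation \(\bar{r}\) from a small, made-up ratings matrix and plugs it into the equation above (the data are purely illustrative):

    import numpy as np

    # Illustrative data only: rows are rated stimuli, columns are items
    ratings = np.array([
        [1.0, 2.0, 1.0],
        [2.0, 3.0, 3.0],
        [4.0, 4.0, 5.0],
        [5.0, 5.0, 4.0],
    ])

    N = ratings.shape[1]  # number of items
    # Item-by-item correlation matrix (columns are the variables)
    corr = np.corrcoef(ratings, rowvar=False)
    # Average of the off-diagonal entries = average inter-item correlation
    r_bar = (corr.sum() - N) / (N * (N - 1))
    alpha_st = (N * r_bar) / (1 + (N - 1) * r_bar)
    print(round(alpha_st, 3))  # ≈ 0.961 for this toy matrix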
- Parameters
  ratings – matrix of ratings
  axis – axis along which Cronbach’s alpha is computed (default: 1)
- Return type
  Tuple[float, Dict]
- Returns
  Cronbach’s alpha and additional results collated into a dictionary
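A minimal usage sketch, based on the signature above; the ratings matrix is made up, and it is assumed that with the default axis=1 the raters vary along the columns:

    import numpy as np

    import audpsychometric

    # Made-up ratings: rows are stimuli, columns are raters (assumed layout)
    ratings = np.array([
        [1, 1, 2],
        [2, 2, 2],
        [4, 5, 4],
        [5, 4, 5],
    ])
    alpha, results = audpsychometric.cronbachs_alpha(ratings)
    print(alpha)    # Cronbach's alpha as a float
    print(results)  # additional results collated into a dictionary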