Intraclass Correlation Coefficient:
The Intraclass Correlation Coefficient (ICC) is a measure of reliability or agreement among raters or measurements. It assesses how strongly units in the same group resemble each other, with values typically ranging from 0 (no agreement) to 1 (perfect agreement).
The calculator uses the ICC(2,1) formula:
ICC(2,1) = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
Where: MSR = mean square for rows (between subjects), MSC = mean square for columns (between raters), MSE = mean square error (residual), k = number of raters, n = number of subjects.
Guidelines: Commonly cited benchmarks (Koo & Li, 2016) are below 0.50 poor, 0.50 to 0.75 moderate, 0.75 to 0.90 good, and above 0.90 excellent reliability.
Tips: Enter the mean square values from the ANOVA table, along with the number of raters and subjects. All values must be positive, with at least 2 raters and 2 subjects.
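As a minimal sketch, the same calculation can be reproduced directly from the ANOVA mean squares. The function name icc_2_1 and the example mean squares below are hypothetical, chosen only to illustrate the formula; this is not the calculator's own code.

```python
# Minimal sketch of the ICC(2,1) formula above; input values are hypothetical.

def icc_2_1(ms_rows: float, ms_cols: float, ms_error: float,
            k: int, n: int) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ms_rows  -- mean square for rows (between subjects) from the ANOVA table
    ms_cols  -- mean square for columns (between raters)
    ms_error -- residual mean square
    k        -- number of raters (>= 2)
    n        -- number of subjects (>= 2)
    """
    if k < 2 or n < 2:
        raise ValueError("Need at least 2 raters and 2 subjects")
    numerator = ms_rows - ms_error
    denominator = ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    return numerator / denominator


# Hypothetical ANOVA results for 3 raters and 30 subjects
print(icc_2_1(ms_rows=11.24, ms_cols=0.39, ms_error=1.02, k=3, n=30))
# roughly 0.77, which the common benchmarks would call "good" reliability
```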
Q1: What's the difference between ICC and Pearson correlation?
A: ICC assesses absolute agreement, so systematic differences between raters lower it, while Pearson correlation only assesses the strength of the linear relationship and ignores any constant bias; see the sketch below.
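To make the distinction concrete, here is a small illustrative example: two hypothetical raters whose scores differ only by a constant offset correlate perfectly under Pearson, yet ICC(2,1) is pulled down because the offset counts as disagreement. The icc_2_1_from_ratings helper is an assumed implementation of the standard two-way ANOVA decomposition, not part of this calculator.

```python
# Contrast between Pearson r and ICC(2,1); ratings and helper are hypothetical.
import numpy as np
from scipy.stats import pearsonr


def icc_2_1_from_ratings(ratings: np.ndarray) -> float:
    """ICC(2,1) from an n_subjects x k_raters matrix via two-way ANOVA."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )


rater_a = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
rater_b = rater_a + 5.0          # same ranking, constant bias of +5

r, _ = pearsonr(rater_a, rater_b)
icc = icc_2_1_from_ratings(np.column_stack([rater_a, rater_b]))
print(r)    # 1.0   -- Pearson ignores the offset
print(icc)  # ~0.44 -- ICC treats the offset as disagreement
```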
Q2: Which ICC model should I use?
A: This calculator uses ICC(2,1), the two-way random effects model for absolute agreement of a single rating.
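For comparison, a sketch of how the model choice changes the estimate: ICC(3,1), the two-way mixed effects model for consistency, drops the rater variance term from the denominator and so ignores systematic differences between raters. The formulas follow the standard Shrout and Fleiss (1979) definitions; the mean square inputs are hypothetical.

```python
# Sketch of how the choice of ICC model matters; mean squares are hypothetical.

def icc_2_1(ms_rows, ms_cols, ms_error, k, n):
    """Two-way random effects, absolute agreement (the model used here)."""
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )


def icc_3_1(ms_rows, ms_error, k):
    """Two-way mixed effects, consistency: rater variance is ignored."""
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)


# With a large rater effect (ms_cols), the two models diverge noticeably.
print(icc_2_1(ms_rows=20.0, ms_cols=62.5, ms_error=1.0, k=2, n=5))  # lower
print(icc_3_1(ms_rows=20.0, ms_error=1.0, k=2))                     # higher
```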
Q3: How many raters/subjects do I need?
A: Typically 2-5 raters and 30+ subjects for stable estimates, but depends on expected ICC.
Q4: Can ICC be negative?
A: Yes, but this usually indicates problems with the data or model specification.
Q5: What are alternatives to ICC?
A: Cohen's kappa (for categorical data), Bland-Altman plots, or Krippendorff's alpha.
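For the categorical case mentioned above, Cohen's kappa is available in scikit-learn; the ratings below are made up purely for illustration.

```python
# Cohen's kappa for two raters assigning categorical labels (illustrative data).
from sklearn.metrics import cohen_kappa_score

rater_a = ["mild", "mild", "severe", "moderate", "severe", "mild"]
rater_b = ["mild", "moderate", "severe", "moderate", "severe", "mild"]

# Kappa corrects the observed agreement for agreement expected by chance.
print(cohen_kappa_score(rater_a, rater_b))
```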