Content Disclaimer | Copyright © 2020. All Rights Reserved.


Explanations and References
This page presents the Intraclass Correlation Coefficient (ICC) in its "agreement" model, a statistic that measures agreement (consensus, concordance) between 2 or more raters or methods whose measurements are normally distributed.
The ICC has advantages over the correlation coefficient: it adjusts for the effects of the scale of measurement, and it can represent agreement among more than two raters or measuring methods. The calculation begins with a two-way analysis of variance, whose variance components are then used to calculate the ICC.
The data in the example are artificially generated to demonstrate the procedure. They purport to come from an exercise testing 3 methods of measuring blood pressure, to check whether the results agree with each other. The table to the left shows the data: the columns represent the 3 methods of measurement, and the 5 rows the 5 subjects being measured. The table to the right is the analysis of variance (truncated to 1 decimal point for brevity), showing the components of variation. The algorithm calculates the ICC and its 95% confidence interval from these variance components. Six (6) calculations are possible (three models, each with single or averaged measurements), depending on assumptions about the nature of the data:
- Model 1 assumes that each method of measurement or rater is different, being a subset of a larger set of methods or raters, randomly chosen. This would be the case when different research assistants measure the height of children, with different sets of people doing the measurements at different sites, and the data are then combined.
- Model 2 assumes the same methods or raters perform the evaluations in all cases, although these methods or raters may be a subset of a larger set of methods or raters. This is the model most frequently encountered in clinical research, where the same researchers carry out the measurements on all subjects.
- Model 3 makes no assumptions about the methods or raters.
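As a cross-check of the arithmetic, the two-way ANOVA decomposition and the single-measurement agreement ICC, ICC(A,1) (corresponding to Model 2 with single measurements, the variant the R call below computes), can be sketched in plain Python. This is a minimal illustration using the blood-pressure data from this page; the function name `icc_a1` is made up for the sketch and is not from any library.

```python
def icc_a1(rows):
    """Two-way ANOVA decomposition and ICC(A,1): absolute-agreement,
    single measurement, with subjects as rows and raters as columns."""
    n = len(rows)          # number of subjects (rows)
    k = len(rows[0])       # number of raters / methods (columns)
    grand = sum(sum(r) for r in rows) / (n * k)
    row_means = [sum(r) / k for r in rows]
    col_means = [sum(r[j] for r in rows) / n for j in range(k)]

    # Sums of squares for the two-way ANOVA
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in rows for x in r)
    ss_err = ss_total - ss_rows - ss_cols

    # Mean squares (the variance components used by the ICC)
    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))   # residual mean square

    # Agreement ICC for a single measurement
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# The 5 subjects x 3 methods of blood-pressure data from this page
data = [[120, 115, 125],
        [130, 140, 125],
        [100,  98, 105],
        [150, 156, 145],
        [ 90,  90,  95]]
print(round(icc_a1(data), 3))  # 0.953
```

Rounded to 3 decimal points this reproduces the ICC(A,1) value that the R `irr` package reports for the same data.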
In most cases, single measurements are used. Unless the data have features that require otherwise, the result of choice is therefore the two-way agreement model for single measurements (Model 2, reported by R as ICC(A,1)). In this example, the ICC (truncated to 2 decimal points) is 0.95, with a 95% confidence interval from 0.79 to 0.99.

## Interpretations
The results can be interpreted as follows:
- In older texts such as that of Portney and Watkins (see reference), the ICC is used without further processing as an index of agreement. An ICC of 0-0.2 indicates
*poor* agreement; 0.3-0.4 indicates *fair* agreement; 0.5-0.6 indicates *moderate* agreement; 0.7-0.8 indicates *strong* agreement; and >0.8 indicates *almost perfect* agreement.
- More recently, Koo and Li suggested that an ICC of less than 0.5 should be considered poor agreement, and over 0.5 good agreement. If this is accepted, then a 95% confidence interval that does not traverse the 0.5 value can be taken as evidence of good agreement.
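The simplified Koo and Li rule above can be expressed as a small Python helper. This is only a sketch of the rule as stated on this page; the function name `interpret_icc` is made up for illustration.

```python
def interpret_icc(icc, ci_low, ci_high):
    """Classify an ICC using the simplified Koo & Li cut-off quoted above:
    below 0.5 is poor, 0.5 or above is good. The call is decisive only
    when the 95% CI lies entirely on one side of 0.5 (does not traverse it)."""
    label = "good" if icc >= 0.5 else "poor"
    decisive = ci_low > 0.5 or ci_high < 0.5
    return label, decisive

# The example from this page: ICC = 0.95, 95% CI 0.79 to 0.99
print(interpret_icc(0.95, 0.79, 0.99))  # ('good', True)
```

Here the whole confidence interval lies above 0.5, so under this rule the agreement can be reported as good.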
## References
Portney LG & Watkins MP (2000) Foundations of Clinical Research: Applications to Practice. Prentice Hall Inc., New Jersey. ISBN 0-8385-2695-0, p 560-567.
Koo TK and Li MY (2016) A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J Chiropr Med. 2016 Jun;15(2):155-63. Article available on the NIH website.
https://en.wikipedia.org/wiki/Intraclass_correlation Intraclass correlation on Wikipedia
https://rdrr.io/cran/irr/src/R/icc.R Source code for ICC from the R package irr
R provides this calculation via its irr package. It produces the Intraclass Correlation Coefficient with its 95% confidence interval, together with an F test of the null hypothesis that the population ICC is zero.
```
dat = ("
120 115 125
130 140 125
100  98 105
150 156 145
 90  90  95
")
mx = read.table(textConnection(dat), header = FALSE)
#install.packages("irr") # if not already installed
library(irr)
icc(mx, model = "twoway", type = "agreement", unit = "single")
```

The results are

```
> icc(mx, model = "twoway", type = "agreement", unit = "single")
 Single Score Intraclass Correlation

   Model: twoway
   Type : agreement

   Subjects = 5
     Raters = 3
   ICC(A,1) = 0.953

 F-Test, H0: r0 = 0 ; H1: r0 > 0
  F(4,8.46) = 50.7 , p = 6.17e-06

 95%-Confidence Interval for ICC Population Values:
  0.791 < ICC < 0.995
```