Dedoose Blog

Inter-Rater Reliability – Calculating Kappa

1/12/2017

Reliability is the “consistency” or “repeatability” of your measures (Trochim, 2006) and, from a methodological perspective, is central to demonstrating that you’ve employed a rigorous approach to your project. There are a number of approaches to assessing inter-rater reliability (see the Dedoose user guide for strategies to help your team build and maintain high levels of consistency), but today we would like to focus on just one: Cohen’s Kappa coefficient. So, brace yourself and let’s look behind the scenes to see how Dedoose calculates Kappa in the Training Center, and how you can manually calculate your own reliability statistics if desired.
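As a quick illustration of the math behind the statistic: Cohen’s Kappa is defined as Kappa = (Po - Pe) / (1 - Pe), where Po is the observed proportion of agreement between two coders and Pe is the proportion of agreement expected by chance, based on each coder’s marginal code frequencies. The short Python sketch below shows one way to compute Kappa by hand for two coders’ code applications. The data and function name are hypothetical, and this is not Dedoose’s internal implementation, but it follows the same standard formula.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's Kappa for two coders assigning one categorical code per excerpt."""
    assert len(coder_a) == len(coder_b), "Both coders must rate the same excerpts"
    n = len(coder_a)

    # Observed agreement: share of excerpts on which the two coders agree
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Chance agreement: product of each coder's marginal frequencies, summed over codes
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical data: did each coder apply a given code (1) or not (0) to ten excerpts?
coder_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
coder_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(coder_1, coder_2), 3))  # observed agreement 0.8, Kappa of about 0.583

In this toy example the coders agree on 8 of 10 excerpts (Po = 0.8), chance agreement is 0.52, and Kappa works out to roughly 0.583, illustrating how Kappa discounts raw agreement for agreement expected by chance.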


For a full description of inter-rater reliability, from the basics to more advanced topics, check out our article here!


Hope this helps! Do send word if you want to see more blogs and articles like this. Got any questions? Let us know!


References

De Vries, H., Elliott, M. N., Kanouse, D. E., &amp; Teleki, S. S. (2008). Using pooled Kappa to summarize interrater agreement across many items. Field Methods, 20(3), 272-282.

Trochim, W. M. (2006). Reliability. Retrieved December 21, 2016, from http://www.socialresearchmethods.net/kb/reliable.php
