Project Title
Agree with me, K? Interrater Reliability Using Cohen's Kappa
Document Type
Event
Start Date
10-5-2019 3:30 PM
End Date
10-5-2019 6:30 PM
Description
This project explores interrater reliability, the level of agreement between two coders. Specifically, we look at the mathematics behind Cohen's Kappa and its extensions, which quantify that agreement while adjusting for the agreement expected by chance. Functions to calculate the statistic are coded in R, and an example application from psychological research is given (a brief illustrative sketch follows this record).
Discipline
Math
Research Mentor(s)
Brian Gill
Copyright Status
http://rightsstatements.org/vocab/InC/1.0/
Additional Rights Information
Copyright held by author(s).
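To make the description above concrete, the following R sketch shows how Cohen's Kappa might be computed from two raters' categorical codes. This is an illustrative outline, not the project's own functions: the name cohens_kappa and the example data are assumptions added for demonstration. It implements the standard formula kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal proportions.

# Illustrative sketch only: cohens_kappa and the example data are
# assumptions for demonstration, not the project's own code.
cohens_kappa <- function(rater1, rater2) {
  # Cross-tabulate the two raters over a shared set of categories
  categories <- union(unique(rater1), unique(rater2))
  tab <- table(factor(rater1, levels = categories),
               factor(rater2, levels = categories))
  n <- sum(tab)

  # Observed agreement: proportion of items on the diagonal
  p_o <- sum(diag(tab)) / n

  # Chance-expected agreement from the raters' marginal proportions
  p_e <- sum(rowSums(tab) * colSums(tab)) / n^2

  # Cohen's Kappa: agreement beyond chance, scaled by its maximum
  (p_o - p_e) / (1 - p_e)
}

# Example: two coders assigning the same 10 items to categories A, B, C
coder1 <- c("A", "A", "B", "B", "C", "A", "B", "C", "C", "A")
coder2 <- c("A", "B", "B", "B", "C", "A", "A", "C", "C", "A")
cohens_kappa(coder1, coder2)  # approximately 0.70

For these made-up data, the coders agree on 8 of 10 items (p_o = 0.8), while the marginal proportions give a chance agreement of p_e = 0.34, so kappa = (0.8 - 0.34) / (1 - 0.34) is about 0.70.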