Project Title

Agree with me, K? Interrater Reliability Using Cohen's Kappa

Presenting Author(s)

Jessica Fossum

Document Type

Event

Start Date

10-5-2019 3:30 PM

End Date

10-5-2019 6:30 PM

Description

This project explores interrater reliability, the degree of agreement between two coders rating the same material. Specifically, we examine the mathematics behind Cohen's Kappa and its extensions, which quantify that agreement while correcting for the agreement expected by chance. Functions to calculate the statistic are coded in R, and an example application from psychological research is presented.
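
For context, Cohen's Kappa compares the observed proportion of agreement p_o with the proportion expected by chance p_e: kappa = (p_o - p_e) / (1 - p_e). The R sketch below is illustrative only and is not the project's code; the function name cohen_kappa and the example ratings are made up for demonstration.

# Illustrative sketch only -- not the project's functions.
# Unweighted Cohen's Kappa for two raters, from paired categorical ratings.
cohen_kappa <- function(rater1, rater2) {
  categories <- union(unique(rater1), unique(rater2))
  tab <- table(factor(rater1, levels = categories),
               factor(rater2, levels = categories))
  n <- sum(tab)
  p_observed <- sum(diag(tab)) / n                      # proportion of exact agreement
  p_expected <- sum(rowSums(tab) * colSums(tab)) / n^2  # agreement expected by chance
  (p_observed - p_expected) / (1 - p_expected)
}

# Hypothetical example: two coders classifying 10 items as "A", "B", or "C".
r1 <- c("A", "A", "B", "B", "C", "A", "B", "C", "C", "A")
r2 <- c("A", "B", "B", "B", "C", "A", "B", "C", "A", "A")
cohen_kappa(r1, r2)  # roughly 0.70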

Discipline

Math

Research Mentor(s)

Brian Gill

Copyright Status

http://rightsstatements.org/vocab/InC/1.0/

Additional Rights Information

Copyright held by author(s).

