Multi-rater delta: extending the delta nominal measure of agreement between two raters to many raters

Journal of Statistical Computation and Simulation (2022)

Abstract
The need to measure the degree of agreement among R raters who independently classify n subjects into K nominal categories arises frequently in many scientific areas. The most popular measures are Cohen's kappa (R = 2) and the Fleiss, Conger and Hubert kappa (R >= 2) coefficients, all of which have several limitations. In 2004, the delta coefficient was defined for the case of R = 2; it does not share the limitations of Cohen's kappa coefficient. This article extends the delta coefficient from R = 2 raters to R >= 2 raters (the multi-rater delta coefficient), demonstrating that it can be expressed in the kappa format and that it retains the advantages of the delta coefficient over the classic kappa-type coefficients: (i) it refers to the proportion of responses that are concordant not by chance; (ii) it yields a parameter that faithfully measures the degree of agreement in each category; and (iii) it is not affected by marginal imbalance.
Key words
Cohen's kappa, Conger's kappa, Delta agreement, Fleiss' kappa, Hubert's kappa, nominal agreement
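The abstract compares the multi-rater delta against the classic kappa-type coefficients. As context only, the sketch below computes two of those classic measures, Cohen's kappa (R = 2) and Fleiss' kappa (R >= 2), from their standard textbook formulas; the function names and NumPy implementation are illustrative assumptions, and the paper's own delta coefficient is not reproduced here because its definition is not given in this abstract.

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b, categories):
    """Cohen's kappa for two raters classifying the same n subjects."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    n = len(a)
    idx = {c: i for i, c in enumerate(categories)}
    # Contingency table of joint classifications.
    table = np.zeros((len(categories), len(categories)))
    for x, y in zip(a, b):
        table[idx[x], idx[y]] += 1
    p = table / n
    p_o = np.trace(p)                    # observed agreement
    p_e = p.sum(axis=1) @ p.sum(axis=0)  # chance agreement from the marginals
    return (p_o - p_e) / (1 - p_e)

def fleiss_kappa(counts):
    """Fleiss' kappa from an n-subjects x K-categories count matrix,
    where counts[i, j] is the number of raters placing subject i
    in category j (each subject rated by the same R raters)."""
    counts = np.asarray(counts, dtype=float)
    n, _ = counts.shape
    r = counts[0].sum()                  # raters per subject (assumed constant)
    p_j = counts.sum(axis=0) / (n * r)   # overall category proportions
    # Per-subject agreement: pairs of raters that agree, out of R(R-1) pairs.
    p_i = (np.square(counts).sum(axis=1) - r) / (r * (r - 1))
    p_bar, p_e = p_i.mean(), np.square(p_j).sum()
    return (p_bar - p_e) / (1 - p_e)

# Example: 2 raters over 3 subjects, then 3 raters over 4 subjects.
print(cohens_kappa(["a", "b", "a"], ["a", "b", "b"], ["a", "b"]))  # 0.4
print(fleiss_kappa([[3, 0, 0],
                    [1, 2, 0],
                    [0, 0, 3],
                    [1, 1, 1]]))                                   # ~0.36
```

Both coefficients follow the same (observed - chance) / (1 - chance) pattern; the abstract's point is that the multi-rater delta can be expressed in this same kappa format while avoiding the marginal-imbalance problems of the chance-correction term shown here.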