BDSP

Document database

  1. Bias, prevalence and kappa.

Article - In English

Since the introduction of Cohen's kappa as a chance-adjusted measure of agreement between two observers, several "paradoxes" in its interpretation have been pointed out.

The difficulties occur because kappa not only measures agreement but is also affected in complex ways by the presence of bias between observers and by the distribution of data across the categories used ("prevalence"). In this paper, new indices providing independent measures of bias and prevalence, as well as of observed agreement, are defined, and a simple formula is derived that expresses kappa in terms of these three indices.
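The decomposition described above can be sketched for a 2x2 agreement table. This is a minimal illustration, not the paper's own code: it assumes the commonly cited definitions of the indices (observed agreement p_o = a + d, bias index BI = b − c, prevalence index PI = a − d, and PABAK = 2·p_o − 1, where a, b, c, d are the cell proportions); the function name and example counts are hypothetical.

```python
# Sketch of the bias/prevalence decomposition of Cohen's kappa for a
# 2x2 agreement table. Cell counts: a = both raters positive,
# b and c = the two disagreement cells, d = both raters negative.
# Index definitions (BI, PI, PABAK) follow common usage in the
# literature; the paper's exact notation may differ.

def kappa_indices(a, b, c, d):
    total = a + b + c + d
    a, b, c, d = (x / total for x in (a, b, c, d))  # convert to proportions
    p_o = a + d                      # observed agreement
    bi = b - c                       # bias index: rater marginal difference
    pi = a - d                       # prevalence index
    # chance agreement computed from the marginal proportions
    p_e = (a + b) * (a + c) + (c + d) * (b + d)
    kappa = (p_o - p_e) / (1 - p_e)  # Cohen's kappa
    pabak = 2 * p_o - 1              # prevalence- and bias-adjusted kappa
    return {"p_o": p_o, "BI": bi, "PI": pi, "kappa": kappa, "PABAK": pabak}

# With these definitions, kappa can be recovered from the three indices:
#   kappa = (PABAK - PI**2 + BI**2) / (1 - PI**2 + BI**2)
r = kappa_indices(40, 9, 6, 45)  # hypothetical counts
rhs = (r["PABAK"] - r["PI"] ** 2 + r["BI"] ** 2) / (
    1 - r["PI"] ** 2 + r["BI"] ** 2
)
assert abs(r["kappa"] - rhs) < 1e-12
```

The squared PI and BI terms show why kappa can be low despite high observed agreement when prevalence is very skewed, which is one of the "paradoxes" the abstract refers to.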

Pascal keywords (French): Coefficient K, Prévalence, Accord interjuge, Analyse statistique, Biais

Pascal keywords (English): U value, Prevalence, Interrater agreement, Statistical analysis, Bias

Record produced by:
Inist-CNRS - Institut de l'Information Scientifique et Technique

Call number: 93-0517063

Inist code: 002B28A. Created: 199406.