BDSP - Document database


  1. Focus on quantitative method: Interpreting kappa values for two-observer nursing diagnosis data.

    Article - In English

    The reliability of nursing observations is often estimated using Cohen's kappa, a chance-adjusted measure of agreement between observer RNs.

    However, using kappa as an omnibus measure can sometimes be misleading.

    In a study designed in part to describe the frequency and reliability of nursing diagnoses in long-term care facilities, 360 residents were each assessed independently by two registered nurses, and kappa and the observed proportion of agreement were calculated as estimates of reliability.

    For some diagnoses we observed high proportions of agreement, yet paradoxically low kappa values.

    This article presents an in-depth statistical analysis to resolve this paradox.
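    To make the paradox concrete, here is a minimal sketch (not taken from the article; the counts are invented) showing how two raters can agree on almost every case while Cohen's kappa stays low, because kappa discounts the chance agreement that is nearly guaranteed when one outcome dominates.

    ```python
    # Illustrative sketch: high observed agreement, low Cohen's kappa.
    # The 2x2 table counts below are hypothetical, not from the study.

    def cohens_kappa(table):
        """Return (observed agreement, kappa) for a 2x2 table [[a, b], [c, d]],
        where rows are rater 1's calls and columns are rater 2's calls."""
        (a, b), (c, d) = table
        n = a + b + c + d
        p_obs = (a + d) / n                      # observed proportion of agreement
        p_yes = ((a + b) / n) * ((a + c) / n)    # chance both say "present"
        p_no = ((c + d) / n) * ((b + d) / n)     # chance both say "absent"
        p_exp = p_yes + p_no                     # total chance agreement
        return p_obs, (p_obs - p_exp) / (1 - p_exp)

    # Hypothetical diagnosis that is rare: both RNs say "absent" almost always.
    table = [[2, 3], [4, 351]]                   # 360 residents total
    p_obs, kappa = cohens_kappa(table)
    print(f"observed agreement = {p_obs:.3f}, kappa = {kappa:.3f}")
    # Agreement is near 0.98, yet kappa falls well below conventional
    # "good reliability" thresholds, because chance agreement is also near 0.97.
    ```

    This base-rate sensitivity is exactly why an in-depth look at the agreement table, rather than kappa alone, is needed to interpret such results.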

    Results from our analysis also suggest means for planning improvements in the diagnostic performance of participating RNs.

    Consequently, our approach can be used in similar studies of diagnosis reliability to enhance nursing research, education, and practice.

    Pascal keywords (French): Diagnostic, Infirmier, Long séjour, Indice kappa, Fiabilité, Accord interjuge, Homme, Analyse statistique

    Pascal keywords (English): Diagnosis, Nurse, Long stay, Kappa index, Reliability, Interrater agreement, Human, Statistical analysis

    Record produced by:
    Inist-CNRS - Institut de l'Information Scientifique et Technique

    Call number: 97-0513073

    Inist code: 002B30A05. Created: 13/02/1998.