High-volume gynecological specimens (Pap smears and cervical biopsies) are interpreted by multiple observers. For these interpretations, we quantified interobserver agreement (percent agreement) and variability (the kappa statistic).
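Cohen's kappa corrects observed percent agreement for the agreement two raters would reach by chance alone, given how often each rater uses each category. As a rough illustration only (the rater calls and Bethesda-style category labels below are hypothetical, not data from this study), a minimal Python sketch:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning categorical interpretations to the same specimens."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of specimens with identical calls.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical interpretations of 8 Pap smears by two raters:
a = ["NILM", "ASCUS", "LSIL", "NILM", "HSIL", "NILM", "LSIL", "NILM"]
b = ["NILM", "LSIL",  "LSIL", "NILM", "HSIL", "ASCUS", "LSIL", "NILM"]
print(round(cohen_kappa(a, b), 3))  # → 0.636
```

Here the raters agree on 6 of 8 specimens (75% agreement), but because both call most specimens NILM, some of that agreement is expected by chance, and kappa discounts it accordingly.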
Thirty liquid-based cytology specimens and 24 cervical biopsy slides were interpreted by 10 cytotechnologists (Pap smears only), 4 cytopathologists (both Pap smears and cervical biopsies), and 3 surgical pathologists (cervical biopsies only).
Agreement among the cytotechnologists on Pap smears was 79% (23.7/30), with moderate to high kappa values (range 0.398–0.801). Intergroup agreement between cytotechnologists and cytopathologists on Pap smears was similar: 75.2% (22.8/30), kappa range 0.223–0.742. Agreement among the cytopathologists themselves on Pap smears was lower, 66.6% (20/30; kappa range 0.348–0.681). Agreement among cytopathologists on cervical biopsy slides was lower still, 58.3% (14/24; kappa range 0.390–0.598). Agreement among the surgical pathologists on cervical biopsy slides was lowest, 51.2% (12.3/24), and their kappa values spanned the widest range (0.066–0.709). Intergroup agreement between cytopathologists and surgical pathologists on biopsy slides was similarly low: 50.5% (12.1/24; kappa range also 0.066–0.709).
Agreement on Pap smears was highest among cytotechnologists and between cytotechnologists and cytopathologists; cytopathologists agreed somewhat less well among themselves; agreement on biopsy slides, both among and between cytopathologists and surgical pathologists, was substantially lower. Variable agreement by specimen type (Pap smear vs. cervical biopsy) and by interpreter type, especially the interobserver variability among cytopathologists and surgical pathologists, can lead to diagnostic, and over time therapeutic, variation in patient management that is not due to changes in patient presentation or condition.
Poster Session, The 2005 Institute for Quality in Laboratory Medicine Conference