Context:
It has been fifteen years since Congress enacted the Clinical Laboratory Improvement Amendments of 1988 (CLIA), which include special provisions for cytology proficiency testing (PT) that measure the performance of the individual rather than the performance of the laboratory. Only one program meets these requirements. That program, the Maryland Cytology Proficiency Testing Program (MCPTP), provides testing for fewer than 5% of the estimated 16,000 individuals in the United States who perform cytology examinations.
Objective:
To compare proficiency testing in gynecologic cytology using glass slides and virtual slides.
Methods:
One hundred eleven individuals (pathologists=52, cytotechnologists=59) took two proficiency tests. The annual MCPTP test was administered to each individual in his or her own laboratory under normal work conditions (i.e., using familiar microscopes and equipment). The other test, CytoViewJ II, was a computer-based test composed of virtual slides captured from the MCPTP's glass slides; test administration personnel transported one of two laptop computers to each individual's laboratory and administered the test there. Performance on the two tests was compared to determine the effect of various potential confounding variables.
Results:
The mean score of the individuals (n=111) on the MCPTP test was 99.2% (standard deviation 2.2; range 90-100%). The mean score of the same individuals on CytoViewJ II was 96.8% (standard deviation 5.8; range 70-100%). No individual scored less than 90% on the glass slide test, for a pass rate of 100%. Eight individuals (pathologists=3, cytotechnologists=5) scored less than 90% on CytoViewJ II, for a pass rate of 93.8% (p<.05 at the 95% confidence level). Lower scores on CytoViewJ II were associated with virtual slides that did not attain 90% consensus among study participants. When virtual slides that did not attain 90% consensus were excluded from the scoring, there was no significant difference in pass rates between the two tests: 100% for the glass slide test and 99.1% for the computer-based test (no statistically significant difference detected at the 95% confidence level).
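The abstract does not state which statistical procedure underlies the pass-rate comparison; as an illustration only, the sketch below applies a Fisher's exact test to the reported pass/fail counts (111 participants per test, 0 failures on glass slides, 8 failures on CytoViewJ II). The choice of test and the 2x2 tabulation are assumptions, not the authors' documented method.

```python
# Illustrative sketch only: compares the two reported pass rates with a
# Fisher's exact test. The abstract does not specify this procedure; the
# counts below are taken directly from the Results text.
from scipy.stats import fisher_exact

N = 111            # individuals who took both tests
glass_fail = 0     # no one scored below 90% on the glass slide test
virtual_fail = 8   # individuals scoring below 90% on CytoViewJ II

table = [
    [N - glass_fail, glass_fail],      # glass slide test: pass, fail
    [N - virtual_fail, virtual_fail],  # computer-based test: pass, fail
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"pass rate (glass slides):   {(N - glass_fail) / N:.1%}")
print(f"pass rate (virtual slides): {(N - virtual_fail) / N:.1%}")
print(f"Fisher's exact p-value:     {p_value:.4f}")  # < .05 for these counts
```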
Conclusions:
Computer-based testing can be equivalent to glass slide testing if test validation for both tests is comparable.