OBJECTIVE
We conducted a pilot evaluation to determine the effects of automated case detection on the timeliness of case reporting, using Utah data extracted from the National Electronic Telecommunications System for Surveillance (NETSS). We hypothesized that the presence of an automated case detection system influences the timeliness of reporting and compliance with the state law requiring "reporting within three working days".
METHODS
Data for 364 reported influenza-associated hospitalizations from two urban counties were extracted from NETSS for the 2005-2006 influenza season. To limit confounding introduced by between-county reporting and to evaluate the effects of automated detection alone, we restricted the analysis to cases hospitalized in their county of residence (n=275). The interval between the date of the laboratory test and the date of the local health department report was calculated to include only work days and to exclude holidays. Timeliness of case reporting was evaluated using the hazard ratio (HR) from Cox regression analysis.
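The work-day interval described above can be sketched as follows. This is an illustrative reconstruction only; the function name, signature, and holiday handling are assumptions for exposition and do not reflect the study's actual NETSS extraction code.

```python
from datetime import date, timedelta

def working_days_between(test_date: date, report_date: date,
                         holidays: frozenset = frozenset()) -> int:
    """Count work days elapsed between the lab test date and the
    local health department report date, excluding weekends and
    any supplied holidays (a hypothetical helper, not NETSS code)."""
    if report_date < test_date:
        raise ValueError("report date precedes test date")
    days = 0
    d = test_date
    while d < report_date:
        d += timedelta(days=1)
        # weekday() < 5 means Monday-Friday; skip listed holidays
        if d.weekday() < 5 and d not in holidays:
            days += 1
    return days

# Example: lab test Friday 2006-01-06, report Monday 2006-01-09
# counts only the Monday, giving an interval of 1 working day.
print(working_days_between(date(2006, 1, 6), date(2006, 1, 9)))  # → 1
```

Intervals computed this way (rather than calendar days) are what the three-working-day compliance threshold is measured against.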
RESULTS
Reporting intervals were right-skewed and extended to 37 days for facilities using automated case detection systems. The proportions of cases reported within three working days were nearly identical for facilities using automated and manual detection systems, and there was no significant difference in the timeliness of reporting after three working days (HR=1.034; p=0.9139). Even after allowing 10 working days to accommodate routine weekly workflow, cases continued to be reported: 12 from facilities with automated detection and six from facilities using manual detection.
CONCLUSION
Results from this pilot study inform current projects underway at the Utah Center of Excellence in Public Health Informatics. When assessing the timeliness of case reporting, allowance must be made for between-county reporting and for the continued detection of cases through routine case identification processes. One solution may be a shared reporting database, work on which is already in progress in the state of Utah.