Partially Automating the Casefinding Audit Process
Scott Riddle, B.S. ● Nankee Singh, CTR
Cancer Registry of Greater California
Introduction
The Cancer Registry of Greater California (CRGC) performs casefinding
audits in order to establish an objective baseline of how well a facility is
reporting cases. The CRGC requests a listing of the Medical Records
Disease Index (MRDI) for a specified time period and queries our
Database Management System (DBMS) to see whether the cancer cases were
reported. If a case is not in our database, we ask the facility
abstractor to investigate and explain why it was not reported.
Manual Madness
Casefinding audits were performed manually using whatever the
facility chose to send. The types of information received varied widely –
anywhere from faxed reports to formatted spreadsheets. An audit was
conducted in the following manner:
1. The MRDI Listing was printed if a file was not received.
2. For cases that had a reportable ICD-9 diagnosis, the Auditor
manually queried our DBMS to see if it had been reported by the
facility.
3. If the case was not in our DBMS then the Auditor manually entered
the case information into an Excel Follow Back Spreadsheet.
4. Once the DBMS look-up portion of the audit was completed, the
Auditor sent the Follow Back Spreadsheet to the facility for further
investigation.
5. The facility investigated the cases listed on the Follow Back
Spreadsheet and determined whether or not each was reportable.
6. The Follow Back Spreadsheet was returned to the Auditor who then
monitored the facility’s transmit submissions to ensure all missed
reportable cases made their way into the DBMS.
Meeting of the Minds
The Auditor and Programmer met and determined that a casefinding
audit process should do the following:
1. Facility information should be delivered electronically.
2. Manual querying of the DBMS should be reduced and made as easy
as possible.
3. The system should keep track of progress.
4. The Follow Back Spreadsheet should be generated from within the
program.
5. The system should track missed cases; the audit is deemed complete
when all missed cases have been reported.
Details, details…
Electronic Information
We created a document that details the Excel layout required to import the
facility's MRDI information into our new program. This layout is now
provided with every data request.
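As a rough illustration of the import step (the production program is not shown here), the Python sketch below loads a submitted MRDI file and checks it against the requested layout. The column names are hypothetical placeholders for the items defined in the layout document.

    # A minimal sketch of loading and validating an incoming MRDI spreadsheet.
    # The column names below are hypothetical; the actual required items are
    # defined in the layout document sent with the CRGC data request.
    import pandas as pd

    EXPECTED_COLUMNS = ["MRN", "LastName", "FirstName", "DOB", "AdmitDate", "ICD9Codes"]

    def load_mrdi(path):
        """Load the facility's MRDI listing and confirm the required columns exist."""
        mrdi = pd.read_excel(path)
        missing = [col for col in EXPECTED_COLUMNS if col not in mrdi.columns]
        if missing:
            raise ValueError(f"MRDI file is missing required columns: {missing}")
        return mrdi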
Collating the patient information
The program collates the patient and associated MRDI information so
that all related information is shown and processed together.
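The sketch below illustrates the collating idea on the DataFrame returned by the import sketch above; grouping on the hypothetical MRN column stands in for whatever patient identifier the facility supplies.

    # A minimal sketch of collating MRDI rows so each patient's disease-index
    # entries are reviewed together. "MRN" is an assumed patient identifier.
    def collate_by_patient(mrdi):
        """Return a dict mapping each patient identifier to that patient's MRDI rows."""
        return {mrn: rows for mrn, rows in mrdi.groupby("MRN")}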
Internal linkage
The facility information is linked to the DBMS information, and the
results are presented on Matched and Non-matched screens.
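The sketch below shows one way this linkage could work, assuming an extract of the facility's reported cases is available as a pandas DataFrame and that MRN plus date of birth is enough to match; the production linkage may use additional identifiers.

    # A minimal sketch of splitting facility patients into the lists behind the
    # Matched and Non-matched screens. Matching on MRN and DOB is an assumption.
    def link_to_dbms(mrdi, dbms_extract):
        """Return (matched, non_matched) DataFrames of facility patients."""
        merged = mrdi.merge(
            dbms_extract[["MRN", "DOB"]].drop_duplicates(),
            on=["MRN", "DOB"],
            how="left",
            indicator=True,
        )
        matched = merged[merged["_merge"] == "both"].drop(columns="_merge")
        non_matched = merged[merged["_merge"] == "left_only"].drop(columns="_merge")
        return matched, non_matched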
Progress counter
A running total of reviewed and non-reviewed cases is displayed on
the main screen.
Tracking missed cases
Once the Follow Back Spreadsheet is returned, the Auditor flags the
missed cases in the program for ease of tracking.
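As a simple illustration of the completion rule from the requirements list, the sketch below compares the flagged missed cases against identifiers seen in later facility submissions; both identifier lists are hypothetical inputs.

    # A minimal sketch of the audit completion check: the audit is done once
    # every case flagged as missed on the Follow Back Spreadsheet appears in a
    # later facility submission.
    def audit_complete(missed_mrns, submitted_mrns):
        """Return True when all flagged missed cases have been reported."""
        return set(missed_mrns) <= set(submitted_mrns)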
Midstream Enhancements
After the first audit using the program was completed, two enhancements
were added to speed up the process even more.
Changing the status
The status selection was changed from a drop-down list (requiring two or
three mouse clicks) to radio buttons (one mouse click).
Excluding non-reportable entries
The second facility audited with the new program sent all records for the
requested time period instead of cancer-only records. We added a routine
that sets any entry without a cancer-related ICD-9 code to Not Reportable.
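A sketch of that routine appears below. Treating ICD-9 codes 140–239 (the neoplasm chapter) as the cancer-related range, and a semicolon-delimited ICD9Codes column, are simplifying assumptions; the registry's actual reportable list is defined by its casefinding standards.

    # A minimal sketch of pre-flagging entries with no cancer-related ICD-9 code
    # as Not Reportable so the Auditor never has to review them.
    def flag_non_reportable(mrdi):
        """Mark rows whose ICD-9 codes fall entirely outside the neoplasm range."""
        def has_cancer_code(codes):
            for code in str(codes).split(";"):
                try:
                    prefix = int(code.strip().split(".")[0])
                except ValueError:
                    continue
                if 140 <= prefix <= 239:  # ICD-9 neoplasm chapter (simplified)
                    return True
            return False

        mrdi["Status"] = mrdi["ICD9Codes"].apply(
            lambda codes: "Pending" if has_cancer_code(codes) else "Not Reportable"
        )
        return mrdi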
Screen Shots
Separate screens were created for each distinct step of the process.
Figure 1. Main screen showing audit counts
Figure 2. Patient Match screen
Figure 3. Non-match screen
Figure 4. Followback screen used to record results from the Follow Back Spreadsheet
Results
Audit results for the first two facilities are summarized in the table below.
Lessons Learned
Medical Record Departments don’t always read the documentation.
More audits will need to be performed before the program is configured to
the point where the Auditor can set up an audit without any IT assistance.
Conclusions
By being more precise with the facility information request, grouping the
information into patient sets, and providing screens that allow for faster
DBMS querying, the CRGC was able to save two weeks of an Auditor's
time over the first two audits. We expect subsequent audits to be performed
even faster.
                                        Facility 1   Facility 2
Annual Case Count (facility size)              850          175
Medical Record Disease Index Information
  Months of Information Requested                2            4
  Total Patients Sent                          739        1,846
  Total DX Records Sent                      1,756        9,069
Audit Information
  Patient Matches to DBMS                      739          338
  Non-Matches                                    4        1,508
  Follow Back Cases                             97           46
Time tracking (Hours)
  Estimate of Manual Audit                      38           84
  Actual Time using Program                     24           18
  Estimated time saved using Program            14           66