Optimizing Real World Data Collection: In-Instrument Quality Control in REDCap

Author

Farees Saqlain, Sophia Z. Shalhout, David M. Miller

Published

April 26, 2020

Abstract
A proposed workflow for referral of data quality concerns in a patient registry hosted on REDCap

Objectives for the Optimizing Real World Data Collection Tutorial Series

Overview

Overview of REDCap

In-Instrument Quality Control: Tutorial Overview

Key Points

  • In this post we provide a workflow for referral of data quality concerns up the hierarchy of oversight in the day-to-day operation of a patient registry hosted on REDCap. This can be easily adapted to maintain data integrity and quality control for other research data collection efforts in REDCap.
  • After this lesson, you will:
    • Know how to implement in-instrument quality control fields in your registry.
    • Understand how to use the reports feature in REDCap to facilitate review of quality control concerns by supervising personnel.
  • Skill Level: Intermediate

In-Instrument Quality Control Dashboard

Instruments in REDCap

  • REDCap allows the creation of separate instruments, or forms, for capturing clusters of related clinical information.

    • For example, in a typical tumor registry, there might be an instrument for patient demographics, and separate instruments for capturing details on presentation, individual tumor lesions, and pathology, as illustrated below.

  • REDCap also supports “repeating instruments,” which allow the data abstractor to generate multiple pages (or “instances”) of the same form for data entry. For example, you may anticipate capturing details from different pathology reports on distinct pages within the pathology instrument.
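  • As a sketch of how repeating instances surface in the underlying data: a flat REDCap export adds redcap_repeat_instrument and redcap_repeat_instance columns to each row. The record, instrument, and field names below are illustrative, not taken from the registry in this post.

```python
# Illustrative rows from a flat REDCap export: the base (non-repeating)
# row has empty repeat columns, while each pathology instance gets its
# own row tagged with the instrument name and instance number.
rows = [
    {"record_id": "1", "redcap_repeat_instrument": "",
     "redcap_repeat_instance": "", "sex": "F"},
    {"record_id": "1", "redcap_repeat_instrument": "pathology",
     "redcap_repeat_instance": "1", "path_date": "2019-03-01"},
    {"record_id": "1", "redcap_repeat_instrument": "pathology",
     "redcap_repeat_instance": "2", "path_date": "2019-08-15"},
]

# Collect every pathology page for record 1.
pathology_pages = [
    r for r in rows
    if r["record_id"] == "1" and r["redcap_repeat_instrument"] == "pathology"
]
```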

Data Entry Supervision Hierarchy

  • Although organizational structures for the collection of real-world data (RWD), such as a tumor registry, vary, one common workflow incorporates a three-tiered model, with the degree of disease-specific and registry-specific expertise increasing at each level.
    1. Data Abstractor: Locates and enters targeted data into the registry.
    2. Data Manager: Trains and oversees data abstractors. Primarily responsible for maintenance of data quality in the registry.
    3. Longitudinal Provider: Clinician with first-hand knowledge of the patient’s clinical course.
      - Alternatively, a site may have a Principal Investigator (Site PI) with content expertise and oversight responsibilities, but no direct patient care contact with the data collection subjects.
      • For the remainder of this tutorial, the Longitudinal Provider and Site PI will share the same position in the workflow hierarchy.
  • To ensure data quality, concerns that arise during the abstraction process must be recorded and resolved with the input of supervising personnel.

In-Instrument Quality Control Dashboard

  • The Quality Control Dashboard is situated at the end of every instrument instance (i.e., every page of every form) and offers a means to capture concerns as they emerge during the abstraction process, as well as to document the outcome of the review.

  • Checkboxes are associated with each of the three levels of personnel in the generic registry workflow, outlined above.

  • By checking the corresponding boxes, the Data Abstractor can confirm that data capture for the page is complete, and supervising team members can confirm that they have performed a quality check of the entered data.

Flagging Pages for Review

  • Members of the team can flag the page for review by a supervisor. The Longitudinal Provider (or Site PI) may also flag pages for special consideration.

  • Flagging the record opens an associated text box for entry of further details.

  • Details of flagged records across the entire patient registry are summarized in an accompanying REDCap report.
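  • For illustration, the text box that appears on flagging can be shown or hidden with REDCap checkbox branching logic on the note field. Assuming a flag checkbox named as in the report logic later in this post (a hypothetical “Patient Characteristics” dashboard field ptch_qcdash with option daf), the note field’s branching logic would be:

        [ptch_qcdash(daf)] = '1'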

Flag Reports

  • REDCap allows for the generation of basic summary tables and visualizations through the Reports feature.

  • In a REDCap Report, you can filter which patient records appear in table rows with conditional logic, and select which registry fields are displayed in the table’s columns.

  • Here we have created three reports, corresponding to Flags made by the Data Abstractor, Data Manager, and Longitudinal Provider.

  • For example, the Data Abstractor Flag Report displays the patient record, the instrument, the instance or page number, and the accompanying flag note. The specific column in which the note appears depends on which instrument was flagged, whether “Patient Characteristics,” “Presentation and Initial Staging,” “Lesion Information,” or “Pathology.”

  • Clicking the Record ID hyperlink takes the reviewer directly to the instance of concern.

  • Flagged records drop off reports as soon as a member of the team higher in the supervision hierarchy performs a quality check and resolves the concern.

    • For Data Abstractor Flags, Quality Check by the Data Manager or Longitudinal Provider will drop the record from the report.

      • The conditional logic used in the report to accomplish this is given below, as an example.

        ([ptch_qcdash(daf)] = 1 AND [ptch_qcdash(dmqc)] <> 1 AND [ptch_qcdash(lpqc)] <> 1) OR 
        ([pres_qcdash(daf)] = 1 AND [pres_qcdash(dmqc)] <> 1 AND [pres_qcdash(lpqc)] <> 1) OR 
        ([les_qcdash(daf)] = 1 AND [les_qcdash(dmqc)] <> 1 AND [les_qcdash(lpqc)] <> 1) OR 
        ([path_qcdash(daf)] = 1 AND [path_qcdash(dmqc)] <> 1 AND [path_qcdash(lpqc)] <> 1)
    • For Data Manager Flags, Quality Check by the Longitudinal Provider will drop the record from the summary report.

    • For Longitudinal Provider Flags, Quality Check by the Longitudinal Provider will drop the record from the summary report.
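  • The drop-off rules above can also be mirrored outside REDCap, for example in a post-export audit script. The sketch below assumes REDCap’s flat checkbox export format (one field___code column per option, with “1” when checked); the field names and option codes (daf, dmqc, lpqc) come from the report logic shown earlier.

```python
# Instruments that carry a QC dashboard, per the report logic above:
# Patient Characteristics, Presentation, Lesion Information, Pathology.
INSTRUMENT_PREFIXES = ["ptch", "pres", "les", "path"]

def has_unresolved_abstractor_flag(record: dict) -> bool:
    """True if any page of this record carries a Data Abstractor flag
    (daf) that neither a Data Manager (dmqc) nor a Longitudinal
    Provider (lpqc) quality check has yet resolved."""
    for prefix in INSTRUMENT_PREFIXES:
        flagged = record.get(f"{prefix}_qcdash___daf") == "1"
        dm_checked = record.get(f"{prefix}_qcdash___dmqc") == "1"
        lp_checked = record.get(f"{prefix}_qcdash___lpqc") == "1"
        if flagged and not dm_checked and not lp_checked:
            return True
    return False
```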

Sample Workflow

  • The critical advantage of the described system is that all three levels of the registry team can operate simultaneously, with supervisors reviewing and responding to identified concerns in parallel with ongoing data abstraction. This prevents data acquisition from stalling while ensuring that identified concerns are always addressed.
  • A simplified example of a weekly workflow is illustrated below.

Takeaways

  • Using “in-instrument” quality control fields and accompanying summary reports in REDCap provides an efficient and customizable approach to quality control and maintenance in a tumor registry.

Next in our series of tutorial posts on Optimizing Real World Data Collection: Monitoring Completion of Retrospective Records within a Patient Registry in REDCap