List of Suggested Data Management Standard Operating Procedures for Electronic Data Capture
The following list of Standard Operating Procedures (SOPs) is a suggested set of SOPs for users of electronic data capture (EDC) systems. This is in no way meant as an exhaustive list, but is instead presented as a recommended minimum set of data management procedures. For a complete list of required SOPs, please consult the current regulations and guidelines applicable to your business and/or study(ies). OpenClinica’s professional services team can help you develop SOPs, or review your existing SOPs.
In addition to this list, organizations that use electronic systems for clinical trials should audit the vendor(s) of the software system(s) used to confirm that appropriate development SOPs were in place and followed throughout the development of the software.
SOP | Description |
1. Development and Maintenance of SOPs | Define the SOP template and the development, review, and approval process for all SOPs, including roles/responsibilities, release/distribution requirements, version control, etc. |
2. SOP Deviations | Describe the process for reporting and documenting any deviations from the SOPs. Be sure to address planned as well as unplanned deviations. |
3. Data Privacy and Protection | Describe the process for ensuring data privacy and protection within your organization as well as via your software solution/service (if applicable). |
4. Document/File/Study Binder Management | Describe the process for managing all documents related to study conduct, including the version control process for the Study Binder. Include details on any differences between in-house and CRO-conducted studies. |
5. Data Management Roles and Responsibilities | Clearly define the roles and responsibilities for all users participating in study data management. |
6. Data Management Plan (DMP) | Describe the DMP template. Be sure to include a list of the SOPs to be followed, the clinical data management system to be used, descriptions of data sources, data handling processes, data transfer formats and process, and quality control procedures to be applied. Define the process for developing, approving, and maintaining the Data Management Plan. Include details on version control. |
7. Data Monitoring Plan | Describe the Data Monitoring Plan template, including what the plan is intended to ensure. If partial data monitoring is used, specify exactly what partial monitoring means for the study in question (e.g., 100% monitoring for a defined list of critical data values, 100% verification of 20% of the subjects, etc.); an illustrative sample-selection sketch follows this table. Define the process for developing, approving, and maintaining the Data Monitoring Plan. Include details on version control. |
8. Statistical Analysis Plan | Describe the Statistical Analysis Plan template and define the process for developing, approving, and maintaining the Statistical Analysis Plan; include details on version control. |
9. eCRF Design and Development | Define the process for the design, development, and standardization of eCRFs. Be sure to include details on the approval and version control process. |
10. Study-Specific Database Design | Describe the process for setting up any study-specific attributes (anything outside of your standard eCRFs). This may include annotated CRFs or design documents. |
11. Edit Check/Data Validation Programming | Document the process for creating edit check specifications, as well as edit check development, review and approval, testing, documentation, and version control (an illustrative edit check sketch follows this table). |
12. Study User Acceptance Testing (UAT) | Define the testing and documentation required to demonstrate that the study passed validation. Specify who gives approval for use of the system. Testing should not be performed by the person who built the study database. |
13. Data Entry | Define the process for entering and editing data. The SOP should address general data entry guidelines (inputting scientific symbols where applicable, use of UI features, etc.) as well as how and where to document study-specific guidelines. |
14. Data Receipt and Handling | Define the different means by which data may be received. Be sure to address all types of data receipt: EDC, ePRO, imports, web services, paper, etc. |
15. Discrepancy Management | Define the process for reviewing and resolving data discrepancies, and define roles and responsibilities associated with discrepancy management. |
16. Coding | Define the process for coding adverse events and medications, any review process involved, and the change control or re-coding process. |
17. Serious Adverse Event Reconciliation | Define the process for handling serious adverse events and the reconciliation process between data management and safety surveillance (a reconciliation sketch follows this table). Define any review timeframes and sign-off procedures that may be required prior to locking the database. |
18. Lab Data Management | Define the process for handling laboratory data. If necessary, differentiate between local and central labs, and describe the data import and discrepancy resolution process for each. |
19. Data Extraction and Validation | Define the process for extracting data and the method for verifying that the extracted data match the data entered into the system (a field-comparison sketch follows this table). |
20. Data Transfer and Validation | Define the process for transferring data to other systems and the method for verifying that the transferred data match the data entered into the original system (a checksum-manifest sketch follows this table). |
21. Database Security | Describe the requirements, methods, and tests that ensure your database is secure. This should include username/password requirements, password expiration, means for resetting passwords, how system/study access is granted/revoked, roles and role-based access, etc. (a password-policy sketch follows this table). |
22. Database Lock/Unlock/Closure | Define the process for locking, unlocking, and closing a database. Include details on lower-level locking (e.g., event-level locking) if such methods are used. Address investigator signature requirements prior to locking. |
23. Data Retention and Archival | Define the data retention, archival, and retrieval process. For databases managed by external sources (CRO, hosting service provider), define the process for accessing the database throughout your defined retention period. This should include the clinical data, eCRFs, and discrepancies/resolutions. |
24. CRO and Vendor Management | Detail the CRO/vendor selection and management process. Address sign-off procedures, meeting frequency, metrics, etc. Also address the auditing process and schedule. |
25. Training | Define how data management staff and site staff are trained on the topics in this list (and any other topics as you see fit), how training is documented, re-training requirements, and how training records are maintained. |
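The short Python sketches below illustrate a few of the verifiable steps named in the table (SOPs 7, 11, 17, 19, 20, and 21). They are minimal, hypothetical examples, not OpenClinica functionality: every identifier, threshold, field name, and file layout is an assumption to be replaced by whatever your own SOPs and systems define.

First, for the partial-monitoring language in SOP 7, a reproducible random selection of 20% of subjects for 100% source data verification. The subject list, sampling fraction, and seed are placeholders that the Data Monitoring Plan would document; the fixed seed is used so the selection can be reproduced for audit.

```python
import random

def select_sdv_sample(subject_ids, fraction=0.20, seed=20240101):
    """Draw a reproducible random sample of subjects for 100% source data
    verification under a partial-monitoring plan (hypothetical fraction and
    seed; both should be documented in the Data Monitoring Plan)."""
    rng = random.Random(seed)  # fixed seed so the selection can be reproduced for audit
    k = max(1, round(len(subject_ids) * fraction))
    return sorted(rng.sample(subject_ids, k))

# Hypothetical enrollment list; in practice this would be exported from the EDC system.
enrolled = [f"SUBJ-{i:03d}" for i in range(1, 51)]
print(select_sdv_sample(enrolled))  # 10 of 50 subjects selected for full SDV
```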
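For SOP 11, an edit check specification is easier to review and test when it also exists in executable form. The sketch below implements two hypothetical checks, a range check on diastolic blood pressure and a visit-date/consent-date consistency check, against a dictionary-shaped record; the field names and limits are assumptions and do not reflect OpenClinica's rule syntax.

```python
from datetime import date

def run_edit_checks(record):
    """Apply hypothetical edit checks to one eCRF record (a dict) and return
    a list of discrepancy messages for data management review."""
    discrepancies = []

    # Range check: diastolic blood pressure expected between 40 and 130 mmHg (assumed limits).
    dbp = record.get("dbp_mmhg")
    if dbp is not None and not 40 <= dbp <= 130:
        discrepancies.append(f"Diastolic BP {dbp} mmHg outside expected range 40-130")

    # Consistency check: a visit date should not precede the informed consent date.
    if record["visit_date"] < record["consent_date"]:
        discrepancies.append("Visit date precedes informed consent date")

    return discrepancies

example = {
    "subject_id": "SUBJ-001",
    "consent_date": date(2024, 1, 10),
    "visit_date": date(2024, 1, 5),
    "dbp_mmhg": 135,
}
for message in run_edit_checks(example):
    print(f"{example['subject_id']}: {message}")
```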
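For SOP 17, the reconciliation step is essentially a comparison of SAE listings exported from the clinical database and the safety database, with mismatches routed to data management and pharmacovigilance for follow-up. The sketch below keys the comparison on subject ID and event term; both the key and the field names are assumptions.

```python
def reconcile_saes(clinical_events, safety_events):
    """Compare SAE listings from the clinical database and the safety database,
    keyed on (subject_id, event_term). Returns events present in only one
    source so they can be queried and resolved before database lock."""
    clinical_keys = {(e["subject_id"], e["event_term"].lower()) for e in clinical_events}
    safety_keys = {(e["subject_id"], e["event_term"].lower()) for e in safety_events}
    return {
        "missing_from_safety_db": sorted(clinical_keys - safety_keys),
        "missing_from_clinical_db": sorted(safety_keys - clinical_keys),
    }

# Hypothetical listings; real ones would be exported from each system.
clinical = [{"subject_id": "SUBJ-001", "event_term": "Myocardial infarction"}]
safety = [{"subject_id": "SUBJ-002", "event_term": "Anaphylaxis"}]
print(reconcile_saes(clinical, safety))
```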
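One way to meet the verification requirement in SOP 19 is a record-count check plus a field-by-field comparison of the extracted dataset against the records held in the system. The sketch below compares two lists of dictionaries keyed on a hypothetical subject_id field.

```python
def validate_extract(source_records, extracted_records, key="subject_id"):
    """Verify that an extracted dataset matches the source records: same record
    count, same subjects, and identical field values record by record."""
    issues = []
    src = {r[key]: r for r in source_records}
    ext = {r[key]: r for r in extracted_records}

    if len(src) != len(ext):
        issues.append(f"Record count mismatch: {len(src)} source vs {len(ext)} extracted")
    for missing in sorted(src.keys() - ext.keys()):
        issues.append(f"{missing}: present in source but missing from extract")
    for subject in sorted(src.keys() & ext.keys()):
        for field, value in src[subject].items():
            if ext[subject].get(field) != value:
                issues.append(
                    f"{subject}.{field}: source={value!r} extract={ext[subject].get(field)!r}"
                )
    return issues

# Hypothetical records; the age value was mis-transcribed in the extract.
source = [{"subject_id": "SUBJ-001", "sex": "F", "age": 42}]
extract = [{"subject_id": "SUBJ-001", "sex": "F", "age": 24}]
print(validate_extract(source, extract))
```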
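For SOP 20, a common verification method is to send a checksum manifest with the transfer and have the recipient recompute the checksums before loading the data. The sketch below uses SHA-256 file hashes; the manifest layout is an assumption.

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(files, manifest_path):
    """Sender side: record a checksum for every file included in the transfer."""
    lines = [f"{sha256_of(f)}  {Path(f).name}" for f in files]
    Path(manifest_path).write_text("\n".join(lines) + "\n")

def verify_manifest(manifest_path, received_dir):
    """Recipient side: recompute checksums and report any missing or altered files."""
    problems = []
    for line in Path(manifest_path).read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        received = Path(received_dir) / name
        if not received.exists():
            problems.append(f"{name}: missing from transfer")
        elif sha256_of(received) != expected:
            problems.append(f"{name}: checksum mismatch")
    return problems
```

The sender would run write_manifest over the transfer files, and the recipient would run verify_manifest against the received directory, investigating any reported problems before acknowledging the transfer.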
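The username/password requirements called out in SOP 21 become testable during system validation when expressed as an executable policy. The sketch below checks a candidate password against hypothetical length, complexity, and expiration rules; the specific thresholds are assumptions to be set by your own Database Security SOP.

```python
import re
from datetime import date, timedelta

# Hypothetical policy values; set these per your own Database Security SOP.
MIN_LENGTH = 12
MAX_PASSWORD_AGE = timedelta(days=90)

def password_complexity_issues(password):
    """Return the reasons a candidate password fails the (assumed) policy."""
    issues = []
    if len(password) < MIN_LENGTH:
        issues.append(f"shorter than {MIN_LENGTH} characters")
    if not re.search(r"[A-Z]", password):
        issues.append("no uppercase letter")
    if not re.search(r"[a-z]", password):
        issues.append("no lowercase letter")
    if not re.search(r"\d", password):
        issues.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", password):
        issues.append("no special character")
    return issues

def password_expired(last_changed, today=None):
    """True if the password is older than the maximum allowed age."""
    today = today or date.today()
    return today - last_changed > MAX_PASSWORD_AGE

print(password_complexity_issues("changeme"))                 # fails several rules
print(password_expired(date.today() - timedelta(days=120)))   # True
```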
References and Additional Resources
21 CFR Part 11, US Department of Health and Human Services, Food and Drug Administration, March 1997
Guidance for Industry Part 11, Electronic Records; Electronic Signatures Scope and Application, US Department of Health and Human Services, Food and Drug Administration, August 2003
Guidance for Industry E6 Good Clinical Practice: Consolidated Guidance, US Department of Health and Human Services, Food and Drug Administration, April 1996
Guidance for Industry Computerized Systems Used in Clinical Investigations, US Department of Health and Human Services, Food and Drug Administration, May 2007
PIC/S Guidance Good Practices for Computerised Systems in Regulated GXP Environments, PIC/S, September 2007
Susanne Prokscha, Practical Guide to Clinical Data Management, Third Edition, CRC Press, October 2011