Despite a number of guidance documents and public statements explaining what regulators expect from manufacturers, many companies continue to struggle with basic data integrity problems.
While data integrity is a complex area of compliance, there are a number of simple ways to help ensure your data meets the fundamental ALCOA attributes (Attributable, Legible, Contemporaneous, Original, and Accurate) and to mitigate data integrity risks. We've highlighted six of those actions below.
Most data integrity problems attributed to human error result from poor data entry and handling. When these errors go unnoticed, they can evolve into much larger compliance issues.
While it may seem like a simple remedy, routine review of practices performed by quality personnel on the manufacturing floor can detect and resolve data-related issues before they develop into something far more serious. A quality professional with deep knowledge of the operations and practices being reviewed is best suited to perform these reviews; however, a designated supervisor or foreman can also be trained to provide this oversight.
The typical manufacturing process involves two main roles: the producer and the verifier. Adding a second round of review to double-check the data allows issues that evaded the verifier to be caught while the producer is still on-site.
When an error is found, the designated quality reviewer can question the personnel involved and either work to resolve the issue on-site or immediately escalate it to a higher authority if technical or procedural problems are discovered.
“One of the simplest, most effective things company leaders can do to stop data integrity problems in their tracks is putting a quality person directly into the manufacturing area to detect and address problems in real time.” - Jose Gutierrez, Consultant at The FDA Group
If, for instance, the problem was found to be mechanical or computer-based, a quality professional would be on hand to diagnose the issue, prepare the necessary documentation, and get the proper specialists and/or data stewards involved at the time of discovery, not days later.
Training personnel in proper data management must go beyond documentation practices to cover why processes are conducted the way they are. This should extend to every process on the manufacturing floor.
“Training has to address the ‘how’ and the ‘why’. Why is a particular ingredient or component added or installed during this phase of production? How does it affect the product? With a depth of knowledge into not only what needs to be done, but why it needs to be done that way, operators naturally become more conscious about the decisions they make in every dimension of work.” - Jose Gutierrez, Consultant at The FDA Group
A good first step to enhancing your training program is explaining the consequences of poor handling for any given process. What happens when things go wrong, and how can the smallest of mistakes snowball into serious problems?
All too often, terms like "validation" and "protocol" exist as abstract concepts. We know we need to conduct validations and follow protocols, but few personnel have a meaningful grasp of why these activities matter.
The more informed operators and technicians are at the point of production and data collection, the fewer human errors they will make.
Changes in the ways laboratories collect, store, and transport data are forcing changes to the ways laboratories operate. Sample analysis is one activity in particular where data integrity issues often arise.
Today, this process results in several types of data illustrated in the chart below.
Automation can also eliminate the need to maintain a hybrid (paper and electronic) system, one of the major culprits behind data integrity issues due to the risks posed by constantly synchronizing the two formats.
Data integrity relies on having reliable master data (a single source used across systems) to use when discrepancies are detected. Spreadsheets should never be used for this kind of data storage as they offer no protection against data corruption and no multi-user or validation capabilities.
With a single click or an errant copy-paste, metadata can be lost in an instant, creating an enormous data integrity problem.
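As a minimal sketch of what "validation capabilities" mean in practice, the snippet below declares a master-data table whose constraints reject a bad entry at write time; a spreadsheet cell would accept the same typo silently. The table and column names are illustrative only, not drawn from any particular LIMS or standard.

```python
import sqlite3

# Hypothetical master-data table for reference samples.
# The constraints encode validation rules the database enforces on every write.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE master_samples (
        sample_id   TEXT PRIMARY KEY,             -- no duplicate identifiers
        analyte     TEXT NOT NULL,
        purity_pct  REAL NOT NULL
                    CHECK (purity_pct BETWEEN 0 AND 100),
        recorded_by TEXT NOT NULL                 -- attributability (the A in ALCOA)
    )
""")

def add_sample(sample_id, analyte, purity_pct, recorded_by):
    """Insert a record; anything violating the declared constraints is rejected."""
    with conn:
        conn.execute(
            "INSERT INTO master_samples VALUES (?, ?, ?, ?)",
            (sample_id, analyte, purity_pct, recorded_by),
        )

add_sample("S-001", "acetaminophen", 99.7, "jdoe")

# A transcription error such as 997 instead of 99.7 is caught at entry time,
# not discovered months later during an audit.
rejected = False
try:
    add_sample("S-002", "acetaminophen", 997, "jdoe")
except sqlite3.IntegrityError:
    rejected = True

row_count = conn.execute("SELECT COUNT(*) FROM master_samples").fetchone()[0]
```

The same principle scales up to a validated LIMS or master data management system: the single source of truth enforces its own rules, so discrepancies are blocked at the point of entry rather than propagated across systems.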
From sample receipt to release of results, mapping the entire laboratory workflow can reveal significant opportunities for consolidation.
Taking all related operations into consideration, simplify your workflow to mitigate data integrity risks and make validation easier.
The lack of standards for managing scientific data poses a major challenge and makes regular auditing and verification absolutely essential. A number of organizations, however, are currently working to standardize these processes. The American Association of Pharmaceutical Scientists (AAPS) offers guidance on Analytical Instrument Qualification (AIQ), which has been incorporated into the United States Pharmacopeia (USP) as General Chapter <1058>.
Additionally, the Allotrope Foundation––an international not-for-profit association––is currently developing a framework for managing laboratory data. This effort is sponsored by many of the leading drug and biotech companies and aims to define a common standard for processing, exchanging, and verifying data in parallel with FDA’s regulatory goals.
This is by no means an exhaustive list of recommendations and advice. Data integrity is a far-reaching area of quality and compliance that requires not only constant diligence, but also an effective control framework throughout your organization.
We've compiled expert insights from some of the industry's top data integrity professionals and wrapped them into a free white paper explaining how to implement just such a system in your organization while mitigating common compliance risks.
Learn how to ensure data integrity throughout your entire organization. Grab our free white paper, Ensuring Enterprise-Wide Data Integrity in FDA-Regulated Industries, a 24-page guide filled with solutions to common compliance problems and a step-by-step process for integrating an effective control framework for data integrity.