Data integrity problems in pharmaceutical quality control laboratories are driving more regulatory action than ever before. What has changed to prompt all this activity? While plenty of information is available, much of it seems to confuse rather than clarify.
Data integrity is a critical aspect of work in pharmaceutical laboratories, ensuring that the data generated during business operations and drug manufacturing is accurate, complete, and reliable. When data is reliable, business owners can make informed decisions, improve product quality, and contribute to overall success.
Data integrity is important because it builds trust with stakeholders and ensures that the information used to evaluate drug safety, efficacy, and quality is trustworthy. For patients, it provides assurance that a pharmaceutical product is as safe as promised and supplies the evidence behind the manufacturer’s guarantee.
As W.E. Deming said,
“Without data, you are just another person with an opinion.”
Let’s explore some common myths of data integrity by looking at facts, based on a study of available resources and direct interactions with U.S. Food and Drug Administration (FDA) staff and their consultants.
Myth: All this regulation around data integrity is new
Data integrity has been a concern for decades. The FDA’s focus on it began with 21 CFR Part 11, which took effect in 1997. In 2003, after the pharmaceutical industry had spent years struggling with the regulation, the FDA released its Scope and Application guidance, clarifying some of the requirements in Part 11. That guidance also described the FDA’s selective enforcement strategy, based on what the administration was finding during its inspections.

In 2010, the FDA announced a focus on data integrity inspections. At that time, however, few people within the FDA were qualified to assess the data integrity aspects of computerised systems. That has since changed: beginning in 2013, data integrity became a primary inspection point, and there has been a visible increase in data integrity enforcement across all geographies. In addition, starting in 2014, as a result of those inspections, the FDA has often named hardware and software products in its warning letters and related public documents, a less-than-subtle message to the makers of those products that the administration expects them to help customers address data integrity and compliance concerns.
Myth: Data integrity is an IT issue
Success in addressing data integrity relies less on technology and more on fostering a culture, organisation, and mindset conducive to excellence. Key contributors to effective data integrity solutions include a shared vision of data integrity practices and a commitment to continuous improvement. Data integrity issues can arise in both paper-based and electronic systems, each presenting unique challenges and requiring tailored remediation strategies. Many remediation efforts overlook the possibility that the same issues occur in paper-based systems, and so fail to conduct risk assessments or identify areas needing remediation there. Compliance and best practices must span data generation, transformation, maintenance, accuracy, and consistency. Cultivating the right culture, assembling capable teams, ensuring transparency in data integrity performance, and aligning company goals with data integrity objectives are all essential components of a successful data integrity initiative.
Myth: Only the software needs to be compliant
Software on its own does not comply with regulations. The software itself is inert; it provides the technical controls that support compliance with the applicable regulations. In addition to technical controls, procedural controls must also be in place. The distinction between procedural and technical controls often appears in FDA warning letters, particularly when gaps in a system’s ability to support the technical controls required by the regulations have been exploited.
A standard operating procedure (SOP), used as a procedural control, can substitute for a technical control as long as:
• People are trained on that SOP
• The SOP is followed
• Adherence to the SOP is confirmed by quality oversight and/or compliance auditing
Often, however, even if SOPs exist, they are not followed, and adherence isn’t properly verified. Consequently, the FDA will demand system remediation to prevent a recurrence of the behaviour. Audit trails within computerised systems are an example of technical controls. The software must be able to generate audit trails that contain all the components the regulations require, and then those controls must be enabled.
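To make the idea of an audit trail as a technical control more concrete, the minimal sketch below shows, in Python, the kind of information a regulation-aligned audit trail entry typically has to capture: who made a change, what changed, when, the old and new values, and why. The class and field names are illustrative assumptions only and do not represent Agilent’s or any other vendor’s software.

```python
# Minimal sketch of an audit trail entry (illustrative only, not a vendor implementation).
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class AuditTrailEntry:
    user_id: str      # who performed the action (attributable)
    action: str       # e.g. "modify", "delete", "reprocess"
    record_id: str    # which record or result was affected
    field_name: str   # which field within the record changed
    old_value: str    # value before the change
    new_value: str    # value after the change
    reason: str       # justification for the change
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # when it happened
    )


# Example: recording a manual reintegration of a chromatographic peak
entry = AuditTrailEntry(
    user_id="analyst_01",
    action="modify",
    record_id="sample-2024-0153",
    field_name="peak_area",
    old_value="1523.4",
    new_value="1547.9",
    reason="Manual reintegration after baseline correction",
)
print(entry)
```

The point of the sketch is simply that every change is attributable, time-stamped, and justified; a compliant system must both capture this information automatically and keep the capability switched on.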
Analytical instrument manufacturers are taking compliance and regulations into account with their products. As an example, Agilent is applying critical thinking to redesigning laboratory software to help respond to new regulatory compliance realities. Many systems generate audit trail reports only in printed form, but the new version of the Agilent OpenLAB Chromatography Data System has a built-in tool that lets a user review electronic audit trail entries on screen. These entries are organised by type, the review can be performed online, and electronic signatures can be incorporated.
If compliance with data integrity regulations is a necessity for your pharma lab, Chemetrix can provide solutions, including instruments and software, that help ensure your data is not only well managed and organised but also kept safe and generated in adherence to the applicable regulatory guidelines.
Data integrity problems can severely impact business operations, leading to financial losses, legal issues, and a damaged reputation. Data integrity forms the foundation of reliable pharmaceutical research, development, and manufacturing and, therefore, should be as error-free and precise as possible. It is more than just a practice; data integrity is the cornerstone of trust and excellence in pharmaceutical labs, paving the way for groundbreaking discoveries and lifesaving innovations.