A Look at Data Integrity in Pharma Labs

Data integrity problems in pharmaceutical quality control laboratories are driving more regulatory action than ever before. What has changed to drive all this activity? While plenty of information is available, much of it seems to confuse rather than clarify.

Data integrity is a critical aspect of pharmaceutical laboratory work, ensuring that the data generated during business operations and drug manufacturing is accurate, complete, and reliable. When data is reliable, business owners can make informed decisions, improve product quality, and contribute to overall success.

Data integrity is important because it builds trust with stakeholders and ensures that the information used to evaluate drug safety, efficacy, and quality is trustworthy. For patients, it provides assurance that a pharmaceutical product delivers the safety it promises and supplies the evidence behind the manufacturer's guarantee.

As W.E. Deming said,

“Without data, you are just another person with an opinion.”

Let’s explore some common myths of data integrity by looking at facts, based on a study of available resources and direct interactions with U.S. Food and Drug Administration (FDA) staff and their consultants.

 

Myth: All this regulation around data integrity is new

Data integrity has been a concern for decades. The FDA’s focus on it began with 21 CFR Part 11 in 1997. In 2003, after the pharmaceutical industry spent years struggling with the regulation, the FDA released its Scope and Application guidance, clarifying some of the requirements in Part 11. This guidance also included a discussion of the FDA’s selective enforcement strategy based on what the administration was finding during its inspections. In 2010, the FDA announced its focus on data integrity inspections. At that time, however, few people within the FDA were qualified to understand the data integrity aspects of computerised systems. Beginning in 2013, data integrity has been a primary inspection point, and there has been a visible increase in data integrity enforcement across all geographies. In addition, starting in 2014, as a result of those inspections, the FDA has often included the names of hardware and software products in its warning letters and related public information documents, a less-than-subtle message to hardware and software makers that the administration expects them to assist customers with data integrity and compliance concerns.

 

Myth: Data integrity is an IT issue

Success in addressing data integrity relies less on technology and more on fostering a culture, organisation, and mindset conducive to excellence. Key contributors to effective data integrity solutions include a shared vision of data integrity practices and a commitment to continuous improvement. In both paper-based and electronic systems, data integrity issues can arise, each presenting unique challenges and requiring tailored remediation strategies. Many responses to these issues overlook the possibility of such occurrences in paper-based systems, failing to conduct risk assessments or identify areas for remediation. Compliance and best practices must span data generation, transformation, maintenance, accuracy, and consistency. Cultivating the right culture, assembling capable teams, ensuring transparency in data integrity performance, and aligning company goals with data integrity objectives are all essential components of a successful data integrity initiative.

 

Myth: Only the software needs to be compliant

Software on its own does not comply with regulations. The software itself is inert; it provides the technical controls that support compliance with the applicable regulations. In addition to technical controls, procedural controls must also be in place. A discussion of procedural controls versus technical controls often appears in FDA warning letters, particularly when gaps in a system’s ability to support the technical controls required by various regulations have been exploited.

A standard operating procedure (SOP), used as a procedural control, can substitute for a technical control as long as:

• People are trained on that SOP

• The SOP is followed

• Adherence to the SOP is confirmed by quality oversight and/or compliance auditing

Often, however, even if SOPs exist, they are not followed, and adherence isn’t properly verified. Consequently, the FDA will demand system remediation to prevent a recurrence of the behaviour. Audit trails within computerised systems are an example of technical controls. The software must be able to generate audit trails that contain all the components the regulations require, and then those controls must be enabled.
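To make the idea of a technical control concrete, here is a minimal sketch of what an audit trail record might capture. It is an illustrative example only, not any vendor’s implementation; the field names are hypothetical, and the exact components a system must record depend on the applicable regulation (21 CFR Part 11, for example, expects the who, what, and when of each change, and many firms also record the why).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class AuditTrailEntry:
    """One immutable audit trail record for a change to a GxP-relevant value.

    Illustrative sketch only: field names are hypothetical, not taken from
    any specific chromatography data system.
    """
    user_id: str              # who made the change
    action: str               # what was done, e.g. "modify" or "delete"
    record_id: str            # which record or result was affected
    old_value: Optional[str]  # value before the change
    new_value: Optional[str]  # value after the change
    reason: str               # why the change was made
    timestamp: datetime = field(  # when the change was made (UTC)
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: logging a manual reintegration of a chromatographic peak
entry = AuditTrailEntry(
    user_id="analyst.jsmith",
    action="modify",
    record_id="sample-0042/peak-3/area",
    old_value="10482",
    new_value="10510",
    reason="Manual baseline correction per SOP-CHROM-012",
)
print(entry)
```

A record like this is only useful if it is generated automatically, cannot be edited or deleted by the user, and is actually reviewed, which is where the procedural controls described above come back in.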

Analytical instrument manufacturers are taking compliance and regulations into account with their products. As an example, Agilent is applying critical thinking to redesigning laboratory software to help respond to new regulatory compliance realities. Many systems generate audit trail reports only in printed form, but the new version of the Agilent OpenLAB Chromatography Data System has a built-in tool that allows users to review electronic audit trail entries on screen. These entries are organised by type, can be reviewed online, and can be signed electronically.

Chromatography Data Systems

 

If data integrity compliance is a necessity for your pharma lab, Chemetrix can provide solutions, including instruments and software, that help ensure your data is not only well managed and organised, but also kept safe and generated in adherence to regulatory guidelines.

Data integrity problems can severely impact business operations, leading to financial losses, legal issues, and a damaged reputation. Data forms the foundation of reliable pharmaceutical research, development, and manufacturing and, therefore, should be as error-free and precise as possible. More than just a practice, data integrity is the cornerstone of trust and excellence in pharmaceutical labs, paving the way for groundbreaking discoveries and lifesaving innovations.

 

Residual Solvent Analysis of Pharmaceutical Products

Organic solvents constitute a major fraction of the materials used in the synthesis of pharmaceutical products. The manufacturing process for active pharmaceutical ingredients (APIs) may leave residual solvents in the final product. Producers need to monitor and control the levels of residual solvents for several reasons, including safety, effect on crystalline form, solubility, bioavailability, and stability.

Therefore, all products must be tested to assess whether the solvents used during the manufacturing process are within the accepted limits. Quality assurance laboratories routinely use United States Pharmacopeia (USP) General Chapter <467> for this testing.

 

Procedures for identification and quantification

The USP <467> general chapter classifies solvents by their toxicity, sets concentration limits according to their health hazard, and describes the assay procedures for the solvents. These classes do not list every solvent that may be used in a manufacturing process, so final products should be screened according to the solvents used during their specific manufacturing process.
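To illustrate how those concentration limits relate to toxicity, USP <467> (following ICH Q3C) allows limits to be derived from a solvent’s permitted daily exposure (PDE). Under Option 1, the limit in ppm is 1000 × PDE (mg/day) divided by an assumed 10 g/day dose. The short sketch below applies that formula to a few Class 2 solvents; the PDE values shown are the commonly published ICH Q3C/USP figures, but always confirm against the current chapter before use.

```python
def option1_limit_ppm(pde_mg_per_day: float, dose_g_per_day: float = 10.0) -> float:
    """USP <467>/ICH Q3C Option 1 concentration limit.

    Concentration (ppm) = 1000 * PDE (mg/day) / dose (g/day),
    where Option 1 assumes a 10 g/day daily dose.
    """
    return 1000.0 * pde_mg_per_day / dose_g_per_day

# PDE values as published in ICH Q3C / USP <467>; confirm against the current chapter.
class_2_examples = {"methanol": 30.0, "acetonitrile": 4.1, "toluene": 8.9}

for solvent, pde in class_2_examples.items():
    print(f"{solvent}: {option1_limit_ppm(pde):.0f} ppm")
# methanol: 3000 ppm, acetonitrile: 410 ppm, toluene: 890 ppm
```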

The method is composed of three analytical procedures for identification and quantification.

  • Procedure A: Identification and limit testing. Uses a G43 phase (624-type column).
  • Procedure B: Confirms whether or not an identified solvent is above the regulated limits. Uses a G16 phase (WAX-type column).
  • Procedure C: Quantitative test using a G43 phase or G16 phase, depending on which produced fewer coelutions.

 

USP <467> analytical flowchart for residual solvent analysis.
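The decision logic behind that flowchart can be summarised in a few lines of Python. This is a hedged sketch under stated assumptions, not the chapter itself: the function and variable names are hypothetical, the inputs stand in for results the lab would obtain from Procedures A, B, and C, and the actual acceptance criteria and calculations are defined in USP <467>.

```python
def usp467_decision(procedure_a_ppm, procedure_b_confirms, procedure_c_ppm, limits_ppm):
    """Sketch of the USP <467> decision flow: screen (A), confirm (B), quantify (C).

    Inputs (all hypothetical stand-ins for laboratory results):
      procedure_a_ppm      {solvent: estimated ppm} from the G43 (624-type) screen
      procedure_b_confirms {solvent: True/False}, identity confirmed on the G16 (WAX) column
      procedure_c_ppm      {solvent: quantified ppm} from Procedure C
      limits_ppm           {solvent: concentration limit in ppm}
    """
    # Procedure A: flag any solvent whose estimated level exceeds its limit
    flagged = [s for s, c in procedure_a_ppm.items() if c > limits_ppm[s]]
    if not flagged:
        return "Pass: Procedure A found no solvent above its limit"

    # Procedure B: keep only flags whose identity is confirmed on the alternate phase
    confirmed = [s for s in flagged if procedure_b_confirms.get(s, False)]
    if not confirmed:
        return "Pass: Procedure A flags not confirmed by Procedure B"

    # Procedure C: quantify confirmed solvents and compare against the limits
    failures = {s: procedure_c_ppm[s] for s in confirmed if procedure_c_ppm[s] > limits_ppm[s]}
    return f"Fail: {failures}" if failures else "Pass: confirmed solvents within limits"

# Hypothetical example: acetonitrile flagged in the screen, confirmed, then quantified
print(usp467_decision(
    procedure_a_ppm={"acetonitrile": 450.0, "methanol": 120.0},
    procedure_b_confirms={"acetonitrile": True},
    procedure_c_ppm={"acetonitrile": 430.0},
    limits_ppm={"acetonitrile": 410.0, "methanol": 3000.0},
))
```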

 

Columns for excellent performance

Agilent J&W DB-Select 624 UI columns have shown excellent performance for residual solvent analysis according to USP <467> Procedure A. Repeatability was generally better than 2.5% RSD for Class 1, Class 2A, and Class 2B solvents. Once a residual solvent is identified above the permitted daily exposure (PDE) limit, Procedure B is performed to confirm analyte identity. The Agilent J&W DB-WAX UI GC column has been successfully used as a confirmation column because it offers an alternative selectivity to that of a G43 column.
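For reference, the repeatability figure quoted above is the percent relative standard deviation of replicate injections. A quick way to compute it from peak areas is sketched below, using illustrative numbers rather than data from the column study.

```python
from statistics import mean, stdev

def percent_rsd(values):
    """Percent relative standard deviation: 100 * sample standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical peak areas from six replicate headspace injections of one solvent
peak_areas = [10482, 10510, 10461, 10530, 10495, 10473]
print(f"Repeatability: {percent_rsd(peak_areas):.2f}% RSD")
```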

Agilent J&W DB-Select 624 UI columns

 

Recommended instruments

For this method, Chemetrix can recommend state-of-the-art analytical instruments. With best-in-class technology and powerful software, the Agilent 7697A headspace sampler is packed with the latest productivity-boosting features. Its unique sampling design allows you to use hydrogen as a carrier gas, delivering optimal chromatography and helping to future-proof your lab.

Agilent 7697A Headspace Sampler

 

Based on the Agilent Intuvo 9000 GC system, Agilent Residual Solvent Analyzers are factory pretested and preconfigured to deliver results fast while saving precious startup time. What’s more, their analytical precision exceeds USP method requirements for the three classes of residual solvents. Each analyzer is chemically tested to ensure optimal analysis of Class 1 and Class 2A/B solvents, and labs can begin system calibration and validation immediately after installation.

Agilent Intuvo 9000 GC

 

A critical process

Residual solvent analysis is a must in any manufacturing environment where solvents form part of the production process. Because this analysis is so critical, using the correct instruments, suited to the lab’s requirements, can save time and boost accuracy.

 

Quality control at the heart of it all

At every stage of the quality control process, Chemetrix can assist your lab with full end-to-end solutions for residual solvent analysis. Our team of qualified professionals can offer a comprehensive portfolio of solutions, including different instrument models, software, and consumables, that work together to provide accurate and reproducible results.

 

Looking for more information on Residual Solvent Analysis? Watch our webinar >

 

Keeping Pace with COVID-19 Virtual Symposium

COVID-19 research and testing methods are evolving every day. Join us for an interactive live event to hear the latest on what leading researchers and labs are doing to keep pace.
You will discover best practices for applying technology to support COVID-19 research, test development, and vaccine and drug development, as well as how these workflows can be effectively applied to other infectious diseases.

 

Register to access a diverse lineup of speakers and sessions, including:

  • Collaborating in the fight against COVID-19
  • A next-generation tool bench for virology research
  • Keynote presentation: Viral hijacking of cellular metabolism
  • Seahorse XF reveals bioenergetic impact in virology
  • Rapid functional evaluation of virus-neutralizing antibodies and antiviral drugs using multiparametric live-cell analysis
  • Focus on virology: Applications to enable discovery
  • SLIMS: Easier and faster LIMS deployment for COVID
  • Faster in, faster out: Achieve the most efficient analysis of raw material for vaccine production
  • Agilent BioMDS solutions for vaccine research
  • Agilent GC solution for alcohol-based hand sanitizer testing
  • Analysis of alcohol levels in hand sanitizer formulations using Cary 630 FTIR
  • QA/QC of protection masks

 

Agenda

 

Register Here >