Safeguarding Sample Management, Research Reproducibility, and Data Integrity

Sample integrity and quality are among the most important factors in the success and validity of life science research. Maintaining proper conditions to preserve sample integrity, whether samples are being stored, moved, or used, is a priority throughout the research project life cycle. Sample management and inventory management are common terms in this practice, and they are often treated as synonymous. That assumption is not quite correct: sample management encompasses a sample's inventory data, but it also takes a holistic view that includes complex analytical data sets, experimentation records, and the overall research. In other words, sample management includes inventory management, but inventory management does not cover everything sample management does.

Sample integrity directly influences research reproducibility, a topic of growing importance in the life science research community. Scientists need the means to reproduce experiments and other scientific processes from early ideation through scale-up and manufacturing, and doing so requires access to the original scientific data, research materials, and key protocols. Life science companies that strive to innovate and increase speed to market must have easy access to credible data at all stages of the product life cycle.

The Need for Sample Management and Research Reproducibility

The use, storage, and transportation of biological samples require strict adherence to supply chain and cold chain practices to preserve the samples' structural integrity. This chain of custody matters for all samples and is especially important for temperature-sensitive ones. Processes and procedures should be well documented and accessible to all personnel involved throughout the sample life cycle.

Whether single samples or entire collections are being stored, managed, or moved, defined mechanisms for inventory control and management are a priority at each step in the chain of custody. This ensures samples are safeguarded and mitigates the risks and damage that could otherwise occur.
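
As a purely illustrative example, the short Python sketch below shows the kind of information a chain-of-custody record for a temperature-sensitive sample might capture. The field names, the -70 C threshold, and the CustodyEvent and log_event names are assumptions made for this sketch, not a prescribed format.

    # Hypothetical sketch of a chain-of-custody event for a temperature-sensitive
    # sample. Field names and the -70 C threshold are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class CustodyEvent:
        sample_id: str        # unique identifier for the sample
        action: str           # e.g. "stored", "moved", "retrieved"
        location: str         # freezer, shipper, or bench identifier
        handler: str          # person responsible for this step
        temperature_c: float  # condition recorded at the time of the event
        recorded_at: datetime # timestamp captured when the event occurs

    def log_event(chain: list, event: CustodyEvent, max_temp_c: float = -70.0) -> None:
        """Append an event to the sample's custody chain and flag temperature excursions."""
        chain.append(event)
        if event.temperature_c > max_temp_c:
            print(f"ALERT: {event.sample_id} exceeded {max_temp_c} C at {event.location}")

    # Example: documenting a transfer between freezers.
    chain = []
    log_event(chain, CustodyEvent("S-0042", "moved", "Freezer-B", "j.doe",
                                  -78.5, datetime.now(timezone.utc)))

Recording the handler, location, condition, and timestamp at every step is what makes the chain of custody verifiable after the fact.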

The need for accurate, verifiable data is fundamental to almost every aspect of scientific research. Scientific methodology often depends on researchers reproducing original experiments to build on and improve existing procedures. Unfortunately, in too many instances, findings in biotechnology research and development cannot be reproduced, for a variety of reasons. This wastes time and valuable resources, and it can put the credibility of projects, or even entire organizations, at risk.

The issues that come with a lack of reproducibility are largely not new to scientific companies, but there have been recent industry-wide attempts to better define the term. Organizations such as the American Society for Cell Biology (ASCB) have taken a multifaceted approach to identifying what research reproducibility means for science-based companies.

An in-depth study of research reproducibility and the economic costs of low reproducibility rates estimated that, in the United States alone, roughly $28 billion per year is spent on preclinical research that is not reproducible, and that over half of all preclinical research is irreproducible. For life science organizations, the cost of overcoming this issue, in both money and time, is staggering.

Main Factors Contributing to Lack of Reproducibility

Research scientists can be hindered by many circumstances during experimentation. One of the most significant factors contributing to a lack of research reproducibility is not having access to methodological details, research materials, and data. Well-documented procedures are paramount to reproducibility, as are the actual samples and biological materials used in the original experimentation. The integrity of these samples and materials is intrinsically important when striving to replicate or reproduce research results, and access to the proper materials and data sets removes major impediments to reproducibility.

Technological advancements enable organizations to gather vast amounts of data. The integration of informatics tools and IoT technologies provides increased oversight of many functions in life science research, but researchers still face difficulties when navigating data repositories. Considerable knowledge is needed to analyze and interpret data correctly, and these newer, integrated technologies may not yet have the established, standardized protocols and procedures of prior practices, which can adversely affect data analysis.

When organizations adopt new technologies to improve research reproducibility or procedural oversight, it is important to define the value of the data and oversight these tools enable. With proper due diligence, the data gathered can immensely benefit research reproducibility, fostering a supportive environment for scientific innovation and collaboration.

The Need for Data Integrity

Data integrity should be considered a prerequisite for any sample management technology an organization decides to implement. If data on how samples are stored, managed, and utilized is inaccurate or difficult to analyze, the problem will permeate the product life cycle and create ongoing costs and other issues. With stringent agencies regulating the life science industries, failing to capture accurate, reliable data on sample management can have adverse consequences when an organization is required to demonstrate quality and compliance.

In 2015, 75% of the warning letters sent by the FDA cited issues with data integrity, and that figure continued to increase the following year. In response to this growing problem, the FDA released guidance documentation on data integrity. “Data Integrity and Compliance with Good Manufacturing Practice Guide for Industry” provides key insights to educate the industry on the importance of data integrity and on ways to address common industry challenges.

A key takeaway from this document is the concept of ALCOA, a set of principles for establishing and maintaining data integrity (a brief illustrative sketch follows the list):

  • Attributable: Records identify the specific personnel who performed each task, with timestamps, for accountability and accuracy.

  • Legible: Records should be readable and permanent.

  • Contemporaneous: Data is recorded at the time the measurement or task is performed, preventing retroactive data entry.

  • Original: The first medium on which data is recorded should be preserved and remain accessible throughout the life cycle.

  • Accurate: All data records should be free of errors and bias.
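
To make these principles concrete, here is a minimal Python sketch of how an electronic record might follow ALCOA: each entry is attributable and timestamped, and corrections are appended rather than overwriting the original. The names Entry and Record, and the fields they carry, are hypothetical assumptions for this sketch, not a description of any specific system.

    # Minimal, hypothetical sketch of an ALCOA-style electronic record.
    # The names (Entry, Record) and fields are illustrative assumptions only.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import List

    @dataclass(frozen=True)
    class Entry:
        value: str             # Accurate: the measurement or observation itself
        recorded_by: str       # Attributable: who performed and recorded the task
        recorded_at: datetime  # Contemporaneous: captured when the work is done
        reason: str = "initial entry"  # why a correction was made, if applicable

    @dataclass
    class Record:
        record_id: str
        # Original and Legible: earlier entries are never overwritten; corrections
        # are appended, so the first recording stays preserved and readable.
        entries: List[Entry] = field(default_factory=list)

        def append_entry(self, value: str, user: str, reason: str = "initial entry") -> None:
            self.entries.append(Entry(value, user, datetime.now(timezone.utc), reason))

    # Example: a corrected temperature reading keeps the original entry intact.
    rec = Record("FRZ-12/2024-05-01")
    rec.append_entry("-79.8 C", "j.doe")
    rec.append_entry("-79.6 C", "j.doe", reason="transcription correction")

Because earlier entries are never modified, the original recording stays preserved and legible even after a correction is made.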

There are a variety of ways organizations can foster good practices around data integrity, but a few key tools can be used to improve and streamline how it is managed:

  1. Document control and employee training tools to ensure that regulations for records and documentation storage are followed.

  2. Audit management tools to automate audit processes for internal and external quality and compliance.

  3. Data analysis and integration tools to allow for importing and exporting data, deriving insights, and data security.

  4. Laboratory automation tools for monitoring and information management.
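
As a hedged illustration of the fourth item, the Python sketch below checks a single sensor reading against alarm limits and records the result. The limits, the check_reading function, and the sensor name are assumptions made for illustration, not a description of any particular monitoring product.

    # Hypothetical threshold check for an automated lab monitoring tool.
    # The limits, names, and sensor ID are assumptions made for this sketch.
    from datetime import datetime, timezone

    LOW_LIMIT_C, HIGH_LIMIT_C = -90.0, -70.0  # assumed alarm limits for a -80 C freezer

    def check_reading(sensor_id: str, temperature_c: float) -> dict:
        """Evaluate one sensor reading against its limits and return a log entry."""
        in_range = LOW_LIMIT_C <= temperature_c <= HIGH_LIMIT_C
        entry = {
            "sensor_id": sensor_id,
            "temperature_c": temperature_c,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
            "status": "ok" if in_range else "alarm",
        }
        if not in_range:
            print(f"ALARM: {sensor_id} read {temperature_c} C "
                  f"(limits {LOW_LIMIT_C} to {HIGH_LIMIT_C} C)")
        return entry

    # Example: an out-of-range reading produces an alarm and is still logged.
    log = [check_reading("ULT-Freezer-03", -68.2)]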

Ensuring these systems are properly integrated allows organizations to build better processes, and using the tools' robust capabilities creates an environment that fosters accurate, compliant data.

When organizations focus on improving sample management, research reproducibility, and data integrity, they mitigate many of the risks a life science organization faces and are better positioned to innovate in a fast-paced, ever-changing industry.

To learn more about how XiltriX can ensure your lab’s critical assets and equipment are protected 24/7/365, schedule a free lab consultation with one of our experts.