Common lab informatics questions – part 1: one system or more?

As an information systems consultancy dedicated to successfully delivering lab-based information systems, we help our clients to overcome many different challenges. There are some important questions that we are frequently asked to evaluate.

In part one of this blog series, we’ll summarise the considerations involved in answering three common questions about lab informatics systems, all on the theme of ‘is a single system better than multiple similar systems?’

1. Should R&D labs use the same informatics systems as QC?

Here the context matters. If one were to generalise, R&D labs tend to be experiment-based, answering questions like ‘What ingredient changes in the product formulation will increase effectiveness and reduce environmental impact?’. QC labs, on the other hand, are more focused on samples taken from production runs, and on questions such as ‘Is the percentage composition of key ingredients within a production batch within specification?’

If we take the above generalisation and apply lab informatics thinking, in broad terms, electronic laboratory notebooks (ELNs) are centred on recording experiments and are therefore more suited to R&D. Laboratory information management systems (LIMS), being oriented around samples, tests and results, are generally more suitable for QC labs.
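To make that distinction concrete, the sketch below contrasts the two orientations using hypothetical record types (not any vendor’s schema): an ELN record organised around an experiment, and a LIMS record organised around a sample and the tests run against it.

```python
# A minimal sketch of the two data orientations. The record types and
# fields are illustrative assumptions, not a real ELN or LIMS schema.
from dataclasses import dataclass, field


@dataclass
class ElnExperiment:
    """Experiment-centric: hypothesis, procedure and observations come first."""
    experiment_id: str
    hypothesis: str
    procedure: str
    observations: list[str] = field(default_factory=list)


@dataclass
class LimsResult:
    test_name: str                       # e.g. "% composition of ingredient X"
    value: float
    specification: tuple[float, float]   # (lower limit, upper limit)

    def in_spec(self) -> bool:
        low, high = self.specification
        return low <= self.value <= high


@dataclass
class LimsSample:
    """Sample-centric: a production sample with tests and pass/fail results."""
    sample_id: str
    batch_id: str
    results: list[LimsResult] = field(default_factory=list)

    def in_spec(self) -> bool:
        return all(result.in_spec() for result in self.results)
```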

However, it is not that simple. For example, perhaps one of the R&D labs provides analytical services to various teams executing R&D experiments – this type of ‘service’ lab is often better served by LIMS than ELNs.

The type of lab involved is not the only factor to consider. For example, chromatography data systems (CDS) are generally applicable to both R&D and QC. The methods and use of the instruments may well vary between R&D and QC, but the instrument data systems can be exactly the same.

Finally, regulatory needs, specifically for QC, can also be a driving factor in answering the question. We will consider this further under question 3 below.

2. Should we implement a single global system or several more local systems?

When Scimcon first started nearly three decades ago, the focus within large multi-national companies was on implementing large, monolithic lab systems. This approach still has its place, particularly where the distributed labs are very close in terms of operational and analytical workflows.

Current thinking, however, looks to best support the diversity of lab workflows across global sites. While this should not mean a different system in every single lab, it should ensure some flexibility in selecting systems locally. This has several benefits, including a better informatics fit for each lab, and the increased local user buy-in gained by allowing flexibility.

However, against the background of the drive towards increased data aggregation, data science and analytics, and AI/ML, this local approach can be counterproductive. It is therefore important to set standards and guardrails for how these systems are implemented, and for how the data is structured and linked via reference data, so that it can be consolidated into centralised reporting tools and data lakes.
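As an illustration of the kind of guardrail we mean, the sketch below shows how a shared reference-data mapping can let records from two differently configured local systems land in a data lake in one harmonised shape. All site names, test codes and fields here are hypothetical.

```python
# A minimal sketch, assuming two hypothetical sites whose local systems
# label the same test differently. Shared reference data maps both local
# vocabularies onto one harmonised code before loading into a data lake.
REFERENCE_TESTS = {
    "ASSAY-HPLC-01": "HPLC_ASSAY",   # site A's local test code
    "HPLC Assay":    "HPLC_ASSAY",   # site B's label for the same test
}


def to_lake_record(site: str, local_record: dict) -> dict:
    """Map a site-local record onto the shared data-lake schema."""
    return {
        "site": site,
        "sample_id": local_record["id"],
        "test": REFERENCE_TESTS[local_record["test"]],  # reference-data lookup
        "value": float(local_record["value"]),
    }


# Two sites, two local formats, one consolidated shape:
lake = [
    to_lake_record("site_a", {"id": "S-001", "test": "ASSAY-HPLC-01", "value": "99.1"}),
    to_lake_record("site_b", {"id": "B/2024/17", "test": "HPLC Assay", "value": "98.7"}),
]
```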

3. Should I have different systems for GxP and non-GxP work?

There is a well-used saying within regulatory-compliant organisations: ‘If a system contains just 1% of GxP data, then the whole system is required to be implemented, managed and maintained in a regulatory-compliant manner.’

This statement leaves compliant organisations questioning:

  1. Is it easier to run one regulatory-compliant system that contains both non-GxP and GxP data, accepting that the non-GxP data will also be subject to the associated GxP administrative overheads?
  2. Or is it easier to have two systems, one GxP and the other non-GxP, the latter of which is subject to less rigid controls?

The first step to answering the question is to determine the delta between administering a GxP system and administering a non-GxP system. LIMS, ELNs, scientific data management systems (SDMS), CDS and other lab informatics systems are often classified by labs as mission-critical. Most organisations wouldn’t countenance a lack of system administration rigour, or the release of untested changes, in mission-critical systems, so this delta may be smaller than it first seems.

The next step is an open conversation with QA teams about the types of data being held and the control systems that will be put in place. In the past, we have successfully taken a two-tier approach, where the administration procedures for non-GxP data are simpler than those for GxP data in the same system. However, for this type of arrangement to be viable, a detailed risk assessment is required, and the ongoing management and control of the administration have to be very well executed.
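As a rough illustration of the two-tier idea, the sketch below applies stricter change controls to records flagged as GxP than to non-GxP records in the same system. The flags and checks are illustrative assumptions, not a compliance recipe.

```python
# A minimal sketch of two-tier administration in one system: GxP changes
# require testing plus QA approval, non-GxP changes require testing only.
# The fields and rules are illustrative, not a validated procedure.
from dataclasses import dataclass


@dataclass
class ChangeRequest:
    record_id: str
    is_gxp: bool
    tested: bool = False
    qa_approved: bool = False


def change_allowed(change: ChangeRequest) -> bool:
    """Apply the stricter GxP tier only where the data demands it."""
    if change.is_gxp:
        return change.tested and change.qa_approved
    return change.tested


# A tested GxP change still needs QA approval; a tested non-GxP change does not.
assert not change_allowed(ChangeRequest("R-1", is_gxp=True, tested=True))
assert change_allowed(ChangeRequest("R-2", is_gxp=False, tested=True))
```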

Finally, before making the decision, it’s worth considering whether there are shared services or functions involved. For example, if the GxP and non-GxP work uses the same inventory management, it might be complex to get the inventory system interfacing and updating two systems simultaneously.
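To see why, consider the sketch below: every inventory movement has to be pushed to both systems, and a failure in either call leaves the two out of step. The client classes are hypothetical, not a real integration API.

```python
# A minimal sketch of the dual-update problem with hypothetical clients.
# In practice each downstream call can fail independently, so a real
# integration needs retries and reconciliation to keep systems consistent.
class SystemClient:
    def __init__(self, name: str):
        self.name = name
        self.stock: dict[str, int] = {}

    def update_stock(self, item: str, delta: int) -> None:
        self.stock[item] = self.stock.get(item, 0) + delta


def broadcast_stock_change(item: str, delta: int, systems: list[SystemClient]) -> None:
    """Push one inventory movement to every downstream system."""
    for system in systems:
        system.update_stock(item, delta)


gxp_system = SystemClient("gxp_lims")
non_gxp_system = SystemClient("non_gxp_eln")
broadcast_stock_change("reagent_x", -5, [gxp_system, non_gxp_system])
```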

Summary

Hopefully, we have illustrated the importance of being clear about what your requirements are before answering these key questions about lab informatics systems. Each case is unique, and your decision will usually be based on a wide range of influencing factors. We help organisations to consider all of the options and roll out their chosen model.

Stay tuned for part 2 of this blog series, where we will look at the key question of how you can prepare your data for AI and machine learning.

Industry leader interviews: Pascale Charbonnel

Our latest industry leader interview is with Pascale Charbonnel, who tells us about how SCTbio supports customers through the cell therapy manufacturing chain.

In this instalment of our industry leader series, we speak to Pascale Charbonnel, Chief Business Officer of SCTbio. Pascale tells us about the work of SCTbio, how they collaborate with biotech developers, and why they are a great choice for outsourcing cell and gene therapy (CGT) manufacture.

Tell us about SCTbio

SCTbio is a cell-based therapy and viral vector contract development and manufacturing organisation (CDMO). Originally part of the SOTIO group, we spun out in 2022 and operate a Good Manufacturing Practice (GMP) facility in Prague, Czech Republic. Recently, eureKING, a French special purpose acquisition company, or SPAC, has signed an agreement to purchase full ownership interest in SCTbio, which will further bolster our position as a leading CDMO service provider.

As part of SOTIO group, we were developing our own cell and gene therapies for 13 years, so we have a lot of experience in manufacturing for clinical trials from phase I to phase III across multiple geographies. Given this expertise, customers trust us to guide them through the development process as they navigate the GMP world and clinical development.

What kind of customers do you support, and how do you support them?

Our target customers are mainly early-stage biotechnology companies, who typically outsource all their production needs. We are sometimes also used as an additional facility to absorb around 20-30% of the production needs for large phase II and phase III programmes. Our main goal is to establish trust with customers right from the beginning, so we can then support them as the project progresses through later clinical phases. The average customer project takes about two years.

With our history in SOTIO, we can ensure GMP compliance for the full drug development life cycle as we have also faced some of those same hurdles associated with developing therapeutics. Our team understands the importance of saving time and costs, and maintaining momentum to ensure approvals run smoothly and that we can move on to the next clinical stage. We use this experience to create optimised development plans, which give customers the assurance that we can support them and hopefully go on this journey with them for many years to come.

How do you manage the data you generate for customers, and what formats do you report in?

We are still very much in a mixed model – so we have turned to electronic systems in some cases, but we do still have paper-based approaches too. It’s useful to have both, as it means we can tailor our approach depending on customer requirements. We’ve built our own data management system, which has been developed specifically to fit our operation here – so while there is scope for us to move to a full digital system, it will take time and our customers’ current requirements do not warrant that.

When it comes to customer data, we typically start by storing the raw data in a validated platform which we can then manage regularly. We then export it to the customer in whatever format they wish. As each customer’s requirements differ greatly, there’s no need for us to move to full digital systems yet, but it’s definitely something we’re bearing in mind for the future.
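As a simple illustration of that export step, the sketch below renders the same raw records in whichever format a customer requests. The field names are assumptions for illustration only.

```python
# A minimal sketch: one set of raw records, exported on demand in the
# customer's preferred format. Record fields are hypothetical examples.
import csv
import io
import json

RAW_RECORDS = [
    {"batch": "B-001", "parameter": "viability_pct", "value": 96.4},
    {"batch": "B-001", "parameter": "cell_count", "value": 1.2e8},
]


def export_records(records: list[dict], fmt: str) -> str:
    """Render the stored raw data as JSON or CSV, per customer request."""
    if fmt == "json":
        return json.dumps(records, indent=2)
    if fmt == "csv":
        buffer = io.StringIO()
        writer = csv.DictWriter(buffer, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)
        return buffer.getvalue()
    raise ValueError(f"unsupported format: {fmt}")
```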

What does a typical audit look like, and how do you ensure success?

Since last year, we’ve undergone four audits – three by customers, one by a regulatory body. They all follow a similar process: we receive a request or announcement about two weeks in advance that an auditor is going to visit, and they usually request specific documentation, which of course we already have to hand. During the day they will look at everything in our facility, speak to some of our technical staff, and then make a report outlining any observations.

GMP culture is very deeply rooted in our company, to the point where our recent regulatory audit returned no observations at all! While this shows everything was as expected, our customers were particularly impressed. One of our customers came back to us following their audit to say that they can see we go above and beyond the standard for GMP, and that our team is clearly well organised and collaborative.

How does SCTbio stand out as a CDMO?

One thing I think really makes us special is our people. We are a team of about 80 people, many of whom have been with us since the inception of SOTIO, and the staff turnover rate is very low indeed. It gives our customers a great deal of assurance that as well as having far-reaching experience in developing drugs and a deeply rooted GMP culture, our people are committed to our customers and get to know them and their needs.

What sets us apart is our 13 years’ expertise in the CGT field and our flexibility to accommodate projects of different sizes and stages. We plan to stay very flexible, so that we can continue to take a bespoke approach to supporting our customers.

In addition, we offer a really wide range of services. We can collect the starting material, process it in our facility, and release it under qualified person/quality assurance (QP/QA) and GMP conditions. We also have a logistical advantage: being based in central Europe, we are close to a number of key markets. Being able to offer a full start-to-finish process in one place is quite unusual, so it gives us a strong advantage.

In my eyes, the recipe for success as a CDMO is mutual trust and transparent communication with partners and customers. With highly skilled people, low turnover and the cost benefits of our location, our customers rely on us for consistency, reliability, and quality.

What do you think the future holds for cell and gene therapy?

The market has faced many challenges over the last few years, but we’re now starting to see an upturn. Funding is becoming available again, and we believe that ‘the good science’ will prevail. We’re excited to see what projects will come our way and to keep supporting customers to develop life-changing medicines.

Scimcon is proud to showcase CDMOs like SCTbio, and we’re looking forward to seeing how the company will grow over the coming years. To contribute to our industry leader blog series, or to find out more about how Scimcon supports organisations with lab informatics and data management solutions, contact us today.

The evolution of pharmacovigilance

Jamie, please tell us a bit about yourself and your background 

My name is Jamie Portnoff, and I am the founder and principal consultant at JMP Consulting. JMP Consulting assists clients in the pharmaceutical industry to achieve and sustain compliance and improve overall performance in pharmacovigilance (PV) and related functions like quality, medical information and regulatory affairs. Before founding JMP Consulting, I worked in the pharmaceutical industry. Not many management consultants working in PV have hands-on, real-world PV experience; this experience means I understand the realities of day-to-day work in and around PV, and how challenging it can be to deliver against requirements and expectations. In my earliest days in industry, I especially enjoyed working with people and on projects, and I soon realised that I wanted to marry my problem-solving and analytical skills with my practical industry knowledge. After a few years of working with big consultancy companies, I decided to start JMP Consulting.

Big changes are coming in PV, but before we look at the future, we need to understand the past

Let us look at the last three decades.

In the 1990s there were basic PV safety database systems, such as ArisG, ArisLite and ClinTrace. Fax machines were a huge part of the tech that enabled PV processes, with a high volume of incoming and outgoing data sent by fax. Processes were extremely paper-intensive and were designed to accommodate transactional work, such as processing of cases and putting aggregate reports together; everything was very compliance-focused. Consequently, there was demand for full-time roles dedicated to paper management, typing up documents and data entry. Teams were typically regionalized, and everything was done “onshore”.

In the 2000s, PV technology became more sophisticated and more globally oriented. There were advances in what the technology could do, and consolidation of major tech players due to M&A activity. Paper-based processes began to give way to more digitization and electronic workflow management. Analytics tools became more prevalent and more user-friendly. However, a typical PV department was still very paper-intensive. Some of the regionalized models began to consolidate to one system, one process, and one organization, particularly between the US and Europe.

Throughout this decade, more stringent regulatory requirements were continually being introduced, such as the Risk Evaluation and Mitigation Strategy (REMS) in the US, as well as Volume 9A in Europe. Consequently, the bar was being raised for the calibre of work, and quality management expectations were increasing. We saw more focused teams dedicated to signal detection and risk management, and specialized teams emerged to manage increasing business system needs as the regulatory requirements led to increasingly complex systems. Dedicated vendor oversight teams were also required as companies began to work offshore with vendors.

Over the last decade, good pharmacovigilance practices (GVP) were introduced in the European Union (EU). The Qualified Person for Pharmacovigilance (QPPV) is not a new requirement, but it became clear that this person needs a whole team around them to support them and help shoulder the workload.

Offshore work has grown in magnitude, partnerships between companies have become an integral part of how business is done, and next generation technology is rolling out to improve efficiency and consistency. Safety systems have become truly global, enabling a scalable end-to-end safety process within a single system.

Figure: Illustrated examples of the way the world of PV has evolved.

What will happen with the advent of next-generation technology?

Big changes are coming with PV technology, which will drive major shifts in the way we think about how PV work gets done. We have seen evolution in PV technology before, but it seems this time around will be more impactful than anything from the past 20 years.

With the advent of next-generation technology, new hard skills will be required, such as understanding of machine learning, natural language processing and artificial intelligence. Organizations need to be able to manage transformation of the PV business effectively and regularly, and leverage advanced analytical tools to derive meaningful insights from various data sets. Additional ‘soft’ skills will also be needed, such as adaptability, flexibility, open-mindedness as well as the ability to ‘think outside the box’ to drive improvements through innovative thinking.

New roles will emerge within the organisation, with specialists dedicated to areas such as machine learning, natural language processing and advanced analytics. Meanwhile, other roles will fade out, and teams of people (in-house or outsourced) performing transactional activities will become a thing of the past.

What does it mean to be future-ready in pharmacovigilance?

From a process perspective – Processes must be highly scalable to accommodate growth in volume and complexity, and a blend of proven and cutting-edge technology is needed to support and enable this. A future-ready process has metrics to enable continuous improvement; it can efficiently evolve and adapt to simultaneously accommodate new regulations, innovative products and evolving stakeholder expectations.

From a technology perspective – Highly agile, flexible and robust, technology needs to be business-led with strong IS support and should be woven into an organisation’s processes, not vice versa.

From a people perspective – People in the organisation must accept increasing automation of processes – you can have the best technology in the world, but if the people in the team are rejecting it, it is not going to be successful. Well-managed resource models are also hugely important.  The organisational structure must be designed around the business’ needs, not vice versa. Employees should offer more than one skillset and in return, they must have a pathway to develop professionally. It is critical that a team can approach things from different angles and can adapt to change – these days excelling in just one area is often not enough.
