Common lab informatics questions – part 1: one system or more?

As an information systems consultancy dedicated to successfully delivering lab-based information systems, we help our clients to overcome many different challenges. There are some important questions that we are frequently asked to evaluate.

In part one of this blog series, we’ll summarise the considerations to make when answering 3 common questions about lab informatics systems, all in the theme of ‘is a single system better than multiple similar systems?’

1. Should R&D labs use the same informatics systems as QC?

Here the context matters. If one were to generalise, R&D labs tend to be experiment-based, answering questions like ‘What ingredient changes in the product formulation will increase effectiveness and reduce environmental impact?’. QC labs, on the other hand, are more focused on samples taken from production runs, and on questions such as ‘Is the % composition of key ingredients within a production batch within specification?’

If we use the above generalisation and apply lab informatics thinking, in broad terms, ELNs are centred on recording experiments and therefore more suited to R&D. LIMS, being sample, test and results biased, are generally more suitable to QC labs.

However, it is not that simple. For example, perhaps one of the R&D labs provides analytical services to various teams executing R&D experiments – this type of ‘service’ lab is often better served by LIMS than ELNs.

The type of labs involved is not the only factor to consider. For example, CDS systems are generally applicable to both R&D and QC. The methods and use of the instruments may well vary across R&D and QC, but the instrument data systems can be exactly the same.

Finally, regulatory needs, specifically for QC, can also be a driving factor in answering the question. We will consider this further in one of the following questions.

2. Should we implement a single global system or several more local systems?

When Scimcon first started nearly three decades ago, the focus within large multi-national companies was on implementing large, monolithic lab systems. This approach still has its place, particularly where the distributed labs are very close in terms of operational and analytical workflows.

Current thinking, however, looks to best support the diversity of lab workflows across global sites. While this should not mean a different system in every single lab, it should ensure some flexibility in selecting systems locally. This has several benefits, including a better informatics fit for each lab, and the increased local user buy-in gained by allowing flexibility.

However, against the background of the drive to increased data aggregation, data science and analytics, and AI/ML, this local approach can be counterproductive. It is therefore important to set standards and guardrails about how these systems are implemented, and how the data is structured and linked via reference data, so that consolidation into centralised reporting tools and data lakes is facilitated.
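To make the idea of guardrails more concrete, the sketch below maps local sample-type codes from two hypothetical site systems onto a shared reference vocabulary before records are consolidated into a central data lake. All system names, codes and fields here are invented for illustration; real reference-data models are far richer.

```python
# Illustrative only: harmonising records from two local LIMS into a shared
# vocabulary via reference-data mappings, so they can be consolidated.

REFERENCE_SAMPLE_TYPES = {
    # (local system, local code) -> global reference-data term
    ("site_a_lims", "FG"): "finished_goods",
    ("site_b_lims", "FIN-PROD"): "finished_goods",
    ("site_a_lims", "RM"): "raw_material",
    ("site_b_lims", "RAW"): "raw_material",
}

def harmonise(record: dict, system: str) -> dict:
    """Return a copy of a local record with its sample type mapped to the
    global vocabulary; unmapped codes are flagged for data stewardship."""
    local_code = record["sample_type"]
    global_term = REFERENCE_SAMPLE_TYPES.get((system, local_code))
    return {
        **record,
        "source_system": system,
        "sample_type": global_term or local_code,
        "needs_review": global_term is None,  # flag anything outside the standard
    }

rows = [
    harmonise({"sample_id": "A-001", "sample_type": "FG"}, "site_a_lims"),
    harmonise({"sample_id": "B-204", "sample_type": "RAW"}, "site_b_lims"),
]
```

The point of the sketch is that local flexibility and central consolidation can coexist, provided the mapping to shared reference data is agreed and maintained up front.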

3. Should I have different systems for GxP and non-GxP work?

There is a well-used saying within regulatory-compliant organisations: ‘If a system contains just 1% of GxP data, then the whole system is required to be implemented, managed and maintained in a regulatory compliant manner.’

This statement leaves compliant organisations questioning:

  1. Is it easier to run one regulatory-compliant system, containing both non-GxP and GxP data, accepting that the non-GxP data will also be subject to the associated GxP administrative overheads?
  2. Or is it easier to have two systems, one GxP and the other non-GxP, the latter of which is subject to less rigid controls?

The first step to answering the question is to determine the delta between administering a GxP system and administering a non-GxP system. LIMS, ELN, SDMS, CDS and other lab informatics systems are often classified by labs as mission-critical. Most organisations wouldn’t countenance a lack of system administration rigour, or releasing untested changes to mission-critical systems, so this delta may be lower than it first seems.

The next step is an open conversation with QA teams about the types of data being held, and the control systems that will be put in place. In the past, we have successfully taken a two-tier approach, where the administration procedures for non-GxP are simpler than those for GxP data in the same system. However, for this type of arrangement to be viable, a detailed risk assessment is required, and the ongoing management and control of the administration has to be very well executed.

Finally, before making the decision, it’s worth considering whether there are shared services or functions involved. For example, if the GxP and non-GxP work uses the same inventory management, it might be complex to get the inventory system interfacing and updating two systems simultaneously.

Summary

Hopefully, we have illustrated the importance of being clear about what your requirements are before answering these key questions about lab informatics systems. Each case is unique, and your decision will usually be based on a wide range of influencing factors. We help organisations to consider all of the options and roll out their chosen model.

Stay tuned for part 2 of this blog series, where we will look at the key question of how you can prepare your data for AI and machine learning.

We’re hiring! Scimcon launches its first Graduate Consultant Scheme for scientific and technology graduates

Over the past year we have seen a vast increase in demand for quality individuals to lead and resource laboratory digital transformation projects. 

This increased demand coupled with a desire to develop the next generation of world class consultants has resulted in Scimcon creating its first Graduate Consultant Scheme, for scientific and technology graduates.

The role of a Graduate Consultant at Scimcon

Scimcon is actively partnering with universities and attending graduate recruitment fairs to attract and recruit the right candidates to join the scheme. The candidates will be trained in the multiple disciplines in which Scimcon typically works: Project Leadership, Business Analysis, Solutions Architecture and Computer Systems Validation. They will also gain exposure to a domain that combines both science and technology: from the chemistry of materials science to biologics drug discovery, our teams work in a diverse range of scientific fields.

Scimcon will introduce the successful candidates to a career as a laboratory information systems consultant. Training will be provided in a variety of project settings in multiple industries and with various software vendors. The individuals will shadow our experienced consultants to build knowledge and gain an effective understanding of what it takes to provide insightful, pragmatic and highly valued consultancy services to laboratory-based organizations. Our graduate consultants will work on exciting projects for globally recognized industry names giving them the perfect opportunity to kick start their career. As our customers are based around the globe, there is also an exciting opportunity for successful candidates to travel and work on-site with customers in Europe, the US, and beyond.

Who we’re looking for: are you the right candidate?

As customer-facing consultants, our team demonstrates a particular set of qualities. They are dynamic, enthusiastic, driven and conscientious, with an eye for detail. They have excellent relationship-building skills, but above all they demonstrate integrity consistently. We are looking for individuals who exhibit these same qualities.

If you are graduating in 2022 with a scientific or computer/technology related degree/masters/PhD and you are looking for an exciting career in informatics consultancy, please get in touch with Scimcon’s Head of Operation, David Sanders at dsanders@scimcon.com

Podcast: Scimcon discusses digital transformation

Scimcon has been on quite a journey since its founding in 2000. Our co-founder Geoff Parker recently spoke with John Storton at Yellow Spider Media for its Business Spotlight podcast, where he discussed Scimcon’s experience in informatics projects over the last 21 years, how implementation projects have changed, and trends in digital lab transformation.

You can listen to the discussion below.

Interested in hearing more from Scimcon? Make sure you’re following us on LinkedIn and Twitter for regular updates.

To learn more about digital lab transformation, visit one of our earlier blogs here.

Planning for successful User Acceptance Testing in a lab or clinical setting

What is User Acceptance Testing?

User Acceptance Testing (UAT) is one of the latter stages of a software implementation project. UAT fits in the project timeline between the completion of configuration / customisation of the system and go live. Within a regulated lab or clinical setting UAT can be informal testing prior to validation, or more often forms the Performance Qualification (PQ).

Whether UAT is performed in a non-regulated or regulated environment it is important to note that UAT exists to ensure that business processes are correctly reflected within the software. In short, does the new software function correctly for your ways of working?

Identifying and managing your requirements

You would never go into any project without clear objectives, and software implementations are no exception. It is important to understand exactly how you need software workflows and processes to operate.

To clarify your needs, it is essential to have a set of requirements outlining the intended outcomes of the processes. How do you want each workflow to perform? How will you use this system? What functionality do you need and how do you need the results presented? These are all questions that must be considered before going ahead with a software implementation project.

Creating detailed requirements will highlight areas of the business processes that will need to be tested within the software by the team leading the User Acceptance Testing.

Requirements, like the applications they describe, have a lifecycle and they are normally defined early in the purchase phase of a project. These ‘pre-purchase’ requirements will be product independent and will evolve multiple times as the application is selected, and implementation decisions are made.

While it is good practice to constantly revise the requirements list as the project proceeds, it is often the case that they are not well maintained. This can be due to a variety of reasons, but regardless of the reason you should ensure the system requirements are up to date before designing your plan for UAT.

Assessing your requirements

A common mistake for inexperienced testing teams is to test too many items or outcomes. It may seem like a good idea to test as much as possible, but this invariably means all requirements, from the critical to the inconsequential, are tested to the same low level.

Requirements are often prioritised during the product selection and implementation phases using MoSCoW analysis. This divides requirements into Must-have, Should-have, Could-have and Won’t-have, and is a great tool for assessing requirements in these earlier phases.

During the UAT phase these classifications are less useful. For example, there may be requirements for a complex calculation within a LIMS, ELN or ePRO system. These calculations may be classified as ‘Could-have’, or low priority, because there are other options to perform them outside the system. However, if the calculations are added to the system during implementation, their complexity most likely makes them a high priority for testing.

To avoid this, the requirements, or more precisely their priorities, need to be re-assessed as part of the initial UAT phase.

A simple but effective way to set priority is to assess each requirement against the risk criteria and assign a testing score. The following criteria are often used together to assess risk:

Once the priority of the requirements has been classified the UAT team can then agree how to address the requirements in each category.

  • A low score could mean the requirement is not tested, or is included in a simple checklist.
  • A medium score could mean the requirement is included in a test script alongside several other requirements.
  • A high score could mean the requirement is the subject of a dedicated test script.

Planning UAT

A key question often asked of our team is: how many test scripts will be needed, and in what order should they be executed? These questions can be answered by creating a Critical Test Plan (CTP). The CTP approach requires that you first rise above the requirements and identify the key business workflows you are replicating in the system. For a LIMS these would include:

Sample Creation, Sample Receipt, Sample Prep, Testing, Result Review, Approval and Final Reporting.

Next, the test titles required for each key workflow are added, in a logical order, to a CTP diagram, which assists in clarifying the relationship between each test. The CTP is also a great tool for communicating the planned testing, and helps to visualise any workflows that may have been overlooked.

Now that the test titles have been decided upon, requirements can be assigned to a test title and we are ready to start authoring the scripts.
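To make this concrete, here is a minimal sketch of how a CTP could be represented, using the LIMS workflows named above. The test titles, identifiers and ordering are invented for illustration.

```python
# Illustrative sketch: a Critical Test Plan as an ordered list of workflows,
# each with its test titles in execution order. All titles are hypothetical.

CRITICAL_TEST_PLAN = [
    # (key business workflow, test titles executed in this order)
    ("Sample Creation",  ["UAT-01 Register samples"]),
    ("Sample Receipt",   ["UAT-02 Receive and label samples"]),
    ("Sample Prep",      ["UAT-03 Prepare aliquots"]),
    ("Testing",          ["UAT-04 Enter results", "UAT-05 Instrument capture"]),
    ("Result Review",    ["UAT-06 Review and flag results"]),
    ("Approval",         ["UAT-07 Approve batch"]),
    ("Final Reporting",  ["UAT-08 Generate certificate of analysis"]),
]

# Flatten the plan into the overall execution order for the test team.
execution_order = [title for _, titles in CRITICAL_TEST_PLAN for title in titles]
```

Even a simple structure like this answers both questions at once: counting the titles gives the number of scripts, and the workflow ordering gives the execution sequence.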

Choosing the right test script format

There are several different approaches to test script formats. These range from simple checklists, through ‘objective-based’ scripts, where an overview of the areas to test is given but not the specifics of how to test them, to very prescriptive, step-by-step instruction-based scripts.

When testing a system within the regulated space, you generally have little choice but to use the step-by-step approach.

Test scripts containing step-by-step instructions should have a number of elements for each step:

A typical example is given below.


However, when using the step by step format for test scripts, there are still pragmatic steps that can be taken to ensure efficient testing.

Data Setup – Often it is necessary to create system objects to test within a script. In an ELN this could be an experiment, reagent or instrument, or in ePRO a subject or site. If you are not directly testing the creation of these system objects in the test script, their creation should be detailed in a separate data setup section outside of the ‘step by step’ instructions. Besides saving time during script writing, any mistakes made in the data setup will not be classified as script errors and can be quickly corrected without impacting test execution.

Low Risk Requirements – If you have decided to test low-risk requirements, consider the most appropriate way to demonstrate that they are functioning correctly. A method we have used successfully is to add low-risk requirements to a table outside of the step-by-step instructions. The table acts as a checklist, with script executors marking each requirement that they see working correctly while executing the main body of step-by-step instructions. This avoids cluttering the main body of the test script with low-risk requirements, but still ensures they are tested.

Test Script Length – A common mistake made during script writing is to make scripts too long. If a step fails while executing a script, one of the resulting actions could be to re-run the script. This is onerous enough when you are on page 14 of a 15-page script; it is significantly more time-consuming when you are on page 99 of 100. While there is no hard and fast rule on the number of steps or pages within a script, it is best to keep scripts to a reasonable length. An alternative way to deal with longer scripts is to separate them into sections, which allows the option of restarting the current block of instructions within a script, instead of the whole script.

Are all the requirements covered?

An important task when co-ordinating UAT is to be fully transparent about which requirements are to be tested, and in which scripts. We recommend adding this detail against each requirement in the User Requirements Specification (URS). This appended URS is often referred to as a Requirements Trace Matrix. For additional clarity, we normally add a section to each test script that details all the requirements tested in the script, as well as adding the individual requirement identifiers to the steps in the scripts that test them.
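A minimal sketch of such a trace matrix is shown below, with a simple check for requirements not yet covered by any script. The requirement and script identifiers are invented for illustration.

```python
# Illustrative sketch: a Requirements Trace Matrix as a mapping from
# requirement ID to the test scripts that cover it. IDs are hypothetical.

trace_matrix = {
    "URS-001": ["UAT-01"],
    "URS-002": ["UAT-01", "UAT-04"],
    "URS-003": [],          # not yet assigned to any script
}

# Any requirement with no covering script is a coverage gap to resolve
# (or to justify, e.g. a low-risk requirement deliberately left untested).
untested = [req for req, scripts in trace_matrix.items() if not scripts]
```

Running a coverage check like this before execution starts is far cheaper than discovering a gap during a validation review.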

What comes next?

UAT is an essential phase in implementing new software, and for inexperienced users it can become time-consuming and difficult to progress. However, following the above steps from our team of experts will assist in authoring appropriate test scripts and lead to the overall success of a UAT project. In a future blog we will look at dry running scripts and formal test execution, so keep an eye on our Opinion page for further updates.

Welcome to Scimcon, the Scientific Information Management Consultancy

Digital Laboratory Transformation

With 91 million Google search results for the term ‘digital laboratory transformation’, this area seems to be the buzzword of the 2020s. Scimcon has long been a stalwart in this area, having provided global partnership in information management to big pharma, bio and clinical organizations for over two decades.

The combined experience of our team spans more than 200 years of hands-on project roles in life sciences! What Information Management projects need, Scimcon has delivered: we have seen every type of project from every type of angle and have a unique perspective on how to ensure success.

And digital laboratory transformation is our lifeblood.

So with the establishment of our new website, we are launching a blog to address the subject of the digital lab and eClinical systems, and try to tackle some of the challenges, whilst also dispelling some of the myths.

Analytical projects including Information System Strategies, LIMS, ELN & LES, SDMS, CDS, DMS, Stability Management and Instrument Integration, to name a few

Scimcon’s original pedigree lies in the field of Information Management projects in the analytical laboratory. The company and its project consultants have extensive experience in:

  • LIMS (Laboratory Information Management Systems)
  • ELNs (Electronic Lab Notebooks) & LES (Lab Execution Systems)
  • SDMS (Scientific Data Management Systems)
  • CDS (Chromatography Data Systems)
  • Integration of laboratory instruments
  • Document Management Systems
  • Sample and Freezer Management Systems
  • Biobank Management Systems
  • Stability Management Systems
  • Information System Strategies
  • Project leadership
  • Business analysis
  • Technical specialists
  • User requirements gathering
  • Technical Audits
  • Gap analysis
  • Project requirement scoping
  • Tender and RFP management
  • Information systems implementation management
  • Systems validation
  • Vendor audits and vendor selection
  • Regulatory compliance including ISO 17025
  • E-Signatures and compliance with Part 11
  • Designing and delivering systems training

Clinical projects including eCOA, ePRO, eDiaries, Drug Safety Systems & EDC 

With such an extensive knowledge of Information Management in laboratories and life sciences, Scimcon has more recently been invited to supply similar services to the clinical trials industry, especially in the move from paper to digital records. The development and adoption of new Drug Development Tools (DDTs) lies at the intersection of regulatory, science, academia and pharma drug innovation. Paper is generally considered a poor format for patient compliance in diaries, and electronic data capture improves both compliance and record-keeping in clinical trials. With our relevant experience, Scimcon has been applying its successful project leadership to the areas of:

  • ePRO (electronic patient reported outcomes)
  • eCOA (Electronic Clinical Outcome Assessments)
  • eConsent
  • BYOD (Bring your own devices)
  • Electronic patient diaries
  • Drug Safety Systems
  • EDC (Electronic Data Capture)

Top tips for partnering on projects

Our blogs will help you to navigate the world of outsourcing project leadership for either your analytical or clinical Information Management project.

The first step is to recognize why partnering is helpful:

Successful projects happen when you can trust the partner who is helping you. You do not need to relinquish control of your Information Management projects, but to successfully take them forward a trusted supplier who is not afraid to challenge, and who is vendor-neutral, can bring so much to the table.

In the largest and smallest organizations alike, the knowledge of the organization rests with its employees. This leads to a natural knowledge gap around best practice, external and competitor insight, and learning from others.

Scimcon fills that gap: its combination of deep hands-on experience, information systems skills, extensive scientific systems and organizational experience, and a commitment to project success means that we bring successful projects over the line.

We know the shortcuts, we understand the limitations with vendor capabilities, we recognize the scope of your existing systems and we know how to get the best from your teams. And our 100% year-in, year-out experience as a project partner means that we are always working, always able to move projects forward, even when faced with organizational challenges.

Talk to us now, or contact us to discuss your projects, past and future alike.
