Common lab informatics questions – part 1: one system or more?

As an information systems consultancy dedicated to successfully delivering lab-based information systems, we help our clients to overcome many different challenges. There are some important questions that we are frequently asked to evaluate.

In part one of this blog series, we’ll summarise the key considerations when answering three common questions about lab informatics systems, all on the theme of ‘is a single system better than multiple similar systems?’

1. Should R&D labs use the same informatics systems as QC?

Here, context matters. If one were to generalise, R&D labs tend to be experiment-based, answering questions like ‘What ingredient changes in the product formulation will increase effectiveness and reduce environmental impact?’. QC labs, on the other hand, are more focused on samples taken from production runs, and questions such as ‘Is the percentage composition of key ingredients within a production batch within specification?’

If we use the above generalisation and apply lab informatics thinking, then in broad terms ELNs are centred on recording experiments and therefore more suited to R&D. LIMS, being sample-, test- and results-biased, are generally more suitable for QC labs.

However, it is not that simple. For example, perhaps one of the R&D labs provides analytical services to various teams executing R&D experiments – this type of ‘service’ lab is often better served by LIMS than ELNs.

The types of lab involved are not the only factor to consider. For example, chromatography data systems (CDS) are generally applicable to both R&D and QC. The methods and use of the instruments may well vary across R&D and QC, but the instrument data systems can be exactly the same.

Finally, regulatory needs, specifically for QC, can also be a driving factor in answering the question. We will consider this further in question 3 below.

2. Should we implement a single global system or several more local systems?

When Scimcon first started nearly three decades ago, the focus within large multi-national companies was on implementing large, monolithic lab systems. This approach still has its place, particularly where the distributed labs are very close in terms of operational and analytical workflows.

Current thinking, however, looks to best support the diversity of lab workflows across global sites. While this should not mean a different system in every single lab, it should ensure some flexibility in selecting systems locally. This has several benefits, including a better informatics fit for each lab, and the increased local user buy-in gained by allowing flexibility.

However, against the background of the drive to increased data aggregation, data science and analytics, and AI/ML, this local approach can be counterproductive. It is therefore important to set standards and guardrails about how these systems are implemented, and how the data is structured and linked via reference data, so that consolidation into centralised reporting tools and data lakes is facilitated.
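
To make that concrete, the sketch below shows one form such a guardrail can take: mapping site-local test codes onto shared reference data before results are loaded into a central reporting layer. It is a minimal illustration in Python; the codes, units and mapping tables are invented, not taken from any particular system.

    # Illustrative only: harmonising site-local LIMS codes against shared
    # reference data before loading results into a data lake. All codes,
    # units and mappings below are invented for the example.
    REFERENCE_TESTS = {
        "NA_SERUM": {"name": "Sodium, serum", "unit": "mmol/L"},
        "K_SERUM": {"name": "Potassium, serum", "unit": "mmol/L"},
    }

    # Each site's system keeps its own codes; the guardrail is that every
    # local code must map onto an entry in the shared reference data.
    SITE_CODE_MAP = {
        "site_a": {"NA": "NA_SERUM", "K+": "K_SERUM"},
        "site_b": {"SOD": "NA_SERUM", "POT": "K_SERUM"},
    }

    def to_canonical(site: str, local_code: str, value: float) -> dict:
        """Translate a site-local result into the shared reporting model."""
        ref = REFERENCE_TESTS[SITE_CODE_MAP[site][local_code]]
        return {"test": ref["name"], "unit": ref["unit"], "value": value}

    print(to_canonical("site_b", "SOD", 141.0))
    # {'test': 'Sodium, serum', 'unit': 'mmol/L', 'value': 141.0}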

3. Should I have different systems for GxP and non-GxP work?

There is a well-used saying within regulatory-compliant organisations: ‘If a system contains just 1% of GxP data, then the whole system is required to be implemented, managed and maintained in a regulatory compliant manner.’

This statement leaves compliant organisations questioning:

  1. Is it easier to run one regulatory-compliant system that contains both non-GxP and GxP data, accepting that the non-GxP data will also be subject to the associated GxP administrative overheads?
  2. Or is it easier to have two systems, one GxP and the other non-GxP, the latter of which is subject to less rigid controls?

The first step to answering the question is to determine the delta between administering a GxP system and administering a non-GxP system. LIMS, ELN, SDMS, CDS and other lab informatics systems are often classified by labs as mission-critical. Most organisations wouldn’t countenance a lack of system administration rigour, or releasing untested changes, for mission-critical systems, so this delta may be smaller than it first seems.

The next step is an open conversation with QA teams about the types of data being held, and the control systems that will be put in place. In the past, we have successfully taken a two-tier approach, where the administration procedures for non-GxP are simpler than those for GxP data in the same system. However, for this type of arrangement to be viable, a detailed risk assessment is required, and the ongoing management and control of the administration has to be very well executed.

Finally, before making the decision, it’s worth considering whether there are shared services or functions involved. For example, if the GxP and non-GxP work uses the same inventory management, it might be complex to get the inventory system interfacing and updating two systems simultaneously.

Summary

Hopefully, we have illustrated the importance of being clear about what your requirements are before answering these key questions about lab informatics systems. Each case is unique, and your decision will usually be based on a wide range of influencing factors. We help organisations to consider all of the options and roll out their chosen model.

Stay tuned for part 2 of this blog series, where we will look at the key question of how you can prepare your data for AI and machine learning.

Reducing laboratory carbon footprint in biotech and pharma

My Green Lab – a non-profit organisation focused on improving the sustainability of scientific research – recently reported that the carbon footprint produced by the biotech and pharmaceutical industry (including laboratories) increased from 3.9 percent in 2021 to 5 percent in 2022.

But more and more companies are committing to the UN’s Race to Zero campaign, which aims to halve total carbon emissions by 2030 and reach net zero emissions by 2050.

In addition to reducing Scope 1 emissions (direct emissions from owned or controlled sources) and Scope 2 emissions (indirect emissions from the purchase and use of electricity, steam, heating and cooling), there is a growing focus on Scope 3 emissions (indirect emissions that occur in the upstream and downstream activities of an organisation).

My Green Lab found that overall, Scope 3 emissions are 4.6 times greater than Scope 1 and 2 combined in the biotech and pharma sector. The impact of this is that pressure to reduce carbon use is being applied down the supply chain, impacting labs at every phase of development, scale-up and manufacturing.

According to CPHI’s 2023 annual survey, 93 percent of executives state that ‘visibility on supply chain partner’s sustainability record’ is either ‘extremely important’ or ‘important’.

There are a number of ways in which laboratories can demonstrate their commitment to sustainability – and help the organisations they provide services to reduce their Scope 3 emissions. Some to consider include:

  1. Obtain My Green Lab certification: considered the gold standard for laboratory sustainability best practices around the world, the program provides scientists and the teams that support laboratories with actionable ways to make meaningful change.
  2. Switch to laboratory products that have the ACT Environmental Impact Factor Label: by emphasizing Accountability, Consistency, and Transparency (ACT) around manufacturing, energy and water use, packaging, and end-of-life, ACT makes it easy to choose more sustainable products.
  3. Identify opportunities for energy efficiency in the laboratory: the Center for Energy Efficient Laboratories (CEEL) provides useful reports and advice.
  4. Join the Sustainable European Laboratories Network: a network of local sustainability teams as well as independent ‘green labs’ networks, which aims to transform the way science is done so that it better responds to the environmental challenges of our era.
  5. If your lab is part of an academic institution, consider joining the LEAF Programme, a standard set by University College London – and followed by 85 global institutions – to improve the sustainability and efficiency of laboratories.

There are many other networks, initiatives and accreditations aimed at helping labs become more sustainable. Tapping into these resources, as well as finding ways to make your lab more efficient, can help you to both reduce carbon emissions and save costs. Importantly, it will ensure your lab does not lose out in future when sustainability becomes a deciding factor in procurement.

Scimcon continues its commitment to reducing its carbon footprint, having signed up to the Science Based Targets initiative (SBTi) and set a reduction target, while also gaining a sustainability award from EcoVadis. As we continue to add value in the complex lab informatics field, we work closely with our clients to detail Scimcon’s Scope 3 assessments and action plans.

Navigating change successfully in the lab

Change is both inevitable and constant in the modern lab – evolving regulatory requirements demand greater analytical sensitivity or more rigorous reporting; new instruments are launched to tackle the increasingly complex analytical problems faced by scientists; and digital transformation is now considered a necessary step for labs processing and communicating huge amounts of data between systems and software.

Despite its importance, change can be daunting for scientists. Change management is crucial to implementing meaningful and successful change, yet it is often neglected or not applied in full across the lab.

Our infographic highlights the key considerations for labs embarking on change, and how to ensure that change is managed successfully across all facets of the lab environment.

Scimcon describes key change management factors for lab managers to consider when undertaking an informatics software implementation.

For more recommendations about how to successfully manage change in an informatics software project, visit our blog: Breaking the change management mould – leading successful laboratory information system projects and digital transformations.

Breaking the change management mould – leading successful laboratory information system projects and digital transformations

Laboratory-based organisations have consistently undergone change, whether provisioning new analytical techniques and instrumentation, implementing new information systems, or incorporating new regulatory requirements. This is especially true today, as we undertake initiatives such as digital transformation and the introduction of AI/ML. In fact, one definition of transformation is ‘a radical change’.

What’s clear is that change is constant. However, managing change effectively is essential to success when undergoing these types of projects. Well-run lab informatics projects manage change within the software project lifecycle. Examples of project change include adjusting functional scope; raising change requests as functionality is demonstrated; and variation of costs. Yet, one key area of change is often neglected.

The problem arises when change management for lab informatics projects focuses solely on the technical delivery of the software. In these cases, very little effort is allocated to the change that will need to occur within the laboratory to accommodate the new system. If lab change management is considered at all, it is often dealt with ad hoc and separately from the software delivery part of the project, leading to misalignment, misunderstanding, and missed timelines.

75% of the lab is indifferent to your project.

Lab Manager reports that in a typical change environment, 25% of staff will be early adopters, 25% will actively resist change, and about 50% will be ‘on the fence’ in the early stages [1].

These statistics are backed up by experience. Scimcon is often called in to resolve issues within ‘in-flight’ informatics projects. All too often, the root cause analysis reveals that the lab community understood the true impact of the new system too late to adopt it, adapt lab workflows, and change procedures. Rectifying the issues after the fact is seldom quick or low-cost.

Informatics projects don’t operate in a vacuum.

Informatics software does not function in isolation, so change management needs to consider the physical working procedures, workflows, SOPs, job roles, quality system, and other areas that will be impacted within the laboratory.

For example, the implementation of a new LIMS could trigger changes such as:

  1. Updates to SOPs and physical working procedures.
  2. Revised lab workflows around sample login, testing, review and reporting.
  3. Changes to job roles and responsibilities.
  4. Amendments to the quality system and associated training.

Given that a lab informatics project will generate a large number of change items similar to the above examples, they must be managed appropriately.

In many respects, these changes are very similar to a system’s user requirements, except they are related to the lab processes as opposed to software functionality. With this in mind, they need to be handled in a similar fashion. Create a team with a project lead and subject matter experts who represent the laboratory. The lab change team should be tasked with actively gathering and maintaining the backlog of change items throughout the project life cycle. Each change should be assessed for impact and priority, added to the change management plan, and allocated to team members to be actioned.
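
Because these lab change items behave like requirements, they can be tracked with the same lightweight tooling. The sketch below is a minimal illustration of such a backlog in Python; the fields and example entries are invented for the purpose.

    # Illustrative only: a minimal backlog structure for lab change items,
    # mirroring the impact/priority/owner process described above.
    from dataclasses import dataclass

    @dataclass
    class LabChangeItem:
        description: str
        impact: str            # e.g. "high", "medium", "low"
        priority: int          # 1 = most urgent
        owner: str = "unassigned"
        status: str = "open"

    backlog = [
        LabChangeItem("Update sample receipt SOP for barcode-driven login",
                      impact="high", priority=1, owner="QA lead"),
        LabChangeItem("Revise result-review workflow for electronic sign-off",
                      impact="medium", priority=2),
    ]

    # Review the backlog in priority order at each project checkpoint.
    for item in sorted(backlog, key=lambda c: c.priority):
        print(f"[{item.status}] P{item.priority} {item.description} ({item.owner})")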

Planning for change starts early.

Before making any significant lab informatics investment within an organisation, it is likely a business case will be required. If you are serious about managing all aspects of change, this is where you should begin. Business cases generally do an excellent job of covering benefits, costs, and ROI – however, change management, specifically within the physical lab, is often not called out in terms of impact, approach, or, importantly, the resources and associated costs.

Not highlighting the lab change management process, resources and costs at this stage will make it considerably more difficult for change management to become embedded in your project at a later stage.

Benefits of effective change management.

The benefits of effectively integrating laboratory change management alongside traditional change management for lab informatics projects cannot be ignored. New systems can get up and running faster, and can, importantly, deliver improved lab processes and be met with enthusiasm rather than reluctance, scepticism, or apprehension.

Scimcon consultants are on-hand to support lab leaders overseeing change. As many of our consultants have lab experience themselves, they have seen first-hand the impact of change in the lab, and can provide in-depth knowledge on how to ensure success.

For more information about how Scimcon can support your next big project, contact us.

References:

  1. ‘A Guide to Successful Change Management’ Lab Manager, https://www.labmanager.com/a-guide-to-successful-change-management-27457 [accessed 02/11/23]

Digital Transformation in the lab: where to begin?

Digital transformation is not a new concept; it is simply expanding the use of technology as it advances. Today’s laboratory users expect a certain level of usability and synchronicity. After all, in other aspects of their daily lives they are accustomed to having, for example, a seamless digital shopping experience via Amazon.

So, with demand for digital transformation coming from the lab users themselves, and often from the organisation, establishing what it really means to you and what’s achievable, as well as where you are already on the path to digital transformation, is a useful starting point.

What is digital transformation in the lab?

Digital transformation requires constantly improving the environment and the platforms in the lab to give scientists the best tools possible and make their lives easier. It’s not a single project or something that will be completed in a year or two.

For some organisations, the first step on their digital transformation journey might be putting in a new LIMS or ELN – which drastically improves their operations, but could be a huge undertaking depending on the scale of the organisation and the legacy infrastructure. For others, it might be establishing the tools and connections to enable, for example, online monitoring of instrument status, automatic ordering of consumables, reserving instrument time and auto-tracking utilisation. Plus, there are many iterations in between.

What’s important for any lab embarking on, or evolving, a digital transformation journey, is to determine where they are, what their goals are and what’s achievable.

How Scimcon can help

We understand the scale of the digital transformation challenge, as well as what is needed to overcome limitations and ensure improvements are made. Our team of experienced consultants – scientists themselves – are ideally placed to help you define and progress your digital transformation journey.

Efforts will continue in the coming years to achieve a truly digital laboratory. However, this will not be a linear journey. Advancements are constantly emerging and the latest technology will build upon the success of others, meaning the ‘latest thing’ is always evolving. Navigating this process successfully will allow laboratories to achieve increased productivity and optimised workflows – giving scientists back more time to spend on getting results.  

Advancing your digital transformation journey can be a challenge, but, if done well, can transform your lab and its results. Through a wealth of experience in this area, Scimcon can help you to identify your digital transformation goals and help make them a reality in the short, medium, and long term.

Contact us today to learn more about how we can help you with your digital transformation journey.

Project or Program? Why adapting your approach and working practices makes the difference.

By Geoff Parker and Paul McTurk

Having worked on more than one hundred information system projects and programs over the last 20+ years, for lab-based organisations of all shapes and sizes, we know that people can sometimes confuse the two. It’s an easy mistake to make! However, there are very clear differences between a project and a program and, as we have demonstrated to our clients many times, handling each in the correct way can have a big impact on overall success.

What is a project vs. a program?

Projects are typically well-defined, as they deliver a well-understood, specific goal or outcome within a specified timeline, e.g. implementing a new information system or service within a laboratory. There is usually a distinct team and a clear route from start to completion.

A program involves doing things that deliver a strategy or initiative – or a number of strategy points or initiatives – and is less easy to define than a project. For example, a program might be put in place to respond to a challenge such as: ‘We want to make the lab 30% more efficient.’ There might be (and usually are) projects underneath this, which could include ‘Specific enhancements to current information systems’, ‘Good lab practice training’, ‘Lab supply chain improvement’, etc. Programs can span several months, or even years, and therefore require strategic oversight, a lot of iteration and the involvement of many stakeholders.

Projects are managed through project management methodologies such as PRojects IN Controlled Environments (PRINCE2), and Gantt charts are often employed to map out how you will get from A to B and in what timeframe. At a program level, Gantt charts rapidly become overly complicated and you’re more likely to see a roadmap with aims and targets, but without the detail and structure of a project plan.

So why does this matter? It might be tempting to replicate how you plan and lead a project when thinking about a program. But it’s going to be impossible to scale and communicate effectively using the same approaches.

Having helped many lab-based organisations to run informatics projects and programs, we share some of our insights on how to lead, communicate, manage risk and account for human factors, when planning and rolling out both projects and programs.

1. Leadership

Program leaders require strategic thinking, flexibility, excellent communication and stakeholder management, strong delegation, and empowerment skills, as well as effective team and resource management, among many other attributes.

While project managers also need many of these skills, their remit is much more task- and delivery-focused. In short, they prioritise everything related to ‘getting the job done’, on time and within budget.

Program leaders have a much wider remit, from defining the strategic direction and focus, to creating the structure under which the ‘child’ projects will operate, to managing ‘child’ project risks that could impact other ‘child’ projects or the program as a whole. Program leaders are focused on achieving benefits and strategic objectives that align with the organisation’s vision.

2. Communication

Project communication is usually to a defined group of people on a regular basis, i.e. daily, weekly or monthly. Most people engaged in a project are involved to a similar degree and are very familiar with the details, so the level of information shared will be both quite granular and consistently consumed by all team members. Good communication within a project tends to be direct, detailed, and unfiltered.

For programs, where there may be hundreds of people involved with varying levels of engagement, cutting through the noise and providing updates that are impactful, relevant and easy to digest is key. Whereas ‘one size fits all’ may be suitable for a project, programs need to be communicated in various levels of detail, and, rather than relying solely on scheduled communication, benefit from participants ‘self-serving’ information.

Program leaders need to enable a shared awareness about what’s happening across the whole program, in an easily digestible format. A simple one-page graphic that shows the key milestones and summarises the roadmap can be effective and might be sufficient for some stakeholders. A program newsletter, outlining progress against key milestones and any major challenges or opportunities is another useful communication method. When sharing updates via tools such as Microsoft Teams, tagging stakeholders is a good way of ensuring your update attracts their attention.

Often Scimcon includes expert communications professionals within programs, who help determine the level of information sharing and recommend the best channels to use, as well as providing guidance on how to navigate organisational culture for the most effective program communication.

3. Risk management

Risk management is critical for both projects and programs.

Typically, within projects, risks are identified, investigated, and mitigated as the project progresses. The risks are listed and managed within a regularly updated risk log.

Once again, the scale and complexity of programs dictates a different approach. Rather than identifying risks as they become apparent, a proactive and systematic methodology is required.

A technique we have borrowed from product development methodologies such as the Lean Startup framework is Riskiest Assumption Testing, often referred to as RAT.

RAT is an effective technique that ensures the program’s most critical assumptions are identified and adequately tested, both at the start of the program and on an ongoing basis. For example, at the start, one of your riskiest assumptions may be that your team can work well together at all. This needs to be tested early. See ‘Human factors’ below.

Other examples of riskiest assumptions:

  1. Program objectives are well-defined, well understood and agreed.
  2. The lab and the wider organisation will accept the business change.
  3. There is sufficient budget for the program and the required ‘child’ projects.

RAT emphasizes rapid experimentation, learning from failures, and adapting mitigation strategies based on evidence.

4. Human factors

If a project team works well together, it might be tempting to think that larger teams can do the same. The difference between leading small teams of 10-20 people and teams that are much larger is significant.

Program delivery success is influenced by a variety of human factors that can impact the effectiveness and efficiency of the program and could easily justify a dedicated blog post.

These factors include team dynamics, motivation and morale, decision-making, conflict resolution, issue escalation and knowledge sharing.

Let’s look at one of these – issue escalation – in a little more detail.

Early escalation of issues is a key success factor in the on-time delivery of projects. When confronted with an issue, well-meaning team members can mistakenly believe it is their job to solve the problem quietly and report when the resolution is complete. Often however, this results in the potential problem only coming to the wider team’s attention days or possibly weeks later.

The escalation process should be multi-tiered (‘heads up’, ‘warning’ and ‘escalation’) and transparent within teams, so that it becomes second nature for individuals to share any concerns with the right people at the appropriate time. Regular problem-solving sessions, or informal team meetings where the only agenda point is discussing and brainstorming any concerns, no matter how small, are good practice and something we do ourselves and advocate with clients!

The connected nature of the program and the ‘child’ projects within the program means that the likelihood of human factors affecting delivery increases and requires ongoing monitoring and proactive management.

Summary

Projects and programs may appear very similar in nature; however, due to programs’ scale and complexity, we highly recommend that you don’t attempt to lead them in the same manner as projects.

We have hopefully provided some tips and insight for how to take the right approach when planning, leading and implementing projects and programs. To ensure successful outcomes, project / program leaders should include the key aspects of leadership, communication, risk management and human factors in their project or program planning.

If you need help with your upcoming projects or programs, contact us.

Industry leader interviews: Jana Fischer

We’re kicking off 2023 with a new industry leader interview, and shining a spotlight on Jana Fischer, Co-Founder and CEO of Navignostics.

In this blog, we speak to Jana about Navignostics’ mission, and how the team plans to revolutionise personalised oncology treatments with the help of data and AI.

Tell us about Navignostics

Navignostics is a start-up personalised cancer diagnostics business based in Zurich, Switzerland. Our goal is simple – we want to revolutionise cancer treatment by identifying a highly personalised, and thus optimal, treatment for every patient, to ensure that each patient’s specific cancer is targeted and fought as needed. Our capabilities allow us to do this by analysing tumour material: extracting spatial single-cell proteomics information and using this data to analyse many proteins simultaneously in individual cells within the tissue.

What is spatial single-cell proteomics?

Single-cell proteomics involves measuring and identifying proteins within a single cell, whereas spatial proteomics focuses on the organisation and visualisation of these proteins within and across cells. Combining these two research tools allows the team at Navignostics to characterise tumours on a cellular level, by identifying the proteins present across cells in a tumour, and also how these proteins and cells are organised. This means that the team can provide a more accurate estimate for how certain tumours will respond to different medications and treatments.

Proteins are typically the target of cancer drugs, and measuring them on a cellular level allows us to identify different types of tumour cells, as well as the immune cells that are present and how the two interact. This data is highly relevant for informing clinicians of the best form of (immuno-)oncology and combinatorial treatment for individual patients. It is also highly relevant to pharma companies for accelerating their oncology drug development, by providing insight into drug mode of action and signatures to identify responders to novel drugs.

The kind of data that we are able to extract from different types of tumours is monumentally valuable, so the work doesn’t stop there. All of the data we harness from these tumours is stored centrally, and we plan on utilising this data by building it into a system we refer to as the Digital Tumour, which will continuously allow us to improve the recommendations we can make to our clinical and pharma partners. Our journey has been rapid, though it is built on years of research and preparation: we founded the business in 2022, as a spin-off from the Bodenmiller Lab at the University of Zurich.

The dream became a reality for us in November 2022, when we secured a seed investment of 7.5m CHF. This seed funding will allow us to pursue our initial goals of establishing the company, achieving certification for our first diagnostic product, developing our Digital Tumour and, by extension, collaborating with pharma and biotech partners in oncology drug development. It has also given us the resources we need to move to our own premises. We are due to move off the university campus in May 2023. This offers us a great opportunity to push forward with the certification processes for our new lab, and it gives us the chance to grow our team and expand our operation. We will be located in a start-up campus for life science organisations in the region of Zurich, so we’ll be surrounded by companies operating in a similar field and at a similar capacity.

Tell us more about the Digital Tumour – how does it work?

The Digital Tumour will be the accumulation of all the molecular data we have extracted from every tumour we have analysed, both to date and on an ongoing basis. Connected to that, we store information on the clinical parameters and patient response to treatment. Over time, our aim is to utilise this central data repository to identify new tumour signatures, and build a self-learning system that will provide fully automated treatment suggestions for new patients, based on how their molecular properties compare to previously analysed patients who have been successfully treated.

Sounds interesting – are there any challenges to working with a database of this size?

Our data storage is quite advanced, so volume isn’t really a challenge for us. Our main focus is standardising the input of data itself. The technology is based on years of research and the data analysis requires a great deal of experience and in-depth expertise. In order to extract the full value from this data, it must be completely standardised. Data integrity is therefore vital to our work, and allows us to get the maximum value from past analyses. Our past experience in the Bodenmiller Lab allowed us to develop standardised processes to ensure that all of our data is fully comparable, which means that we can learn more and more from our past data, and apply this to new cases that we analyse.

It is also important to report on our complex data in a comprehensive but easily interpretable manner to the clinician/tumour board who needs to organise a treatment plan. We’re currently working with our clinical collaborators to develop readily understandable and concise reporting outputs. Unlike genomics analysis, our reports focus on proteins in tissue, which is the same information that clinicians are used to working with. So, there is a common language there that offers us the unique opportunity to provide clinicians with data they can easily interpret and work with.

What does this kind of research and data mean for oncology, both in terms of pharmaceuticals, biologics, and healthcare?

It’s important to note that personalised treatment approaches and precision medicine are not new concepts in the diagnostics space. However, our technology and algorithms allow us to extract novel types of biomarkers which were previously inaccessible or unknown, so we’re helping to level the playing field and give clinicians and drug developers comprehensive information to individualise therapies.

Comprehensive tumour data is truly at the heart of what we do, and one key benefit of our technology is that we’re able to analyse very small amounts of sample – such as fine needle biopsies – to provide therapy suggestions. We can also analyse biobanked tumour material, so if there is any old material that has been stored, we have the ability to analyse those samples retrospectively. Not only does this help us to fuel our Digital Tumour with more data, but it also allows us to examine new fields such as long-term survival rates of patients with these tumours. This is of huge value to fuel our product development pipeline, because it allows us to identify different molecular properties between individuals that may not have been considered on a clinical level, but may have played a role in patient responses to treatments and survival outcomes in the long term.

This kind of retrospective data also plays a key role in the evolution of healthcare and drug development, as having the technologies available to acquire this sort of data and mine it to our advantage will provide enormous benefits. These include improving individual treatment courses for patients, as well as expediting the development of novel cancer drugs so pharma companies can get more effective treatments to market sooner.

For example, one commonly cited statistic is that 90% of clinical drug development fails during phase I, II and III trials and drug approval. Often, this arises from a lack of available information to identify the subset of patients most likely to benefit from a novel drug. Having access to Navignostics’ technology and algorithms, and a database such as the Digital Tumour, will offer the potential to pre-select the right patients to enrol in clinical trials, and more easily identify the patients that do respond to the novel treatment, which could substantially expedite the speed of drug development in the trial stage and help bring more effective drugs to the market.

Even unsuccessful trials offer valuable opportunities: it is possible to repurpose and reanalyse material from previous failed trials. Such high rates of failure in clinical development mean that a large number of companies have invested millions in developing drugs that have not come to fruition. If those companies want to re-mine their data, our team can reinterpret the existing work to identify more successful strategies, giving those drugs another chance and offering a better return on investment.

A failure no longer needs to be a failure. Navignostics and its offerings can bring value to our pharma and biotech partners, and will also bring direct benefit to patients and clinicians once we launch our diagnostics product. So, data from every facet of the oncology industry, from curing a patient to halting the development of a drug, can offer us valuable insight that both we and the Digital Tumour could learn from when developing treatments.

What does 2023 and beyond have in store for Navignostics?

The next three years will be critical for our work, and we have projected timelines and key milestones for our diagnostics developments that we aim to achieve before our next funding round. Along the way, we are actively speaking to biotech and pharmaceutical organisations to identify projects and build the foundation for long-lasting collaborations. We are looking forward to a successful continuation of Navignostics’ development in 2023!

Scimcon is proud to showcase start-up companies like Navignostics, and we’re looking forward to seeing how the company will grow over the coming years.

To contribute to our industry leader blog series, or to find out more about how Scimcon supports organisations with lab informatics and data management solutions, contact us today.

Introducing Ben Poynter: Associate consultant and Scimcon’s newest recruit

Our team at Scimcon is made up of a talented group of interesting individuals – and our newest recruit Ben Poynter certainly does not disappoint!

Ben joined our Scimcon team in July 2022 as an associate consultant, and has been working with the lab informatics specialists to get up to speed on all things Scimcon. We spoke to Ben about his experience so far, his interests, background, and what he hopes to achieve during his career as an informatics consultant.

To get us started, tell us a bit more about your background.

So, I studied Biomedical Science at Sheffield Hallam University, which was a four-year course and allowed me to specialise in neuroscience. During my time at university, I created abstracts that were presented at neuroscience conferences in America, which was a great opportunity for me to present what I was working on. My final year dissertation was on bioinformatics in neuroscience, as I was always interested in the informatics side of biomedical science as well.

Once COVID hit, I moved into COVID testing work: first in specimen processing, and then as a supervisor for PerkinElmer, who were undertaking some of the virus research. When things started to die down, I began working for a group called Test and Travel (not the infamous Track and Trace initiative, but a similar idea!). I started there as a lab manager, training new staff on lab protocols for COVID-19, and then a month into that I started working more on the LIMS side – which is where I ended up staying. I wrote the UAT scripts for three different companies, performed validation on the systems, and processed change controls. I then moved to Acacium as LIMS lead there, so over the course of my career I’ve worked with a number of LIMS and bioinformatics systems, including LabWare 7, LIMS X, Labcentre, WinPath Enterprise, and Nautilus (ThermoFisher Scientific).

Which now brings you to Scimcon! What was the deciding factor for you taking on the associate consultant role?

In the early stages, I would have to say it was when Jon and Dave led my first interview, and Jon asked me a question I hadn’t been asked in an interview setting before. He asked me ‘who is Ben Poynter?’. The first time I answered, I discussed my degree, my professional experience with LIMS and other informatics systems, and how that would apply within Scimcon’s specialism in lab informatics consultancy. Then he asked me again and I realised he was really asking what my hobbies were, and how I enjoyed spending my free time. Since starting at Scimcon, I’ve been introduced to the full team and everyone is happy to sit and talk about your life both inside and outside of work, which makes for a really pleasant environment to work in. Also, it seems as though everyone has been here for decades – some of the team have even been here since Scimcon’s inception back in 2000, which shows that people enjoy their time enough to stay here.

I’ve been given a really warm welcome by everyone on the team, and it’s really nice to see that everyone not only enjoys their time here, but actively engages with every project that’s brought in. It’s all hands on deck!

That brings us nicely into our next question then – who is Ben Poynter? What do you like to do outside of work?

So, my main hobbies and interests outside of work are game design, as well as gaming in general. I run a YouTube account with friends, and we enjoy gaming together after work and then recording the gameplay and uploading to YouTube. We are also working on a tower defence game at the moment, with the aim to move into more open world games using some of the new engines that are available for game development.

In addition to gaming and development, I also enjoy 3D printing. I have a 3D printer which allows me to design my own pieces and print them. It’s a bit noisy, so I can’t always have it running depending on what meetings I have booked in!

Technology is a real interest of mine, and I’m really fortunate to have a role where my personal interests cross-over into my career. The language I use for game design is similar to what I work with at Scimcon, and the language skills I’ve developed give me a fresh perspective on some of the coding we use.

What sort of projects are you working on? Have you had the opportunity to use your language skills to full effect?

At the moment, I’m working on configuration for some of the LIMS systems I’ll be working with at customer sites, which I really enjoy as it gives me the chance to work with the code and see what I can bring to the table with it. Other projects include forms for Sample Manager (ThermoFisher Scientific), making it look more interesting, moving between systems, and improving overall user experience. It’s really interesting being able to get to grips with the systems and make suggestions as to where improvements can be made.

My first week mainly consisted of shadowing other Scimcon lab informatics consultants to get me up to speed on things. I have been working with the team on the UK-EACL project, which has been going really well, and it’s been great to get that 1-2-1 experience with different members of the team, and I feel like we have a real rapport with each other. I’ve been motoring through my training plan quite quickly, so I’m really looking forward to seeing the different roles and projects I’ll be working on.

What are you hoping to achieve during your career at Scimcon?

I’d really like to get to grips with the project management side of things, and I’d also love to get to grips with the configuration side as well. It’s important to me that I can be an all-round consultant who’s capable of both managing projects and configuration. No two projects are the same at Scimcon, so having the capability to support clients with all their needs – to be placed with a client and save them time and money – is something I’m keen to work towards.

For more information about Scimcon and how our dedicated teams can support on your lab informatics or other IS projects, contact us today.

Industry leader interview: Luke Gibson

2020 has been a difficult year for most industries, not least for event and tradeshow providers. Luke Gibson, Founding Director of Open Pharma Research and Lab of the Future, shares his experience of running events in the laboratory industry, and what makes Lab of the Future such a unique event.

Luke, please tell us a bit more about yourself and Lab of the Future

My name is Luke Gibson, and I am one of the three founding directors of Open Pharma Research. I have 30 plus years of experience in developing and running events, primarily in the financial and trade and commodity sectors. My colleagues Kirianne Marshall and Zahid Tharia bring a similar level of experience to the company.

Kirianne has had many years of experience in managing the commercial side of large congresses, such as Partnering in Clinical Trials, and research and development congresses. Zahid has 30 years of events experience too, particularly in running life science portfolios and launching congresses and events. Our paths have crossed many times throughout our years working in events, and we eventually hit a point where all three of us had the capacity to try something new – something that was worthwhile, fun, and different to the corporate worlds we had become accustomed to. So that was why we created Lab of the Future – with a view to running events in a different way.

Did you feel that there was a gap in the market for this type of event?

I’m not sure if I would describe it as a gap in the market, more an ambition to do things differently. There was a desire from all of us to build an event with a different approach to the one we would take when working for large organisations, because when you’re working on a large portfolio of global events that cover a variety of topics, you and your team are always looking ahead to the next event, and the focus on the longevity of a single event isn’t always there.

We wanted something that we can nurture and grow, something that we can work on year-round without getting distracted by the next thing on our list. It also allows us to stay within this space and build our community, without having to face pressures such as a year-on-year development strategy or diverse P&L. Our desire was to avoid these constraints, and create an event that we can continue to work on for a long time.

Are you building just the one event, or are you looking at hosting a series? Has your business plan changed since starting?

We want to be able to live and breathe Lab of the Future, but one of the interesting things about it is that it’s such a broad concept. On the one hand we deal with informatics, but on the other hand, we deal with equipment, technology, and all the connectivity between them – but even that’s just one part of it. We are not an informatics conference; we are not strictly an instrumentation conference; we also look at the innovation side of things.

I think the best way to describe how we see Lab of the Future is as a proxy for how you do science in the future. Everything pertains to more efficient processes; better results; or ways of creating breakthrough innovation, and these are all part of the picture of science in the future. And that is the lab of the future – where the lab is the proxy for the environment where you do the science that matters.

So what is the main focus for Lab of the Future?

When we started off, we found we received a lot of queries from industry contacts who wanted to get involved, but certain topics they wanted to discuss didn’t necessarily pertain to the physical laboratory itself. But if it was relevant to science, then it was relevant to us. Things like data clouds and outsourced services may not be directly linked to the lab, but they still relate to how you work. So, within that, the scope for the Lab of the Future gets wider still, looking at areas such as how we can create virtual clinical trials, or use real world-data to feed back into R&D.

People are also keen to learn more from their peers and from other areas of the industry. Lab of the Future allows us to host senior speakers and keynotes who can tell us where we’re heading, and show us how the efforts of one area within life science feed into other areas. It presents us with an almost ever-changing jigsaw image, and it’s this strategic element that I think sets us apart from other events.

Who is your main audience for Lab of the Future?

We attract a real mix of attendees, and that’s what I love about it. You can run a conference for people in a specific job function, such as a data scientist or an R&D manager, but what people really want to know is what the people around them are doing, to almost give them context of the industry as a whole. So, our conference doesn’t just exist to help you do your own job better, but it helps you to develop a concept of where your department is heading in the future, and what you should think about longer term. We aren’t telling scientists how to do their job today; we’re helping them think about their responsibilities for delivery in the future.  Lab of the Future is about the delivery of science of the future.

Our sponsors and solution providers that support the conference are also very much part of our community, as they’re all innovating and making waves in this space as well. They’re in a space that’s always evolving to build the Lab of the Future; and they are part of that solution. So, we don’t merely facilitate a conference of buying and selling between providers and services, we offer a space where everyone is evolving together. It’s a real melting pot, and that’s the fun bit really.

How do you build the Lab of the Future Community?

Zahid’s background in life sciences definitely gave us a starting point. Further to that, we’ve found that every time we put something out, our community engages, and as a consequence we’re introduced to people we never expected to be introduced to. The fact we’re always talking to people enriches our content – the people we meet and conversations we have change our way of thinking, and shape what we’re doing.

Although I’m in charge of our marketing operations, I have to say I’m not always sure where some of our contacts come from! One thing I’ve found quite surprising is the lack of reliance on a database – there’s a lot of power in word-of-mouth, especially in this space where everyone is working on something – why not share that? As we’re seen as adding value to the conversation, it allows people to find us through their connections and our supporters.

Scimcon is proud to sponsor Lab of the Future, and we can’t wait to see you at the Autumn virtual congress on 26 – 27th October 2021. Contact us today to learn more about our participation in the event, and stay tuned on our Opinion page for part 2 of our conversation with Luke.

The role of AI and ML in the future of lab informatics

A few months ago I read an article on bioprocess 4.0, which discusses how combining AI and ML with extensive sensor data collected during biopharmaceutical manufacturing could deliver constant real-time adjustments, promising better process consistency, quality and safety.

This led to a discussion with some of my colleagues about what the future of Lab Informatics could look like when vendors start to integrate AI and ML into products such as lab information management systems (LIMS), electronic lab notebooks (ELN) and others.

What are AI and ML?

AI: In simple terms, AI (artificial intelligence) makes decisions or suggestions based on datasets, with the ultimate aim of creating truly intuitive system interfaces that feel like interacting with a person.

ML: ML (machine learning) is one of the methods used to create and analyse the datasets used by AI and other system modules. Crucially, machine learning does not rely on a programmer to specify the equations used to analyse data. ML looks for patterns and can ‘learn’ how to process data by examining datasets and expected outcomes.

How does ML work?

The following example is extremely simple, but it helps to illustrate the basic principles of ML. The traditional approach to adding two values together is to include the exact way the data should be treated within the system’s configuration.

[Image: a traditional system configuration, in which the rule Result = A + B is specified explicitly.]

By using ML, the system is given examples, from which it learns how the data should be processed.

[Image: example training datasets – pairs of values A and B alongside the expected result.]

Once the system has seen enough datasets, the ML functions learn that A and B should be added together to give the result. The key advantage of ML is its flexibility: if we feed our example system with new datasets, the same configuration could be used to subtract, multiply, divide or calculate sequences, all without the need for specific equations.
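
As a minimal sketch of this idea – assuming scikit-learn is available, and with training pairs invented for the purpose – a linear model can ‘learn’ addition purely from examples:

    # The model is never told to add; it infers the rule from example data.
    from sklearn.linear_model import LinearRegression

    X = [[1, 2], [3, 5], [10, 4], [7, 7], [0, 9]]   # inputs A and B
    y = [3, 8, 14, 14, 9]                           # expected results

    model = LinearRegression()
    model.fit(X, y)

    # Predict the result for an unseen pair of inputs.
    print(model.predict([[6, 11]]))  # ~17.0

Feed the same code pairs labelled with differences instead of sums and it learns subtraction with no change to the configuration; richer model types extend the same idea to non-linear relationships.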

Where can we see examples of how ML and AI are used in everyday life?

Possibly without realising it, we already see ML in everyday life. When you open Netflix, Amazon Prime Video or Apple TV+ the recommended selections you are presented with are derived using ML. The systems learn the types of content each of us enjoy by interpreting our previous behaviour.

Most of us also have experience of personal assistants such as Amazon’s Alexa and Apple’s Siri. These systems are excellent examples of AI using natural speech to both understand our instructions and then communicate answers, or results of actions. ML not only powers the understanding of language but also provides many of the answers to our questions.

The fact that we all can recognise such an effective and powerful everyday example shows just how far AI and ML have come since their inception in the 1950s.

How will AI and ML affect the day-to-day operations of the lab?

Voice recognition software has been available for decades; however, it has not made large inroads into the lab. It has been used in areas where extensive notes are taken, such as pathology labs or ELN experiment write-ups. These are the obvious ‘big win’ areas because of the volume of text that is traditionally typed, the narrow scope of AI functionality needed, and the limited need to interface with other systems.

However, companies such as LabTwin and LabVoice are pushing us to consider the widespread use of not just voice recognition, but natural-language voice commands across the lab. Logging samples into LIMS, for example, is generally manual entry, with the exception of barcode scanners and pre-created sample templates where possible. A command such as “log sample type plasma, seals intact, volume sufficient, from clinic XYZ” is much simpler than typing and selecting from drop-downs. Other functions such as “List CofAs due for approval” or “Show me this morning’s Mass Spec run” would streamline the process of finding the information you need.
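
As an illustration of the kind of plumbing involved – not any vendor’s actual API – the sketch below maps a transcribed command onto structured LIMS registration fields. The command grammar and field names are hypothetical.

    # Hypothetical sketch: mapping a transcribed voice command to structured
    # LIMS fields. The grammar and field names are invented for illustration.
    import re

    COMMAND_PATTERN = re.compile(
        r"log sample type (?P<sample_type>\w+), "
        r"seals (?P<seal_status>\w+), "
        r"volume (?P<volume_status>\w+), "
        r"from clinic (?P<clinic>\w+)"
    )

    def parse_log_command(transcript: str) -> dict:
        """Return structured registration fields, or raise if unrecognised."""
        match = COMMAND_PATTERN.fullmatch(transcript.strip().lower())
        if match is None:
            raise ValueError(f"Unrecognised command: {transcript!r}")
        return match.groupdict()

    fields = parse_log_command(
        "Log sample type plasma, seals intact, volume sufficient, from clinic XYZ"
    )
    print(fields)
    # {'sample_type': 'plasma', 'seal_status': 'intact',
    #  'volume_status': 'sufficient', 'clinic': 'xyz'}

In practice the transcription and intent extraction would come from a speech platform rather than a regular expression, but the output – structured fields a LIMS can consume – is the same.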

Opportunities to take advantage of AI and ML within lab systems.

Take stability studies, where samples are stored in various conditions (such as temperature, humidity, and UV light) for several years and ‘pulled’ for analysis at set points throughout the study.

The samples are analysed for decomposition across a matrix of conditions, time points and potentially product formulations or packaging types. Statistics are produced for each time point and used to predict shelf life using traditional statistics and graphs.

Stability studies are expensive to run and can take several years to reach final conclusions.

AI and ML could, with access to historical data, begin to be used to limit the size of studies so they can focus on a ‘sweet spot’ of critical study attributes. Ultimately, this could dramatically reduce study length by detecting issues earlier and predicting when failure will occur.
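
A minimal sketch of the traditional baseline such a model would build on: fit the degradation trend and extrapolate to the specification limit. All figures are invented; an ML approach would learn from many historical studies rather than a single fit.

    # Illustrative only: estimate shelf life by fitting a linear degradation
    # trend and finding where it crosses the specification limit.
    import numpy as np

    months = np.array([0, 3, 6, 9, 12, 18])
    assay_pct = np.array([100.0, 99.1, 98.3, 97.2, 96.4, 94.6])
    spec_limit = 90.0

    # Least-squares fit: assay = slope * months + intercept.
    slope, intercept = np.polyfit(months, assay_pct, 1)

    # Shelf life estimate: the time at which the trend crosses the limit.
    shelf_life_months = (spec_limit - intercept) / slope
    print(f"Estimated shelf life: {shelf_life_months:.1f} months")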

Moving on to lab instrumentation

Instrument downtime, particularly unscheduled, is a significant cost to laboratories. Using ML to review each new run, comparing it with previous runs and correlating with system failures, could predict the need for preventative maintenance.

AI/ML interventions such as these could significantly reduce the cost of downtime. This type of functionality could be built into the instruments themselves, into systems such as LIMS, ELN and scientific data management systems (SDMS), or into instrument control software. If this were combined with instrument telemetry data such as oven temperature, pump pressure or detector sensitivity, we would have the potential to eliminate most unplanned maintenance.
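
As a rough illustration of the telemetry idea – using scikit-learn’s IsolationForest, with invented feature values – runs whose telemetry drifts away from the historical baseline can be flagged before a hard failure:

    # Illustrative only: flag instrument runs whose telemetry looks unlike
    # the historical baseline, as an early-maintenance signal.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Historical runs: [oven_temp_C, pump_pressure_bar, detector_sensitivity]
    rng = np.random.default_rng(0)
    baseline = rng.normal(loc=[40.0, 120.0, 1.00],
                          scale=[0.2, 1.5, 0.02], size=(500, 3))

    detector = IsolationForest(contamination=0.01, random_state=0)
    detector.fit(baseline)

    # A new run with drifting pump pressure: predict() returns -1 for outliers.
    new_run = [[40.1, 131.0, 0.99]]
    if detector.predict(new_run)[0] == -1:
        print("Telemetry anomaly - schedule a preventative maintenance check")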

Another major concern with instrumentation in labs today is scheduling and utilisation rates. It is not uncommon for instruments to cost hundreds of thousands of pounds, dollars or euros, and getting the highest utilisation rates without obstructing critical lab workflows is a key objective for labs. However, going beyond instrument booking systems and rudimentary task planning is difficult. It is not hard to imagine AI and ML monitoring systems such as LIMS and ELN to take this much further: predicting workload; referring to previous instrument run times; calculating sample and test priority; and even checking scientists’ free diary slots are all tasks that could be optimised to improve the scheduling of day-to-day laboratory work, as in the sketch below. The resulting optimisation would not only reduce costs and speed up workflows, but would dramatically reduce scientists’ frustration in finding available instruments.
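
The sketch below is a deliberately simple greedy scheduler: highest-priority runs are assigned to whichever instrument frees up first. Run times, priorities and instrument names are invented; in practice an ML model would supply the predicted durations and priorities.

    # Illustrative only: assign queued runs to the instrument free soonest.
    import heapq

    # (predicted_minutes, sample_priority, run_id) - higher priority first.
    queue = [(45, 1, "QC-101"), (90, 3, "RD-007"), (30, 2, "QC-102")]
    queue.sort(key=lambda run: -run[1])  # most urgent first

    # Min-heap of (time_free_minutes, instrument).
    instruments = [(0, "HPLC-1"), (0, "HPLC-2")]
    heapq.heapify(instruments)

    for minutes, _, run_id in queue:
        free_at, name = heapq.heappop(instruments)
        print(f"{run_id} -> {name} at t+{free_at} min")
        heapq.heappush(instruments, (free_at + minutes, name))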

Data integrity

Over the last few years, there has been a massive focus on data integrity within regulated labs. However, many of the control mechanisms put in place to improve integrity or mitigate issues are not real-time. For instance, audit trail review is often done monthly at best, and generally quarterly. Not only is it tedious, it is all too easy to miss discrepancies when reviewing line upon line of system changes.

ML could be used to monitor the audit trails of informatics systems and instrument systems in real time, and AI could report to managers any out-of-the-ordinary actions or result trends that do not ‘look’ normal. Where appropriate, the system could interact with the corporate training platform and assign specific data integrity training to the applicable teams. The potential to increase data integrity while reducing the headcount needed to review it could be significant.
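
Even a simple statistical screen illustrates the principle. The sketch below flags days where a user’s audit-trail activity spikes far above their own baseline; the event schema is invented, and a real deployment would feed live audit-trail exports into richer models.

    # Illustrative only: flag days where a user's edit count is far above
    # their own historical baseline (a crude audit-trail anomaly signal).
    from collections import Counter
    from statistics import mean, stdev

    # (user, date, action) tuples as they might come from an audit export.
    events = [("asmith", f"2024-03-{day:02d}", "result_amended")
              for day in range(1, 20) for _ in range(2)]         # 2 edits/day
    events += [("asmith", "2024-03-20", "result_amended")] * 15  # sudden spike

    daily_counts = Counter((user, day) for user, day, _ in events)

    def flag_spikes(user, threshold_sigmas=3.0):
        counts = [n for (u, _), n in daily_counts.items() if u == user]
        baseline, spread = mean(counts), stdev(counts)
        return [day for (u, day), n in daily_counts.items()
                if u == user and spread > 0
                and (n - baseline) / spread > threshold_sigmas]

    print(flag_spikes("asmith"))  # ['2024-03-20']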

Final Thoughts

Lab directors, IT professionals and the lab informatics industry are quite rightly focusing on the digital lab and digital lab transformations. Done right, this will form an excellent platform for the next level of informatics development, using AI and ML to propel not just digital science forward, but to revolutionise the everyday life of scientists. Personally, I cannot wait!

To find out more about how Scimcon can support your informatics project, contact us today.
