Reducing laboratory carbon footprint in biotech and pharma

My Green Lab – a non-profit organisation focused on improving the sustainability of scientific research – recently reported that the carbon footprint produced by the biotech and pharmaceutical industry (including laboratories) increased from 3.9 percent in 2021 to 5 percent in 2022.

However, more and more companies are committing to the UN’s Race to Zero campaign, which aims to halve total carbon emissions by 2030 and reach net zero emissions by 2050.

In addition to reducing Scope 1 emissions (direct emissions from owned or controlled sources) and Scope 2 emissions (indirect emissions from the purchase and use of electricity, steam, heating and cooling), there is a growing focus on Scope 3 emissions (indirect emissions that occur in the upstream and downstream activities of an organisation).

My Green Lab found that, overall, Scope 3 emissions in the biotech and pharma sector are 4.6 times greater than Scope 1 and 2 combined. As a result, pressure to reduce carbon use is being applied down the supply chain, affecting labs at every phase of development, scale-up and manufacturing.

According to CPHI’s 2023 annual survey, 93 percent of executives state that ‘visibility on supply chain partner’s sustainability record’ is either ‘extremely important’ or ‘important’.

There are a number of ways in which laboratories can demonstrate their commitment to sustainability – and help the organisations they provide services to reduce their Scope 3 emissions. Some to consider include:

  1. Obtain My Green Lab certification: considered the gold standard for laboratory sustainability best practices around the world, the program provides scientists and the teams that support laboratories with actionable ways to make meaningful change.
  2. Switch to laboratory products that have the ACT Environmental Impact Factor Label: by emphasizing Accountability, Consistency, and Transparency (ACT) around manufacturing, energy and water use, packaging, and end-of-life, ACT makes it easy to choose more sustainable products.
  3. Identify opportunities for energy efficiency in the laboratory: the Center for Energy Efficient Laboratories (CEEL) provides useful reports and advice.
  4. Join the Sustainable European Laboratories Network: a network of local sustainability teams as well as independent ‘green labs’ networks, which aims to transform the way science is done so that it better responds to the environmental challenges of our era.
  5. If your lab is part of an academic institution, consider joining the LEAF Programme, a standard set by University College London – and followed by 85 global institutions – to improve the sustainability and efficiency of laboratories.

There are many other networks, initiatives and accreditations aimed at helping labs become more sustainable. Tapping into these resources, as well as finding ways to make your lab more efficient, can help you to both reduce carbon emissions and save costs. Importantly, it will ensure your lab does not lose out in future when sustainability becomes a deciding factor in procurement.

Scimcon continues its commitment to reducing its carbon footprint, having signed up to the Science Based Targets initiative (SBTi), set a reduction target, and earned a sustainability award from EcoVadis. As we continue to add value in the complex lab informatics field, we work closely with our clients to detail Scimcon’s Scope 3 assessments and action plans.

Ear to the ground: the latest trends in lab informatics

We recently sat down with Rizwan Chaudhrey, a well-connected figure in the life science and pharmaceutical industries, to discuss the changes he has seen across the lab sector in recent years; how this has been impacted by COVID; and any new trends in lab informatics.

Rizwan, tell us about yourself and your work.

I’ve been involved in this field for a long time, building up a portfolio of connections across life science, biopharma, and pharma. I have worked alongside key opinion leaders for the past 8 years, including both members of the media and decision makers within the companies themselves. I have been involved in a myriad of different projects in the industry in that time, from event management to sales strategy. I now work across the whole value chain, aiming to connect and inform people through news, interviews and other forms of content.

I speak to people across all disciplines and roles, and generally host two types of interview. In-person interviews, usually at an industry event, are where my interviewees generally talk about their company, what they’re showcasing at the event, and any product launches that might be coming up. Video interviews are often more topical, highlighting a specific subject or industry challenge.

During my time in this industry, I’ve visited many different events and spoken to hundreds of companies in this space, from large organisations to smaller start-ups. Lab informatics has changed a lot in that time – it is very much an industry that keeps you on your toes!

What are some of the main trends you are seeing in the laboratory science sector?

Digitalisation is obviously a topic that is heavily discussed, certainly in the events I attend and the interviews I conduct. I think there’s been an interesting shift recently though. The whole industry thought that the COVID-19 pandemic would have a significant effect and drastically speed up the rate of adoption, like we’ve seen in other industries. Everything was going to move to the cloud, and remote access requirements led everyone to believe that we were going to move towards digitalisation at a rate of knots. While we are seeing increasing use of digitised systems, the shift has not been as quick or dramatic as people expected.

From my discussions with lab-based organisations, it appears that one of the big barriers to following through on digital transformation is not knowing where to start. At present, it doesn’t appear as though one vendor has “cracked it” and developed an all-in-one solution that addresses every lab’s needs – there are many different companies offering an array of services and solutions, which can be daunting for a lab-based organisation that is stuck somewhere on its digitalisation journey. For example, major vendors might offer solutions and software packages for their own instruments, but on another level you can look at platforms that focus on specific therapies – there are so many layers to the topic, which is why I believe there are still so many shows with exhibitors talking about what they can bring to the table.

AI/ML is a hot topic at present. Is this something that has come up in your interviews?

Certainly – you can’t avoid artificial intelligence as a topic at the moment! And you can understand why: it has plenty of advantages for labs.

AI can help labs not only generate insights from millions of cells, but also interpret that data and help identify the most valuable results. Machine learning (ML) also provides clear benefits in terms of equipment servicing, as ML-enabled instruments can help engineers and customers through self-diagnosis and troubleshooting. It also facilitates lab automation, through features like automatic refill notifications.

Are there any other trends you have identified?

Post-COVID, we’ve definitely seen a rise in collaboration. Organisations and scientists seem more willing than ever before to share information. We’re also seeing a shift towards automated processes in the lab, with systems using learnt information to lessen the need for human intervention.

In the current environmental climate, sustainability is naturally a big talking point too. Every company I speak to is keen to showcase their ESG practices, especially considering the impact the life science and pharmaceutical industry has on the environment.

Navigating the fog

We agree with Rizwan that the field of lab informatics is at an exciting crossroads. Still emerging from the madness of COVID, and with the growing promise of AI seeming more inevitable by the day, the industry is facing a period of unpredictability.

As scientists ourselves, the team at Scimcon is well-placed to help lab-based companies address their challenges. Find out more about how Scimcon can help you navigate the fog by visiting our website.

Industry leader interviews: Jana Fischer

We’re kicking off 2023 with a new industry leader interview, and shining a spotlight on Jana Fischer, Co-Founder and CEO of Navignostics.

In this blog, we speak to Jana about Navignostics’ mission, and how the team plans to revolutionise personalised oncology treatments with the help of data and AI.

Tell us about Navignostics

Navignostics is a personalised cancer diagnostics start-up based in Zurich, Switzerland. Our goal is simple – we want to revolutionise cancer treatment by identifying a highly personalised, and thus optimal, treatment for every patient, to ensure that each patient’s specific cancer is targeted and fought as needed. We do this by analysing tumour material: extracting spatial single-cell proteomics information and using this data to analyse many proteins simultaneously in individual cells within the tissue.

What is spatial single-cell proteomics?

Single-cell proteomics involves measuring and identifying proteins within a single cell, whereas spatial proteomics focuses on the organisation and visualisation of these proteins within and across cells. Combining these two approaches allows the team at Navignostics to characterise tumours at the cellular level, by identifying the proteins present across cells in a tumour and how those proteins and cells are organised. This means the team can provide a more accurate estimate of how a tumour will respond to different medications and treatments.

Proteins are typically the target of cancer drugs, and measuring them at the cellular level allows us to identify different types of tumour cells, as well as the immune cells that are present and how the two interact. This data is highly relevant for informing clinicians of the best form of (immuno-)oncology and combinatorial treatment for individual patients. It is also highly relevant to pharma companies seeking to accelerate their oncology drug development, by providing insight into a drug’s mode of action and signatures to identify responders to novel drugs.

The kind of data we are able to extract from different types of tumours is immensely valuable, so the work doesn’t stop there. All of the data we harness from these tumours is stored centrally, and we plan to build it into a system we refer to as the Digital Tumour, which will continuously improve the recommendations we can make to our clinical and pharma partners. Our journey has been rapid, though it is built on years of research and preparation: we founded the business in 2022 as a spin-off from the Bodenmiller Lab at the University of Zurich.

The dream became a reality for us in November 2022, when we secured a seed investment of CHF 7.5m. This funding will allow us to pursue our initial goals of establishing the company, achieving certification for our first diagnostic product, developing our Digital Tumour and, by extension, collaborating with pharma and biotech partners on oncology drug development. It has also given us the resources we need to move to our own premises: we are due to move off the university campus in May 2023. This offers us a great opportunity to push forward with the certification processes for our new lab, and gives us the chance to grow our team and expand our operation. We will be located in a start-up campus for life science organisations in the Zurich region, so we’ll be surrounded by companies operating in a similar field and at a similar capacity.

Tell us more about the Digital Tumour – how does it work?

The Digital Tumour will be the accumulation of all the molecular data we have extracted from every tumour we have analysed, to date and on an ongoing basis. Connected to that, we store information on clinical parameters and patient response to treatment. Over time, our aim is to utilise this central data repository to identify new tumour signatures, and to build a self-learning system that provides fully automated treatment suggestions for new patients, based on how their molecular properties compare to previously analysed patients who were successfully treated.

Sounds interesting – are there any challenges to working with a database of this size?

Our data storage is quite advanced, so volume isn’t really a challenge for us. Our main focus is standardising the input of data itself. The technology is based on years of research and the data analysis requires a great deal of experience and in-depth expertise. In order to extract the full value from this data, it must be completely standardised. Data integrity is therefore vital to our work, and allows us to get the maximum value from past analyses. Our past experience in the Bodenmiller Lab allowed us to develop standardised processes to ensure that all of our data is fully comparable, which means that we can learn more and more from our past data, and apply this to new cases that we analyse.

It is also important to report on our complex data in a comprehensive but easily interpretable manner to the clinician/tumour board who needs to organise a treatment plan. We’re currently working with our clinical collaborators to develop readily understandable and concise reporting outputs. Unlike genomics analysis, our reports focus on proteins in tissue, which is the same information that clinicians are used to working with. So, there is a common language there that offers us the unique opportunity to provide clinicians with data they can easily interpret and work with.

What does this kind of research and data mean for oncology, both in terms of pharmaceuticals, biologics, and healthcare?

It’s important to note that personalised treatment approaches and precision medicine are not new concepts in the diagnostics space. However, our technology and algorithms allow us to extract novel types of biomarkers that were previously inaccessible or unknown, so we’re helping to level the playing field and give clinicians and drug developers comprehensive information to individualise therapies.

Comprehensive tumour data is truly at the heart of what we do, and one key benefit of our technology is that we’re able to analyse very small amounts of sample – such as fine needle biopsies – to provide therapy suggestions. We can also analyse biobanked tumour material, so if old material has been stored, we have the ability to analyse those samples retrospectively. Not only does this help us fuel our Digital Tumour with more data, it also allows us to examine new questions such as the long-term survival rates of patients with these tumours. This is of huge value to our product development pipeline, because it allows us to identify molecular differences between individuals that may not have been considered at a clinical level, but may have played a role in patients’ responses to treatment and their long-term survival outcomes.

This kind of retrospective data also plays a key role in the evolution of healthcare and drug development, as having the technologies available to acquire this sort of data and mine it to our advantage will provide enormous benefits. These include improving individual treatment courses for patients, as well as expediting the development of novel cancer drugs so pharma companies can get more effective treatments to market sooner.

For example, one commonly cited statistic is that 90% of clinical drug development fails during phase I, II and III trials and drug approval. Often, this arises from a lack of information to identify the subset of patients most likely to benefit from a novel drug. Access to Navignostics’ technology and algorithms, and a database such as the Digital Tumour, offers the potential to pre-select the right patients to enrol in clinical trials, and to more easily identify the patients who do respond to a novel treatment – which could substantially expedite drug development at the trial stage and help bring more effective drugs to market.

Even unsuccessful trials offer valuable opportunities: it is possible to repurpose and reanalyse material from previous failed trials. Such high failure rates in clinical development mean that many companies have invested millions in drugs that never came to fruition. If those companies want to re-mine their data, our team can reinterpret the existing work to identify more successful strategies, so we can give those drugs another chance and offer a better return on investment.

A failure no longer needs to be a failure. Navignostics and its offerings can bring value to our pharma and biotech partners, and will also bring direct benefit to patients and clinicians once we launch our diagnostics product. So, data from every facet of the oncology industry, from curing a patient to halting the development of a drug, can offer us valuable insight that both we and the Digital Tumour could learn from when developing treatments.

What does 2023 and beyond have in store for Navignostics?

The next three years will be critical for our work, and we have projected timelines and key milestones for our diagnostics development that we aim to achieve before our next funding round. Along the way, we are actively speaking to biotech and pharmaceutical organisations to identify projects and build the foundation for long-lasting collaborations. We are looking forward to a successful continuation of Navignostics’ development in 2023!

Scimcon is proud to showcase start-up companies like Navignostics, and we’re looking forward to seeing how the company will grow over the coming years.

To contribute to our industry leader blog series, or to find out more about how Scimcon supports organisations with lab informatics and data management solutions, contact us today.

Scimcon sponsors SmartLab Exchange and identifies priority themes for 2022 lab informatics

The SmartLab Exchange, held April 26–27, 2022 at the InterContinental at Doral in Miami, FL, is one of the leading global meetings for lab informatics leaders. Scimcon continues its proud sponsorship of this event, and attended in person to hold one-to-one meetings with a number of informatics customers from big pharma and other lab-centric sectors. Scimcon sponsors the SmartLab Exchange because it provides valuable access to the community of senior R&D, quality assurance and quality control decision-makers from industry in North America.

Speakers at the 2022 SmartLab Exchange included the best of the best, with representatives from Procter & Gamble, Biovia, Bayer, AstraZeneca, Sanofi and Amgen, among others. Attendance is by invitation only, a format that means sponsors, speakers and delegates alike can access a closed community that meets their individual needs.

Feedback and Voice of the Industry

Attending from Scimcon were Geoff Parker and Dave Sanders, and during the event they took the opportunity to poll the customers and contacts from many of the attending organizations, to identify the current 2022 trends in the lab informatics industry. SmartLab Exchange represents the lab informatics community across industries including:

  • Pharmaceutical
  • Bio-pharmaceutical
  • Biotech
  • Biobanking
  • Medical device
  • Petrochemical
  • Bio-fuel
  • Chemicals 
  • Cosmetics
  • Food & beverage
  • Defence
  • Forensics
  • Water
  • Environmental
  • Agriculture
  • Consumer Goods

Geoff and Dave spoke with representatives from a multitude of organizations to take a pulse of the trends in the industry. Geoff explains:

“Scimcon works globally as a lab informatics consultant and implementation partner, with big pharma and biotech companies as well as vaccine manufacturers. We tend to see similar challenges from lab to lab, from organization to organization, and it is useful to take events like SmartLab Exchange as a means of checking in and ensuring that our customers’ needs are current.”

Summary of trends in lab informatics for the modern lab

In the informal poll of attendees at SmartLab Exchange, Scimcon was able to identify key trends and themes that are important to the modern lab in 2022.     

The subjects identified as highest interest to the delegates were:

  • Data standardization
  • Data Quality and Integrity
  • Instrument Connectivity/ IoT 

Interest in product areas for the lab was high, especially for:

  • Scientific Data Management Systems
  • Lab Automation
  • ELN
  • LIMS

There was a general trend for interest and support in data integration and systems integration.

Scimcon sponsors SmartLab Exchange 2022, summarizes trends from laboratory informatics leaders

Geoff summarizes: “As lab informatics consultants with a global customer base in pharma and biopharma labs, it is important to us to check in with influential decision-makers from the lab. SmartLab Exchange gave us a valuable opportunity to poll the attendees and see trends that will impact the modern lab decision-maker, and will help us at Scimcon hone the way we partner with our customers.”

Scimcon is proud to sponsor SmartLab Exchange, and support customers in life sciences with their lab informatics management and strategy. For more information about Scimcon’s services, contact us today.

Podcast: Scimcon discusses digital transformation

Scimcon has been on quite a journey since its founding in 2000. Our co-founder Geoff Parker recently spoke with John Storton at Yellow Spider Media for its Business Spotlight podcast, where he discussed Scimcon’s experience in informatics projects over the last 21 years, how implementation projects have changed, and trends in digital lab transformation.

You can listen to the discussion below.

Interested in hearing more from Scimcon? Make sure you’re following us on LinkedIn and Twitter for regular updates.

To learn more about digital lab transformation, visit one of our earlier blogs here.

Hosting tradeshows in a virtual world – Lab of the Future LIVE

2020 saw the migration of in-person events to virtual formats. Although this was a difficult decision for many organisers, online events do present the opportunity to reach audiences in new and innovative ways.

As a follow up to his first blog, we caught up with Luke Gibson, Founding Director of Open Pharma Research, about his experience moving Lab of the Future online in April 2021.

2020 was an odd year, especially in tradeshows – do you expect to remain online or return to in-person events?

We debated going virtual for quite a long time, whilst many events organisers around us made the jump quite quickly. We looked at a lot of different platforms but had some doubts, as we are very sensitive about putting out a poor quality product and we know that you can’t just mimic online what you offer in person.

So, we decided to dip our toe in the water with our range of Digital Dialogues, which are essentially a variety of debates and discussions that keep us talking with our community. Following the success of these, we took the plunge and went ahead with our virtual conference in April 2021. It went really well, and not only did we learn a lot from the event, but we managed to reach a lot of people as well – we had 1,500 registrants, and at any one time we had over 550 people online.

It sounds like the move online paid off for Lab of the Future this year – does this mean you’ll be continuing with the virtual approach?

It definitely appeals to us to explore this approach further. We were growing anyway, and every time we hosted a new Digital Dialogue we were reaching new people, so there are definitely positives to moving online – you get a wider audience, it’s more accessible for a lot of people, and it does really allow you to go global.

On the flip side, the interactivity isn’t the same as with an in-person event. Physical events gather a lot of momentum each year they take place, and we had exciting growth expectations, which do tend to flatline when you pause physical activity. Stimulating the activity of people online takes a lot more management as well, but there are definitely elements we can take forward. On the whole though, I think people are looking forward to a return to physical events.

What did you find were the main differences in terms of virtual vs in-person experiences?

In terms of technology, the conference industry has actually had the opportunity to go virtual for around 20 years now. Although we’ve known that everything can be delivered online, we’ve continued with physical events, and it’s because they give you that human interactivity which can’t be mirrored online. The same sentiment can be applied to concerts – it’s just not the same streaming a live show as it is being in the crowd, and there is also a higher level of technology risk, such as those experienced in the recent Glastonbury event where users weren’t able to log in.

Virtual events don’t allow you to break down barriers the way that comes naturally in a physical environment, such as just chatting with someone in the coffee queue. Because you have that shared experience of being at the same event, you already have that common ground that opens up communication. A lot of people attend conferences due to the networking aspect, which can only occur when you’re surrounded by like-minded people.

So I think the value of physical conferences has been reinforced by their absence. However, our Digital Dialogues have been wonderful and relatively easy to do, so we’ve gained from this experience and will definitely look to continue those in the future. The debate we face now is what would hybrid events look like? There is a lot to consider; the main thing is that, rather than compromising and delivering an event that is part virtual and part physical, you need to offer a virtual component in addition to a full physical event. For example, you want to be able to host an event that is open to people who may not necessarily be able to travel or attend in person, so that would be an addition to the event. What you don’t want is people deciding to host talks and keynotes from the comfort of their own office because it is easier than making the trip, losing the network opportunity. Physical events would be the goal, with virtual access as an added opportunity.

Have scientists changed over the last 12 months?

The speed at which vaccines were brought to market to target COVID-19 has been an incredible win over the last 12 months. It has allowed us to break down the assumption of “we have to do things this way because that’s how we’ve always done it.” If we use the COVID-19 vaccine development as a case study, we can apply this attitude to other areas within life science. What else can we do in half the time? How can we unlock innovation?

This goes further still in showing us that scientists are able to work in different environments too. I think a lot of scientists have been surprised by what they’ve managed to achieve even when they’ve not been able to go to the lab. When people have an appetite to see the job through, and are trusted to deliver on their objectives, it’s remarkable to see how they can adapt and push through. It creates a whole other mindset, which feeds into notions of what the Lab of the Future looks like.

What’s in the future for Lab of the Future?

Realistically, we’ve always been focused on the innovation and the people. We’ve looked at the data and the technology, but it’s the people that make everything happen. This whole experience of 2020 and 2021 so far has been a disruption, and any disruption that makes you stop and think differently about how people work is part of Lab of the Future.

Going forward, we would prefer to hold fire as opposed to putting something out that’s only halfway there. So, we’ve decided that we’ll be hosting virtually again in the Autumn, on 26th & 27th October 2021, and returning to physical events in Boston, MA on 22nd & 23rd March 2022, and in Amsterdam on 3rd & 4th October 2022.

What were some of the main take-aways from Lab of the Future Spring 2021?

One thing we did note was that the energy of the keynote speakers was truly remarkable. We felt it was important to host our talks live, and our presenters collaborated on developing their presentations, so they got a lot out of it – and that was really reflected in the enthusiasm of their messages. Working together provided energy, which really came across, and having these events live and interactive definitely added to the buzz of the talks.

Another key take-away was the role played by attendance analysis. Although it is useful being able to monitor activity through analytics, it has a potential flip side. We had to really blend our conversations with any product discussion to ensure that it wasn’t a case of people ‘skipping the ads’ in a sense and only tuning into case studies. We blended discussions on the variety of solutions with operational content from life science practitioners to make it one conversation, so this wasn’t an issue.

I think a lot of events organisers may have some trepidation around the use of data, as it can give you perhaps more information than you want to know. But a bonus on that point for us is that you gain a real insight into customer profiles, which in turn makes it easier to communicate and highlight relevant areas. We’ve definitely learnt from our experience of hosting the event virtually, and I think we’ve proven to ourselves that it is possible to deliver a great product, at times different to our expectation and our business plan! We’ve found a new way of working, and even with 30 years of experience each, we’ve challenged our past learnings and we’re now looking at how this could shape our future – which is exactly what Lab of the Future sets out to achieve.


Scimcon is proud to sponsor Lab of the Future, and we can’t wait to see you at the Autumn virtual congress on 26-27th October 2021. Contact us today to learn more about our participation in the event, and visit part 1 of our conversation with Luke to learn more about Lab of the Future.

Top tips for best approaches to data use in clinical trials

Maintaining data quality is critical in clinical trials. As a follow up to his first blog, we have worked with Industry Leader Mark Elsley to create this infographic, outlining Mark’s top tips for managing clinical trial data.

Mark Elsley is a Senior Clinical Research / Data Management Executive with 30 years’ experience working within the pharmaceutical sector worldwide, for companies including IQVIA, Boehringer Ingelheim, Novo Nordisk and GSK Vaccines. His specialist area of expertise is clinical data management, and he has published a book on this topic called A Guide to GCP for Clinical Data Management.

Mark Elsley outlines his top tips for clinical trial data management in this infographic.

Industry leader interview: Luke Gibson

2020 has been a difficult year for most industries, not least for event and tradeshow providers. Luke Gibson, Founding Director of Open Pharma Research and Lab of the Future, shares his experience of running events in the laboratory industry, and what makes Lab of the Future such a unique event.

Luke, please tell us a bit more about yourself and Lab of the Future

My name is Luke Gibson, and I am one of the three founding directors of Open Pharma Research. I have 30-plus years of experience in developing and running events, primarily in the financial and trade and commodity sectors. My colleagues Kirianne Marshall and Zahid Tharia bring a similar level of experience to the company.

Kirianne has had many years of experience in managing the commercial side of large congresses, such as Partnering in Clinical Trials, and research and development congresses. Zahid has 30 years of events experience too, particularly in running life science portfolios, and launching congresses/events. Our paths have crossed many times throughout our years working in events, and we eventually hit a point where all three of us had the capacity to try something new – something that was worthwhile, fun, and different to the corporate worlds we had become accustomed to. So that was why we created Lab of the Future – with a view to running events in a different way.

Did you feel that there was a gap in the market for this type of event?

I’m not sure if I would describe it as a gap in the market, more an ambition to do things differently. There was a desire from all of us to build an event with a different approach to the one we would take when working for large organisations, because when you’re working on a large portfolio of global events that cover a variety of topics, you and your team are always looking ahead to the next event, and the focus on the longevity of a single event isn’t always there.

We wanted something that we can nurture and grow, something that we can work on year-round without getting distracted by the next thing on our list. It also allows us to stay within this space and build our community, without having to face pressures such as a year-on-year development strategy or diverse P&L. Our desire was to avoid these constraints, and create an event that we can continue to work on for a long time.

Are you building just the one event, or are you looking at hosting a series? Has your business plan changed since starting?

We want to be able to live and breathe Lab of the Future, but one of the interesting things about it is that it’s such a broad concept. On the one hand we deal with informatics, but on the other hand, we deal with equipment, technology, and all the connectivity between them – but even that’s just one part of it. We are not an informatics conference; we are not strictly an instrumentation conference; we also look at the innovation side of things.

I think the best way to describe how we see Lab of the Future is as a proxy for how you do science in the future. Everything pertains to more efficient processes; better results; or ways of creating breakthrough innovation, and these are all part of the picture of science in the future. And that is the lab of the future – where the lab is the proxy for the environment where you do the science that matters.

So what is the main focus for Lab of the Future?

When we started off, we found we received a lot of queries from industry contacts who wanted to get involved, but certain topics they wanted to discuss didn’t necessarily pertain to the physical laboratory itself. But if it was relevant to science, then it was relevant to us. Things like data clouds and outsourced services may not be directly linked to the lab, but they still relate to how you work. So, within that, the scope for the Lab of the Future gets wider still, looking at areas such as how we can create virtual clinical trials, or use real world-data to feed back into R&D.

People are also keen to learn more from their peers and from other areas of the industry. Lab of the Future allows us to host senior speakers and keynotes who can tell us where we’re heading, and show us how the efforts of one area within life science feed into other areas. It presents us with an almost ever-changing jigsaw image, and it’s this strategic element that I think sets us apart from other events.

Who is your main audience for Lab of the Future?

We attract a real mix of attendees, and that’s what I love about it. You can run a conference for people in a specific job function, such as a data scientist or an R&D manager, but what people really want to know is what the people around them are doing, to almost give them context of the industry as a whole. So, our conference doesn’t just exist to help you do your own job better, but it helps you to develop a concept of where your department is heading in the future, and what you should think about longer term. We aren’t telling scientists how to do their job today; we’re helping them think about their responsibilities for delivery in the future. Lab of the Future is about the delivery of science of the future.

Our sponsors and solution providers that support the conference are also very much part of our community, as they’re all innovating and making waves in this space as well. They’re in a space that’s always evolving to build the Lab of the Future; and they are part of that solution. So, we don’t merely facilitate a conference of buying and selling between providers and services, we offer a space where everyone is evolving together. It’s a real melting pot, and that’s the fun bit really.

How do you build the Lab of the Future Community?

Zahid’s background in life sciences definitely gave us a starting point. Further to that, we’ve found that every time we put something out, that our community engages, and as a consequence we’re introduced to people we never expected to be introduced to. The fact we’re always talking to people enriches our content – the people we meet and conversations we have change our way of thinking, and shape what we’re doing.

Although I’m in charge of our marketing operations, I have to say I’m not always sure where some of our contacts come from! One thing I’ve found quite surprising is the lack of reliance on a database – there’s a lot of power in word-of-mouth, especially in this space where everyone is working on something – why not share that? As we’re seen as adding value to the conversation, it allows people to find us through their connections and our supporters.

Scimcon is proud to sponsor Lab of the Future, and we can’t wait to see you at the Autumn virtual congress on 26-27th October 2021. Contact us today to learn more about our participation in the event, and stay tuned on our Opinion page for part 2 of our conversation with Luke.

Industry leader interviews: Mark Elsley?

Mark, please introduce yourself

I am Mark Elsley, a Senior Clinical Research / Data Management Executive with 30 years’ experience working within the pharmaceutical sector worldwide for companies including IQVIA, Boehringer Ingelheim, Novo Nordisk and GSK Vaccines. I am skilled in leading multi-disciplinary teams on projects through full lifecycles to conduct a breadth of clinical studies including Real World Evidence (RWE) research. My specialist area of expertise is in clinical data management, and I have published a book on this topic called “A Guide to GCP for Clinical Data Management” which is published by Brookwood Global.

Please can you explain what data quality means to you?

Data quality is a passion of mine and now receives a lot of focus from the regulators, especially since the updated requirements for source data in the latest revision of ICH-GCP. It is a concept which is often ill-understood, leading organisations to continue collecting poor-quality data and risk its rejection by the regulators.

White and Gonzalez1 created a data quality equation which I think is a really good definition: they suggested that Data Quality = Data Integrity + Data Management. Data integrity is made up of many components. The new version of ICH-GCP states that source data should be attributable, legible, contemporaneous, original, accurate, and complete. The Data Management part of the equation refers to the people who work with the data, the systems they use and the processes they follow. Put simply, staff working with clinical data must be qualified and trained on the systems and processes, processes must be clearly documented in SOPs and systems must be validated. Everyone working in clinical research must have a data focus… Data management is not just for data managers!
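To make those source-data attributes concrete, here is a minimal sketch of how a team might screen records against some of them programmatically. The field names, the two-day entry window, and the specific rules are hypothetical illustrations for this example, not requirements taken from ICH-GCP:

```python
from datetime import datetime, timedelta

# Hypothetical required fields for a source-data record
REQUIRED_FIELDS = {"patient_id", "visit_date", "entered_by", "entered_at", "value"}

def check_record(record: dict, entry_window_days: int = 2) -> list:
    """Return a list of findings against ALCOA-style attributes for one record."""
    findings = []
    # Complete: every required field present and non-empty
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    if missing:
        findings.append(f"incomplete: missing {sorted(missing)}")
    # Attributable: the person who entered the data is identified
    if not record.get("entered_by"):
        findings.append("not attributable: no user identified")
    # Contemporaneous: data entered close to when it was observed
    try:
        observed = datetime.fromisoformat(record["visit_date"])
        entered = datetime.fromisoformat(record["entered_at"])
        if entered - observed > timedelta(days=entry_window_days):
            findings.append("not contemporaneous: late entry")
    except (KeyError, ValueError):
        findings.append("dates unparseable: cannot assess timeliness")
    return findings

record = {"patient_id": "P001", "visit_date": "2021-03-01",
          "entered_by": "", "entered_at": "2021-03-10", "value": 7.2}
print(check_record(record))
```

Checks like these can only flag the mechanically testable attributes (complete, attributable, contemporaneous); accuracy and originality still depend on the trained people and documented processes in the Data Management half of the equation.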

By adopting effective strategies to maximise data quality, the variability of the data is reduced. This means study teams can enrol fewer patients while retaining sufficient statistical power (which also has a knock-on impact on the cost of managing trials).2 Fewer participants also leads to quicker conclusions being drawn, which ultimately allows new therapies to reach patients sooner.

Why is data quality such an important asset in pharma?

I believe that clinical trials data are vitally important. These assets are the sole attribute that regulators use to decide whether to approve a marketing authorization application or not, which ultimately allows us to improve patient outcomes by getting new, effective drugs to market faster. For a pharmaceutical company, the success of clinical trial data can influence the stock price and hence the value of a pharmaceutical company3 by billions of dollars. On average, positive trials lead to a 9.4% increase while negative trials contribute to a 4.5% decrease. The cost of managing clinical trials amounts to a median cost per patient of US$41,4134, or US$69 per data point (based on 599 data points per patient)5. In short, clinical data have a huge impact on the economics of the pharmaceutical industry.
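As a quick sanity check, the per-data-point figure follows directly from the two cited numbers (median per-patient cost from reference 4, data points per patient from reference 5):

```python
median_cost_per_patient = 41_413   # US$, median pivotal-trial cost per patient (ref 4)
data_points_per_patient = 599      # median data points collected per patient (ref 5)

cost_per_data_point = median_cost_per_patient / data_points_per_patient
print(f"US${cost_per_data_point:.0f} per data point")  # → US$69 per data point
```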

Why is the prioritization of data quality so important for healthcare organizations?

Healthcare organizations generate and use immense amounts of data, and good study data can go on to significantly reduce healthcare costs6, 7. Capturing, sharing, and storing vast amounts of healthcare data and transactions, together with the rapid processing that big data tools enable, have transformed the healthcare industry by improving patient outcomes while reducing costs. Data quality is not just a nice-to-have – high-quality data should be a priority for any healthcare organization.

However, when data quality is not treated as a top priority in health organizations, the negative impacts can be large. For example, Public Health England recently reported that nearly 16,000 coronavirus cases went unreported in England. When outputs such as this are unreliable, guesswork and risk in decision making are heightened. The better the data quality, the more confidence users will have in the outputs they produce, lowering risk in the outcomes and increasing efficiency.

Data quality, where should organisations start?

ICH-GCP8 for interventional studies and GPP9 for non-interventional studies contain many requirements with respect to clinical data, so a thorough understanding of those is essential. It is impossible to achieve 100% data quality, so a risk-based approach will help you decide which areas to focus on. The most important data in a clinical trial are patient safety and primary end point data, so the study team should consider the risks to these data in detail. For example, for adverse event data, one of the risks to consider could include the recall period of the patient if they visit the site infrequently. A patient is unlikely to have a detailed recollection of a minor event that happened a month ago. Collection of symptoms via an electronic diary could significantly decrease the risk and improve the data quality in this example. Risks should be routinely reviewed and updated as needed. By following the guidelines and adopting a risk-based approach to data collection and management, you can be sure that analysis of the key parameters of the study is robust and trustworthy.
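One common way to operationalise such a risk-based review is a simple risk register scored by likelihood and impact. The risk items, areas, and 1-5 scores below are invented illustrations (ICH-GCP does not prescribe this scoring), but the pattern – score, rank, then focus mitigation effort on the top of the list – is the essence of the approach:

```python
# Illustrative risk register for a risk-based data-quality review.
# All entries and scores (1-5 scales) are hypothetical examples.
risks = [
    {"area": "adverse events",   "risk": "patient recall over long visit gaps", "likelihood": 4, "impact": 5},
    {"area": "primary endpoint", "risk": "unvalidated lab data transfer",       "likelihood": 2, "impact": 5},
    {"area": "demographics",     "risk": "transcription typos",                 "likelihood": 3, "impact": 1},
]

# Score each risk as likelihood x impact
for r in risks:
    r["score"] = r["likelihood"] * r["impact"]

# Review (and mitigate) the highest-scoring risks first
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{r["score"]:>2}  {r["area"]}: {r["risk"]}')
```

In this sketch the adverse-event recall risk ranks highest, which is exactly where a mitigation such as the electronic symptom diary mentioned above would be targeted; re-scoring after mitigation gives the routine review cycle.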

If you were to give just one tip for ensuring data quality in clinical trials, what would it be?

Aside from the risk-based approach which I mentioned before, another area which I feel is important is to only collect the data you need; anything more is a waste of money, and results in delays getting drugs to patients. If you over-burden sites and clinical research teams with huge volumes of data this increases the risks of mistakes. I still see many studies where data are collected but are never analysed. It is better to only collect the data you need and dedicate the time saved towards increasing the quality of that smaller dataset.

Did you know that:

In 2016, the FDA published guidance12 for late stage/post approval studies, stating that excessive safety data collection may discourage the conduct of these types of trials by increasing the resources needed to perform them and could be a disincentive to investigator and patient participation in clinical trials.

The guidance also stated that selective safety data collection may facilitate the conduct of larger trials without compromising the integrity and the validity of trial results. It also has the potential to facilitate investigators and patients’ participation in clinical trials and help contain costs by making more-efficient use of clinical trial resources.

What is the role of technology on data quality?

Technology, such as electronic health records (EHR), electronic patient reported outcomes (ePRO), drug safety systems and other emerging digital technologies, is currently being used in many areas of healthcare. Technologies such as these can increase data quality but simultaneously increase the number of factors involved: they impact costs, involve the management of vendors and add to the compliance burden, especially in the areas of vendor qualification, system validation, and transfer validation.

I may be biased as my job title includes the word ‘Data’ but I firmly believe that data are the most important assets in clinical research, and I have data to prove it!

Scimcon is proud to support clients around the globe with managing data at its highest quality. For more information, contact us.


References

1White, Christopher H., and Lizzandra Rivrea González. “The Data Quality Equation—A Pragmatic Approach to Data Integrity.” Www.Ivtnetwork.Com, 17 Aug. 2015, www.ivtnetwork.com/article/data-quality-equation%E2%80%94-pragmatic-approach-data-integrity#:~:text=Data%20quality%20may%20be%20explained. Accessed 25 Sept. 2020.

2Alsumidaie, Moe, and Artem Andrianov. “How Do We Define Clinical Trial Data Quality If No Guidelines Exist?” Applied Clinical Trials Online, 19 May 2015, www.appliedclinicaltrialsonline.com/view/how-do-we-define-clinical-trial-data-quality-if-no-guidelines-exist. Accessed 26 Sept. 2020.

3Rothenstein, Jeffrey & Tomlinson, George & Tannock, Ian & Detsky, Allan. (2011). Company Stock Prices Before and After Public Announcements Related to Oncology Drugs. Journal of the National Cancer Institute. 103. 1507-12. 10.1093/jnci/djr338.

4Moore, T. J., Heyward, J., Anderson, G., & Alexander, G. C. (2020). Variation in the estimated costs of pivotal clinical benefit trials supporting the US approval of new therapeutic agents, 2015-2017: a cross-sectional study. BMJ open, 10(6), e038863. https://doi.org/10.1136/bmjopen-2020-038863

5O’Leary E, Seow H, Julian J, Levine M, Pond GR. Data collection in cancer clinical trials: Too much of a good thing? Clin Trials. 2013 Aug;10(4):624-32. doi: 10.1177/1740774513491337. PMID: 23785066.

6Khunti K, Alsifri S, Aronson R, et al. Rates and predictors of hypoglycaemia in 27 585 people from 24 countries with insulin-treated type 1 and type 2 diabetes: the global HAT study. Diabetes Obes Metab. 2016;18(9):907-915. doi:10.1111/dom.12689

7Evans M, Moes RGJ, Pedersen KS, Gundgaard J, Pieber TR. Cost-Effectiveness of Insulin Degludec Versus Insulin Glargine U300 in the Netherlands: Evidence From a Randomised Controlled Trial. Adv Ther. 2020;37(5):2413-2426. doi:10.1007/s12325-020-01332-y

8Ema.europa.eu. 2016. Guideline for good clinical practice E6(R2). [online] Available at: https://www.ema.europa.eu/en/documents/scientific-guideline/ich-e-6-r2-guideline-good-clinical-practice-step-5_en.pdf [Accessed 10 May 2021].

9Pharmacoepi.org. 2020. Guidelines For Good Pharmacoepidemiology Practices (GPP) – International Society For Pharmacoepidemiology. [online] Available at: https://www.pharmacoepi.org/resources/policies/guidelines-08027/ [Accessed 31 October 2020].

10Medical Device Innovation Consortium. Medical Device Innovation Consortium Project Report: Excessive Data Collection in Medical Device Clinical Trials. 19 Aug. 2016. https://mdic.org/wp-content/uploads/2016/06/MDIC-Excessive-Data-Collection-in-Clinical-Trials-report.pdf

12FDA. Determining the Extent of Safety Data Collection Needed in Late-Stage Premarket and Postapproval Clinical Investigations Guidance for Industry. Feb. 2016.

How to work effectively with informatics consultants in life sciences?

As a leader in a pharmaceutical or life sciences organisation, getting the most out of your team and resources is always a top priority. After making the decision to proceed with a critical investment in consulting services, there may even be more pressure to find the optimal use of these time-limited external resources. So, how can you make sure you are using these resources to their full potential? In this blog, our industry expert Micah Rimer will show you how.

During Micah’s 20 years working at big pharma and vaccines corporations, including Bayer, Chiron, Novartis and GSK, he has successfully deployed consultancy groups within lab informatics and clinical projects. Micah has worked with Scimcon to support his teams on high-profile critical projects.

Frame the problem – what does your implementation project need to achieve?

As with any business situation, it is important that there is a common goal that everyone is aligned around.

It is essential that you do not waste valuable time revisiting the same conversations. Ask yourself: “Is it obvious what problem we are trying to solve?” Often, issues can arise when people are arguing about implementing a solution, whilst losing sight of the challenge at hand. 

Take the example of Remote Clinical Monitoring: You might decide that it would be beneficial to have your Clinical Research Associates (CRAs) track and monitor the progress of a clinical study without traveling to clinical sites. That sounds like it could be very promising, but what is the problem that needs to be solved?

Without clear goals on what you want to accomplish with Remote Clinical Monitoring, it will be difficult to declare an implementation a success. In addition, if you and your organisation do not know what you are trying to achieve with a particular technical solution, it will be impossible to give your informatics consultants a clear set of deliverables.

So, first things first, agree on the problem statement!

One of the first times I hired Scimcon to support me with an informatics project, I had recently joined a pharma company and found myself in the middle of conflicting department objectives, with what seemed to be no clear path out of the mess I had inherited. The organisation had purchased an expensive new software system that had already become a failed implementation. After spending a year continuously configuring and programming it, it was no closer to meeting the business needs than when the project had started, and there were loud criticisms to address on that point.

This also highlighted a far wider range of issues, such as some people who felt their skills were not being properly utilised while problems went unsolved, and that the bioinformatics department might not have the right goals to begin with.

To solve this challenge, we sat down with Scimcon to identify all the different problems associated with the inherited project, and to clarify what we needed to do to turn it into a success. In taking time to review the situation and without too much effort, we were able to come up with four key areas to address: 

  1. Understand how bioinformatics/IT priorities should map to the organisation’s priorities – before we spent any more time and money, what did the organisation actually need?
  2. Solve the bioinformatics problem that the software had been purchased for (assuming that was indeed a verified need).  
  3. Determine how roles and work could be shifted and changed so that we were utilising the talents and the resources in the department.
  4. If possible, put the purchased software to use! 

With the help of Scimcon, we were able to define these problems and then focus on finding answers to each of the questions. In the end it turned out to be one of our most successful engagements together – award-winning, even. By just asking senior management what their biggest challenge was, we found their overriding priority was to have an overview of all the R&D projects going on. And while the new software was not particularly well suited for solving the bioinformatics problem that it had been acquired for, it could easily be used to map out the R&D process for portfolio tracking. Then, we turned our attention to the bioinformatics problem, which was easily solved by a bit of custom code from one of the bioinformatics programmers who felt that previously his skills were not being properly utilised.

Once we knew where we were, and where we wanted to get to, all we had to do was get there one challenge at a time.

Manage internal expectations – how will the informatics consultants work with your clinical/analytical teams?

Once you have identified and agreed on the problem that you want to solve, the next step is making sure the organisation is ready to work with your consultants. As with all relationships, business or otherwise, a crucial step is to make sure that everyone has the same expectations, and that all the relevant stakeholders are on the same page.

People have many different perspectives on why consultants are brought in.

As there can be so many different roles and perspectives on the use of consultants, you need to make sure that you address all the different stakeholder perspectives. It is important to establish a positive situation, as you want the consultants to be able to work with your teams without unnecessary tension.

When I was just starting out with my first LIMS (Laboratory Information Management System) implementation, I remember being impressed that you could hire someone who had the specific experience and expertise to guide you on something they had done before but that was new to you. I wondered, “why is that not done all the time? Why do so many implementation projects fail when you can bring people in who have solved that particular problem before?” When I asked Russell Hall, the Scimcon consultant working with us on that first project, he said that not everyone is comfortable admitting they need help. As my career has progressed, I have come to value that feedback more and more. There are many people who are highly competent and effective in their jobs, but are not comfortable with the appearance that they are not sufficient on their own. It is always important to manage for those situations, rather than assuming that everyone will welcome external help.

Lastly, it is also critical to manage expectations regarding the use of consultants. Your boss may need to defend the budget, or be prepared to stand behind recommendations or conclusions that are delivered by people outside of the organisation. It should also be considered that management might not readily accept something that might seem obvious to employees working at a different level. By liaising with senior leaders from the outset, you can make sure both parties are aligned on how the consultants will interact with people in the company, and what their role will be. This is important both to achieve what you want internally and also to make sure the consultants have a proper expectation of how their efforts will be utilised.

Communicate and adjust – how is your information managed between your team and consultants?

While it can be very tempting to feel that you can leave the majority of the project to the experts, the reality is things rarely go as smoothly as planned. As the life science business and information management have advanced over the last few decades, the amount of complexity and details has grown tremendously. It is more and more difficult for a single person to maintain an overview of all the relevant facts. The only way to be successful is to communicate and make sure that the right people have the right information at the right time. Your consultants are no different. 

Many organisations have challenges in terms of taking decisions and communicating them effectively. For your consultants who do not typically have all the same access and networks in the organisation that internal staff do, it is imperative that you make sure they are kept up to date. You want to avoid them spending valuable time on focusing on areas and deliverables that have shifted to being less important. Finding ways to keep consultants informed on all the latest developments is absolutely necessary for them to be able to deliver successfully. Figure out what makes sense by considering the organisation culture and the consulting engagement setup. Whether it is by use of frequent check-ins or online collaboration, be prepared to put in additional efforts to make sure that the information gets to where it needs to go.

As well as good communication, organisations have to be able to adjust as needed. Occasionally everything does work out according to plan, but that is more the exception than the rule when it comes to complex life science informatics projects. While timelines and commitments are critical, it is important to view any project as a collaboration. There will be unexpected software issues. There will be unplanned organisational changes and problems. People get sick, life happens. By having open and continuous dialogue, you can be best prepared to make the adjustments needed to find solutions together to unexpected problems.

Ensuring success in your informatics projects

Consultants can be hugely valuable to you and your organisation.

But you have to set up the right conditions for everything to work out well.

  1. Know what the problem you are trying to solve is, and make sure you have as much alignment around the problem statement as possible.  
  2. Make sure the organisation is ready for the collaboration by ensuring that your team and management know what to expect out of the engagement, and that your consultants similarly know the scope and what their mission is.
  3. Lastly, you need to keep in constant communication and make sure that you are ready to work together to adjust to the inevitable bumps that will come up on the road.

Working together, you can get to where you need to go.

If you’re interested in working with Scimcon on your upcoming informatics project, contact us today for a no-commitment chat about how we can help you succeed.
