Trends in the lab informatics landscape in 2023

With our sponsorship of SmartLab Exchange Europe and US earlier in 2023, and of FutureLabs this week, we have developed a view of the key trends across the lab informatics industry, and of where priorities lie for lab-centred organisations globally. We have also gathered insight into the areas where budget-holders are looking to invest in new technologies.

Investment priorities for the modern lab

Attending conferences globally means that our team can gather key insights to share with fellow informatics peers. Face-to-face interactions give us instant feedback on lab informatics trends, from which we can extract valuable data.

Having spoken to delegates in North America and Europe this year, we have identified some of the high-priority investment areas for lab informatics in 2023 by comparing what matters most to event attendees, who include representatives from leading pharma, biotech, material science, crop science, FMCG, and food companies. Across the global companies that attended, more than 120 people were polled:

Figure 1 presents the combined data from SmartLab Exchange Europe and US, giving an overall view of lab informatics priorities across 2023 so far:

The graph also shows other key lab informatics investment priorities from both the EU and US summits, including:

We can see a real trend towards intelligent systems this year, as data consolidation and reusability take centre stage and budget-holders look towards automation, both physical and within software systems, to reduce the risk of human and manual error. This trend isn't isolated to a particular lab sector either – we're seeing similar patterns across all sectors.

What other areas of lab informatics innovation are taking centre stage?

Gathering feedback from delegates at conferences across all geographies means we can identify patterns in the data and rank them by priority. While Figure 1 highlights high-priority investment areas, Figure 2 shows exactly what delegates at SmartLab Exchange Europe and US are planning to assign budget to in the next 12 months:

From Figure 2, we can see that immediate investment priorities for SmartLab Exchange Europe and US attendees are as follows:

What does this mean for lab informatics in 2023?

From both events, in both geographies, we can see that automation and digitalisation rank highly among investment priorities for 2023. Laboratories are innovating technologically to meet growing capacity demands and accelerate speed to market. Automation also substantially reduces the risk of human error, as repetitive manual tasks can be carried out reliably by automated solutions.

We also see that lab users are prioritising areas such as lab scheduling, method development, data governance, connectivity, artificial intelligence (AI), and machine learning (ML). As throughput expectations increase for labs around the world, the need to digitalise and streamline operations is more pressing than ever. The aim of many laboratories is to increase efficiency within the lab, and digitalisation acts as a catalyst in this process.

You can find our team at FutureLabs Live from Wednesday 31st May to Friday 2nd June, where we'll be developing more lab informatics insights with fellow sponsors and guests. Follow us on LinkedIn to be notified of other tradeshows Scimcon is attending this year.

Visit Scimcon at the event, or contact us directly to book a conversation and learn more about how we can support your lab informatics projects.

Industry leader interviews: Jana Fischer

We’re kicking off 2023 with a new industry leader interview, and shining a spotlight on Jana Fischer, Co-Founder and CEO of Navignostics.

In this blog, we speak to Jana about Navignostics’ mission, and how the team plans to revolutionise personalised oncology treatments with the help of data and AI.

Tell us about Navignostics

Navignostics is a personalised cancer diagnostics start-up based in Zurich, Switzerland. Our goal is simple – we want to revolutionise cancer treatment by identifying a highly personalised, and thus optimal, treatment for every patient, to ensure that each patient's specific cancer is targeted and fought as needed. We do this by analysing tumour material: we extract spatial single-cell proteomics information and use this data to analyse many proteins simultaneously in individual cells within the tissue.

What is spatial single-cell proteomics?

Single-cell proteomics involves measuring and identifying proteins within a single cell, whereas spatial proteomics focuses on the organisation and visualisation of these proteins within and across cells. Combining these two research tools allows the team at Navignostics to characterise tumours at the cellular level: we identify the proteins present across cells in a tumour, and also how those proteins and cells are organised. This means the team can provide a more accurate estimate of how a given tumour will respond to different medications and treatments.
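To make the idea concrete, here is a minimal sketch of the kind of data structure involved – one row per cell, with spatial coordinates and per-protein measurements. The markers, values, and classification rule are purely illustrative assumptions, not Navignostics' actual pipeline:

```python
import pandas as pd

# Hypothetical measurements: one row per cell, with its position in the
# tissue section and the measured intensity of two protein markers.
cells = pd.DataFrame({
    "cell_id": [1, 2, 3, 4],
    "x_um":    [12.5, 14.1, 80.3, 82.7],  # spatial coordinates (micrometres)
    "y_um":    [30.2, 31.8, 55.0, 54.1],
    "HER2":    [4.2, 3.9, 0.3, 0.2],      # illustrative tumour-cell marker
    "CD8":     [0.1, 0.2, 3.5, 3.1],      # illustrative immune-cell marker
})

# Single-cell proteomics: classify each cell from its own protein profile.
cells["cell_type"] = ["tumour" if h > c else "immune"
                      for h, c in zip(cells["HER2"], cells["CD8"])]

# Spatial proteomics: ask how the populations are organised, e.g. the
# distance from each tumour cell to its nearest immune cell.
tumour = cells[cells["cell_type"] == "tumour"]
immune = cells[cells["cell_type"] == "immune"]
for _, t in tumour.iterrows():
    d = (((immune["x_um"] - t["x_um"]) ** 2 +
          (immune["y_um"] - t["y_um"]) ** 2) ** 0.5).min()
    print(f"cell {t['cell_id']}: nearest immune cell at {d:.1f} um")
```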

Proteins are typically the targets of cancer drugs, and measuring them at the cellular level allows us to identify different types of tumour cells, the immune cells that are present, and how the two interact. This data is highly relevant for informing clinicians of the best form of (immuno-)oncology and combinatorial treatment for individual patients. It is also highly relevant to pharma companies seeking to accelerate their oncology drug development, as it provides insight into a drug's mode of action and signatures that identify responders to novel drugs.

The data we are able to extract from different types of tumours is monumentally valuable, so the work doesn't stop there. All of the data we harness from these tumours is stored centrally, and we plan to build it into a system we refer to as the Digital Tumour, which will continuously improve the recommendations we can make to our clinical and pharma partners. Our journey has been rapid, though it is built on years of research and preparation: we founded the business in 2022 as a spin-off from the Bodenmiller Lab at the University of Zurich.

The dream became a reality for us in November 2022, when we secured a seed investment of CHF 7.5m. This seed funding will allow us to pursue our initial goals: establishing the company, achieving certification for our first diagnostic product, developing our Digital Tumour, and, by extension, collaborating with pharma and biotech partners in oncology drug development. It has also given us the resources we need to move to our own premises: we are due to move off the university campus in May 2023. This gives us a great opportunity to push forward with the certification processes for our new lab, and the chance to grow our team and expand our operation. We will be located in a start-up campus for life science organisations in the Zurich region, so we'll be surrounded by companies operating in similar fields and at a similar scale.

Tell us more about the Digital Tumour – how does it work?

The Digital Tumour will be the accumulation of all the molecular data we have extracted from every tumour we have analysed, both to date and on an ongoing basis. Connected to that, we store information on clinical parameters and patient response to treatment. Over time, our aim is to use this central data repository to identify new tumour signatures and build a self-learning system that provides fully automated treatment suggestions for new patients, based on how their molecular properties compare to previously analysed patients who have been successfully treated.
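As a simplified illustration of that matching idea – a sketch only, with invented numbers, and not Navignostics' actual algorithm – one could imagine a nearest-neighbour lookup over the molecular profiles of previously treated patients:

```python
import numpy as np

# Hypothetical reference data: one molecular profile per previously
# analysed tumour, alongside the treatment that proved successful.
profiles = np.array([
    [0.9, 0.1, 0.4],   # patient A
    [0.8, 0.2, 0.5],   # patient B
    [0.1, 0.9, 0.3],   # patient C
])
treatments = ["immunotherapy X", "immunotherapy X", "chemotherapy Y"]

def suggest_treatment(new_profile, k=2):
    """Suggest the treatment most common among the k most similar
    previously treated tumours (Euclidean distance between profiles)."""
    distances = np.linalg.norm(profiles - new_profile, axis=1)
    nearest = np.argsort(distances)[:k]
    votes = [treatments[i] for i in nearest]
    return max(set(votes), key=votes.count)

# A new patient whose profile resembles patients A and B:
print(suggest_treatment(np.array([0.85, 0.15, 0.45])))  # immunotherapy X
```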

Sounds interesting – are there any challenges to working with a database of this size?

Our data storage is quite advanced, so volume isn’t really a challenge for us. Our main focus is standardising the input of data itself. The technology is based on years of research and the data analysis requires a great deal of experience and in-depth expertise. In order to extract the full value from this data, it must be completely standardised. Data integrity is therefore vital to our work, and allows us to get the maximum value from past analyses. Our past experience in the Bodenmiller Lab allowed us to develop standardised processes to ensure that all of our data is fully comparable, which means that we can learn more and more from our past data, and apply this to new cases that we analyse.

It is also important to report on our complex data in a comprehensive but easily interpretable manner to the clinician/tumour board who needs to organise a treatment plan. We’re currently working with our clinical collaborators to develop readily understandable and concise reporting outputs. Unlike genomics analysis, our reports focus on proteins in tissue, which is the same information that clinicians are used to working with. So, there is a common language there that offers us the unique opportunity to provide clinicians with data they can easily interpret and work with.

What does this kind of research and data mean for oncology – for pharmaceuticals, biologics, and healthcare?

It's important to note that personalised treatment approaches and precision medicine are not new concepts in the diagnostics space. However, our technology and algorithms allow us to extract novel types of biomarkers that were previously inaccessible or unknown, so we're helping to level the playing field and give clinicians and drug developers comprehensive information with which to individualise therapies.

Comprehensive tumour data is truly at the heart of what we do, and one key benefit of our technology is that we're able to analyse very small amounts of sample – such as fine needle biopsies – to provide therapy suggestions. We can also analyse biobanked tumour material, so if old material has been stored, we have the ability to analyse those samples retrospectively. Not only does this help us fuel our Digital Tumour with more data, it also allows us to examine new questions such as the long-term survival rates of patients with these tumours. This is of huge value to our product development pipeline, because it allows us to identify molecular differences between individuals that may not have been considered at a clinical level, but may have played a role in patients' responses to treatment and their survival outcomes in the long term.

This kind of retrospective data also plays a key role in the evolution of healthcare and drug development, as having the technologies available to acquire this sort of data and mine it to our advantage will provide enormous benefits. These include improving individual treatment courses for patients, as well as expediting the development of novel cancer drugs so pharma companies can get more effective treatments to market sooner.

For example, one commonly cited statistic is that 90% of clinical drug development fails across phase I, II, and III trials and drug approval. Often, this stems from a lack of information with which to identify the subset of patients most likely to benefit from a novel drug. Access to Navignostics' technology and algorithms, and to a database such as the Digital Tumour, offers the potential to pre-select the right patients to enrol in clinical trials, and to identify more easily the patients who do respond to the novel treatment. This could substantially expedite drug development at the trial stage and help bring more effective drugs to market.

Even unsuccessful trials offer valuable opportunities: it is possible to repurpose and reanalyse material from previous failed trials. Such high failure rates in clinical development mean that a large number of companies have invested millions of dollars in drugs that never came to fruition. If those companies want to re-mine their data, our team can reinterpret the existing work to identify more successful strategies – giving those drugs another chance and offering a better return on investment.

A failure no longer needs to be a failure. Navignostics and its offerings can bring value to our pharma and biotech partners, and will also bring direct benefit to patients and clinicians once we launch our diagnostics product. So, data from every facet of the oncology industry, from curing a patient to halting the development of a drug, can offer us valuable insight that both we and the Digital Tumour could learn from when developing treatments.

What does 2023 and beyond have in store for Navignostics?

The next three years will be critical for our work, and we have projected timelines and key milestones for our diagnostics developments that we aim to achieve before our next funding round. Along the way, we are actively speaking to biotech and pharmaceutical organisations to identify projects and build the foundations for long-lasting collaborations. We are looking forward to continuing Navignostics' successful development in 2023!

Scimcon is proud to showcase start-up companies like Navignostics, and we’re looking forward to seeing how the company will grow over the coming years.

To contribute to our industry leader blog series, or to find out more about how Scimcon supports organisations with lab informatics and data management solutions, contact us today.

Scimcon sponsors SmartLab Exchange and identifies priority themes for 2022 lab informatics

SmartLab Exchange, held April 26–27, 2022 at the InterContinental at Doral Miami in Doral, FL, is one of the leading global meetings for lab informatics leaders. Scimcon continued its proud sponsorship of this event, attending in person to facilitate one-to-one meetings with a number of informatics customers from big pharma and other lab-centric sectors. Scimcon sponsors SmartLab Exchange because it provides useful access to the community of senior R&D, Quality Assurance, and Quality Control decision-makers from industry in North America.

Speakers at the 2022 SmartLab Exchange included the best of the best, with attendees from Procter & Gamble, Biovia, Bayer, AstraZeneca, Sanofi, and Amgen, among others. SmartLab Exchange is invite-only, and this format means that sponsors, speakers, and delegates alike can access a closed community that meets their individual needs.

Feedback and Voice of the Industry

Geoff Parker and Dave Sanders attended for Scimcon, and during the event they took the opportunity to poll customers and contacts from many of the attending organizations to identify current 2022 trends in the lab informatics industry. SmartLab Exchange represents the lab informatics community across industries including:

  • Pharmaceutical
  • Bio-pharmaceutical
  • Biotech
  • Biobanking
  • Medical device
  • Petrochemical
  • Bio-fuel
  • Chemicals 
  • Cosmetics
  • Food & beverage
  • Defence
  • Forensics
  • Water
  • Environmental
  • Agriculture
  • Consumer Goods

Geoff and Dave spoke with representatives from a multitude of organizations to take a pulse of the trends in the industry. Geoff explains:

“Scimcon works globally as a lab informatics consultant and implementation partner, with big pharma and biotech companies as well as vaccine manufacturers. We tend to see similar challenges from lab to lab, from organization to organization, and it is useful to take events like SmartLab Exchange as a means of checking in and ensuring that our customers’ needs are current.”

Summary of trends in lab informatics for the modern lab

In the informal poll of attendees at SmartLab Exchange, Scimcon was able to identify key trends and themes that are important to the modern lab in 2022.     

The subjects identified as highest interest to the delegates were:

  • Data standardization
  • Data Quality and Integrity
  • Instrument Connectivity / IoT

Interest in product areas for the lab was high, especially for:

  • Scientific Data Management Systems
  • Lab Automation
  • ELN
  • LIMS

There was a general trend for interest and support in data integration and systems integration.

Scimcon sponsors SmartLab Exchange 2022, summarizes trends from laboratory informatics leaders

Geoff summarizes: “As lab informatics consultants with a global customer base in pharma and biopharma labs, it is important to us to check in with influential decision-makers from the lab. SmartLab Exchange gave us a valuable opportunity to poll the attendees and see trends that will impact the modern lab decision-maker, and it will help us at Scimcon hone the way we partner with our customers.”

Scimcon is proud to sponsor SmartLab Exchange, and support customers in life sciences with their lab informatics management and strategy. For more information about Scimcon’s services, contact us today.

Industry leader interviews: Mark Elsley

Mark, please introduce yourself

I am Mark Elsley, a Senior Clinical Research / Data Management Executive with 30 years’ experience in the pharmaceutical sector worldwide, with companies including IQVIA, Boehringer Ingelheim, Novo Nordisk and GSK Vaccines. I am skilled in leading multi-disciplinary teams through the full lifecycles of a breadth of clinical studies, including Real World Evidence (RWE) research. My specialist area of expertise is clinical data management, and I have written a book on this topic, “A Guide to GCP for Clinical Data Management”, published by Brookwood Global.

Please can you explain what data quality means to you?

Data quality is a passion of mine and now receives a lot of focus from the regulators, especially since the updated requirements for source data in the latest revision of ICH-GCP. It is a concept that is often ill-understood, leading organisations to continue collecting poor-quality data while risking rejection by the regulators.

White and Gonzalez1 created a data quality equation which I think is a really good definition: they suggested that Data Quality = Data Integrity + Data Management. Data integrity is made up of many components; the new version of ICH-GCP states that source data should be attributable, legible, contemporaneous, original, accurate, and complete. The Data Management part of the equation refers to the people who work with the data, the systems they use, and the processes they follow. Put simply, staff working with clinical data must be qualified and trained on the systems and processes, processes must be clearly documented in SOPs, and systems must be validated. Everyone working in clinical research must have a data focus… data management is not just for data managers!

By adopting effective strategies to maximise data quality, the variability of the data is reduced. This means study teams can enrol fewer patients while retaining sufficient statistical power (which also has a knock-on impact on the cost of managing trials).2 Fewer participants also means conclusions can be drawn more quickly, which ultimately allows new therapies to reach patients sooner.

Why is data quality such an important asset in pharma?

I believe that clinical trial data are vitally important. These data are the sole evidence regulators use to decide whether to approve a marketing authorization application, which ultimately allows us to improve patient outcomes by getting new, effective drugs to market faster. For a pharmaceutical company, clinical trial results can move the stock price, and hence the company’s value,3 by billions of dollars: on average, positive trials lead to a 9.4% increase, while negative trials contribute to a 4.5% decrease. The cost of managing clinical trials amounts to a median of US$41,413 per patient,4 or US$69 per data point (based on 599 data points per patient).5 In short, clinical data have a huge impact on the economics of the pharmaceutical industry.
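As a quick sanity check, the per-data-point figure is simply the cited per-patient cost divided by the typical number of data points per patient:

$$\frac{\text{US}\$\,41{,}413 \text{ per patient}}{599 \text{ data points per patient}} \approx \text{US}\$\,69 \text{ per data point}$$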

Why is the prioritization of data quality so important for healthcare organizations?

Healthcare organizations generate and use immense amounts of data, and good study data can go on to significantly reduce healthcare costs.6, 7 Capturing, sharing, and storing vast amounts of healthcare data and transactions, together with the rapid processing that big data tools make possible, have transformed the healthcare industry, improving patient outcomes while reducing costs. Data quality is not just a nice-to-have – high-quality data should be a point of emphasis for any healthcare organization.

However, when data quality is not treated as a top priority in health organizations, the negative impacts can be large. For example, Public Health England recently reported that nearly 16,000 coronavirus cases went unreported in England. When outputs such as these are unreliable, guesswork and risk in decision-making are heightened. The better the data quality, the more confidence users can have in the outputs they produce – lowering risk in the outcomes and increasing efficiency.

Data quality: where should organisations start?

ICH-GCP8 for interventional studies and GPP9 for non-interventional studies contain many requirements with respect to clinical data, so a thorough understanding of both is essential. It is impossible to achieve 100% data quality, so a risk-based approach will help you decide which areas to focus on. The most important data in a clinical trial are patient safety and primary endpoint data, so the study team should consider the risks to these data in detail. For example, for adverse event data, one risk to consider is the patient’s recall period if they visit the site infrequently: a patient is unlikely to have a detailed recollection of a minor event that happened a month ago. Collecting symptoms via an electronic diary could significantly reduce the risk and improve the data quality in this example. Risks should be routinely reviewed and updated as needed. By following the guidelines and adopting a risk-based approach to data collection and management, you can be sure that analysis of the key parameters of the study is robust and trustworthy.

If you were to give just one tip for ensuring data quality in clinical trials, what would it be?

Aside from the risk-based approach I mentioned before, another area I feel is important is to collect only the data you need; anything more is a waste of money and delays getting drugs to patients. Over-burdening sites and clinical research teams with huge volumes of data increases the risk of mistakes. I still see many studies in which data are collected but never analysed. It is better to collect only the data you need and dedicate the time saved to increasing the quality of that smaller dataset.

Did you know that:

In 2016, the FDA published guidance12 for late stage/post approval studies, stating that excessive safety data collection may discourage the conduct of these types of trials by increasing the resources needed to perform them and could be a disincentive to investigator and patient participation in clinical trials.

The guidance also stated that selective safety data collection may facilitate the conduct of larger trials without compromising the integrity and the validity of trial results. It also has the potential to facilitate investigators and patients’ participation in clinical trials and help contain costs by making more-efficient use of clinical trial resources.

What is the role of technology on data quality?

Technologies such as Electronic Health Records (EHR), electronic patient-reported outcomes (ePRO), drug safety systems, and other emerging digital technologies are currently used in many areas of healthcare. Technologies like these can increase data quality, but they simultaneously increase the number of factors involved: they impact costs, require the management of vendors, and add to the compliance burden, especially in the areas of vendor qualification, system validation, and transfer validation.

I may be biased as my job title includes the word ‘Data’ but I firmly believe that data are the most important assets in clinical research, and I have data to prove it!

Scimcon is proud to support clients around the globe with managing data at its highest quality. For more information, contact us.


References

1White, Christopher H., and Lizzandra Rivrea González. “The Data Quality Equation – A Pragmatic Approach to Data Integrity.” www.ivtnetwork.com, 17 Aug. 2015, www.ivtnetwork.com/article/data-quality-equation%E2%80%94-pragmatic-approach-data-integrity. Accessed 25 Sept. 2020.

2Alsumidaie, Moe, and Artem Andrianov. “How Do We Define Clinical Trial Data Quality If No Guidelines Exist?” Applied Clinical Trials Online, 19 May 2015, www.appliedclinicaltrialsonline.com/view/how-do-we-define-clinical-trial-data-quality-if-no-guidelines-exist. Accessed 26 Sept. 2020.

3Rothenstein, Jeffrey & Tomlinson, George & Tannock, Ian & Detsky, Allan. (2011). Company Stock Prices Before and After Public Announcements Related to Oncology Drugs. Journal of the National Cancer Institute. 103. 1507-12. 10.1093/jnci/djr338.

4Moore, T. J., Heyward, J., Anderson, G., & Alexander, G. C. (2020). Variation in the estimated costs of pivotal clinical benefit trials supporting the US approval of new therapeutic agents, 2015-2017: a cross-sectional study. BMJ open, 10(6), e038863. https://doi.org/10.1136/bmjopen-2020-038863

5O’Leary E, Seow H, Julian J, Levine M, Pond GR. Data collection in cancer clinical trials: Too much of a good thing? Clin Trials. 2013 Aug;10(4):624-32. doi: 10.1177/1740774513491337. PMID: 23785066.

6Khunti K, Alsifri S, Aronson R, et al. Rates and predictors of hypoglycaemia in 27 585 people from 24 countries with insulin-treated type 1 and type 2 diabetes: the global HAT study. Diabetes Obes Metab. 2016;18(9):907-915. doi:10.1111/dom.12689

7Evans M, Moes RGJ, Pedersen KS, Gundgaard J, Pieber TR. Cost-Effectiveness of Insulin Degludec Versus Insulin Glargine U300 in the Netherlands: Evidence From a Randomised Controlled Trial. Adv Ther. 2020;37(5):2413-2426. doi:10.1007/s12325-020-01332-y

8Ema.europa.eu. 2016. Guideline for good clinical practice E6(R2). [online] Available at: https://www.ema.europa.eu/en/documents/scientific-guideline/ich-e-6-r2-guideline-good-clinical-practice-step-5_en.pdf [Accessed 10 May 2021].

9Pharmacoepi.org. 2020. Guidelines For Good Pharmacoepidemiology Practices (GPP) – International Society For Pharmacoepidemiology. [online] Available at: https://www.pharmacoepi.org/resources/policies/guidelines-08027/ [Accessed 31 October 2020].

10Medical Device Innovation Consortium. Medical Device Innovation Consortium Project Report: Excessive Data Collection in Medical Device Clinical Trials. 19 Aug. 2016. https://mdic.org/wp-content/uploads/2016/06/MDIC-Excessive-Data-Collection-in-Clinical-Trials-report.pdf


12FDA. Determining the Extent of Safety Data Collection Needed in Late-Stage Premarket and Postapproval Clinical Investigations Guidance for Industry. Feb. 2016.

Digital transformation: Revolutionising the labs of the future

Scimcon has worked with many lab-based clients across a vast range of projects throughout our 20 years in the industry. Here we discuss the challenges labs are facing in 2020, and the digital transformation work needed to ensure the labs of the future can streamline and manage their data.


The limitations of the current laboratory information systems landscape

Today’s labs face similar challenges to camera companies. Manufacturers such as Nikon and Canon must now sell to a new generation of budding photographers, most of whom have grown up with increasingly high-quality smartphone cameras. Having access to technology designed for ease of use, this generation of users finds itself increasingly frustrated with the traditional technology and methods required to operate today’s ‘real’ cameras. Where smartphones offer instant uploads to online services, impressive results that leverage computational photography, and synchronicity between multiple devices, traditional cameras appear complicated, difficult to control, and impractical. Camera companies therefore face the challenge of building smartphone-like usability into their existing products, or they risk losing a whole demographic of potential customers.

Modern labs face a similar problem. As new generations of scientists join laboratory settings, many find the lack of synchronicity and usability in information management systems increasingly frustrating. Why can’t we check instruments remotely whenever we want? Why can’t data be transferred easily between devices or colleagues? Why isn’t all this information seamless? Limitations like these are hugely time-consuming, reduce productivity, and create security risks for data with minimal protection. Like the camera makers, we risk losing the best new talent to other areas of science. Digital transformation addresses these challenges head-on, with the potential to make your lab more intuitive and efficient.

What is digital transformation and how can it enhance your current lab setup?

Digital transformation involves integrating new technology and methods into existing lab technology. Although this is a relatively new development within the laboratory setting, lab managers have been quick to realise that digital transformation is essential to optimising workflows and productivity. In 2018, 70% of labs were reported to have a digital lab strategy in place or were working towards one1 – a number we can only expect to have increased significantly since then.

Significant effort has gone into laboratories over the past two decades or more, and it has delivered substantial benefits. This effort has focused on key lab workflows and the matching informatics systems, such as CDS, LIMS, ELN, LES, and SDMS, to mention a few. The next decade needs to build on this success to create a true digital laboratory.

Digital labs of the future: what can we expect?

Digital lab transformation is more than just implementing informatics systems; it involves taking these systems and pushing them a step further. For example, a lab might connect instruments bi-directionally to LIMS or ELN, but digital lab transformation would also facilitate online monitoring of instrument status, automatic ordering of consumables, reservation of instrument time, auto-tracking of utilisation, and the use of telemetry data to predict faults before they happen.
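As a flavour of what this could look like, here is a minimal sketch of such a telemetry check against a hypothetical instrument REST endpoint – the URL, field names, and thresholds are illustrative assumptions, not any specific vendor’s API:

```python
import requests

# Hypothetical REST endpoint exposed by a connected lab instrument.
INSTRUMENT_URL = "https://lab.example.com/api/instruments/hplc-01"

def check_instrument():
    """Poll instrument status and telemetry, and flag likely issues."""
    status = requests.get(f"{INSTRUMENT_URL}/status", timeout=5).json()
    telemetry = requests.get(f"{INSTRUMENT_URL}/telemetry", timeout=5).json()

    # Online monitoring: is the instrument available right now?
    if status.get("state") != "idle":
        print(f"Instrument busy or offline: {status.get('state')}")

    # Predictive maintenance: drifting telemetry (say, pump pressure)
    # can indicate a fault before it causes a failed run.
    if telemetry.get("pump_pressure_bar", 0) > 180:
        print("Pump pressure trending high - schedule maintenance")

    # Consumables: trigger a reorder before the lab runs dry.
    if telemetry.get("solvent_level_pct", 100) < 10:
        print("Solvent low - raising purchase order")

check_instrument()
```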

A digital lab may also use a feature-rich LIMS, ELN, or LES that enables collation and review of all results for an experiment, but a digitally transformed lab would be able to collate results across potentially several LIMS and ELNs throughout an organisation, as sketched below. This promotes internal and external collaboration, enabling the ‘science later’ paradigm of cross-team, cross-technique, and cross-experiment data mining – which, in turn, will advance artificial intelligence and machine learning.
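In principle, the collation step amounts to normalising each system’s extract into a common schema with provenance tags. A minimal sketch, with hypothetical field names and source systems:

```python
import pandas as pd

# Hypothetical result extracts from two different informatics systems.
lims_a = pd.DataFrame({
    "sample": ["S-001", "S-002"],
    "assay":  ["purity", "purity"],
    "value":  [99.1, 98.7],
})
eln_b = pd.DataFrame({
    "SampleID": ["S-003"],
    "Test":     ["purity"],
    "Result":   [99.4],
})

# Normalise each source into one schema, tagging provenance, so results
# can be reviewed and mined across the whole organisation.
combined = pd.concat([
    lims_a.rename(columns={"sample": "sample_id", "assay": "test",
                           "value": "result"}).assign(source="LIMS-A"),
    eln_b.rename(columns={"SampleID": "sample_id", "Test": "test",
                          "Result": "result"}).assign(source="ELN-B"),
], ignore_index=True)

print(combined)
```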

Overall, a digital transformation is more than just providing scientists with the means to spend more time on actual science. It provides the complete toolset of a lab wherever a scientist may be, whether that is in the lab itself, in an off-site office, in a café or even at the kitchen table.

At present, even top laboratories face problems with a lack of modernisation, and this is a problem that is slowly trickling down to smaller labs that are starting to face similar challenges. If we continue to drive forward with the help of innovative technology, we could expect to see many labs becoming more efficient, more supportive of science and more reliable than ever before.

However, to do this, laboratory leaders must have a clear vision of where they see their lab going. It is hard to transform any business piecemeal, so it falls to senior lab personnel to decide what steps to take to ensure their labs are working at optimal capacity and potential. This is where Scimcon can help.

How can Scimcon help to revolutionise your lab?

Scimcon is proud to offer a range of digital lab services to assist in digitising a lab, many of which are outlined in our introductory blog. We are also able to help labs go that step further, with our collective wealth of experience in the lab, both as scientists and project leaders. Whether it is the development of the strategy, the running of the programme, or providing resources and leadership for your projects, Scimcon can help you understand what you want to achieve, and how to reach it.

To find out more about types of projects we support, and how we can help you to transform your lab, get in touch.

Reference:

1 ‘Despite steady growth in digital transformation initiatives, companies face budget and buy-in challenges’, https://www.zdnet.com/article/survey-despite-steady-growth-in-digital-transformation-initiatives-companies-face-budget-and-buy-in/
