As an information systems consultancy dedicated to successfully delivering lab-based information systems, we help our clients to overcome many different challenges. There are some important questions that we are frequently asked to evaluate.
In part one of this blog series, we’ll summarise the considerations involved in answering three common questions about lab informatics systems, all on the theme of ‘is a single system better than multiple similar systems?’
Here the context matters. If one were to generalise, R&D labs tend to be experiment-based, answering questions like ‘What ingredient changes in the product formulation will increase effectiveness and reduce environmental impact?’. QC labs, on the other hand, are more focused on samples taken from production runs, and questions such as ‘Is the % composition of key ingredients in a production batch within specification?’
If we apply lab informatics thinking to the above generalisation, then in broad terms ELNs are centred on recording experiments and therefore more suited to R&D, while LIMS, being sample-, test- and result-oriented, are generally more suitable for QC labs.
However, it is not that simple. For example, perhaps one of the R&D labs provides analytical services to various teams executing R&D experiments – this type of ‘service’ lab is often better served by LIMS than ELNs.
The type of lab involved is not the only factor to consider. For example, CDS systems are generally applicable to both R&D and QC. The methods and use of the instruments may well vary across R&D and QC, but the instrument data systems can be exactly the same.
Finally, regulatory needs, specifically for QC, can also be a driving factor in answering the question. We will consider this further in one of the following questions.
When Scimcon first started nearly three decades ago, the focus within large multi-national companies was on implementing large, monolithic lab systems. This approach still has its place, particularly where the distributed labs are very close in terms of operational and analytical workflows.
Current thinking, however, looks to best support the diversity of lab workflows across global sites. While this should not mean a different system in every single lab, it should ensure some flexibility in selecting systems locally. This has several benefits, including a better informatics fit for each lab, and the increased local user buy-in gained by allowing flexibility.
However, against the background of the drive to increased data aggregation, data science and analytics, and AI/ML, this local approach can be counterproductive. It is therefore important to set standards and guardrails about how these systems are implemented, and how the data is structured and linked via reference data, so that consolidation into centralised reporting tools and data lakes is facilitated.
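In practice, the reference-data guardrail described above can be as simple as a shared mapping that translates each system’s local test codes into canonical identifiers before results land in the data lake. The sketch below is purely illustrative; the system names, test codes and field names are invented, not taken from any real deployment.

```python
# Hypothetical sketch: harmonising records from local lab systems against
# shared reference data before loading them into a central data lake.
# All identifiers below are invented for illustration.

CANONICAL_TESTS = {
    # (local system, local test code) -> shared reference-data identifier
    ("lims_site_a", "HPLC-01"): "ASSAY_HPLC_PURITY",
    ("eln_site_b", "purity_hplc"): "ASSAY_HPLC_PURITY",
}

def to_canonical(system: str, record: dict) -> dict:
    """Replace a local test code with its canonical identifier,
    keeping the original fields for traceability."""
    key = (system, record["test_code"])
    if key not in CANONICAL_TESTS:
        # Surfacing unmapped codes early keeps the data lake consistent.
        raise ValueError(f"No reference-data mapping for {key}")
    return {**record,
            "canonical_test": CANONICAL_TESTS[key],
            "source_system": system}

row = to_canonical("lims_site_a", {"test_code": "HPLC-01", "result": 99.2})
```

With a mapping like this maintained centrally, two labs can keep their local naming conventions while their results still aggregate cleanly in downstream reporting tools.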
There is a well-used saying within regulatory-compliant organisations: ‘If a system contains just 1% of GxP data, then the whole system is required to be implemented, managed and maintained in a regulatory compliant manner.’
This statement leaves compliant organisations questioning:
The first step to answering the question is to determine the delta between administering a GxP system and administering a non-GxP system. LIMS, ELN, SDMS, CDS and other lab informatics systems are often classified by labs as mission-critical. Most organisations wouldn’t countenance a lack of system administration rigour or releasing untested changes to mission-critical systems, so this delta may be lower than it first seems.
The next step is an open conversation with QA teams about the types of data being held, and the control systems that will be put in place. In the past, we have successfully taken a two-tier approach, where the administration procedures for non-GxP are simpler than those for GxP data in the same system. However, for this type of arrangement to be viable, a detailed risk assessment is required, and the ongoing management and control of the administration has to be very well executed.
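A two-tier arrangement like this is ultimately a policy decision, but it helps to make the policy explicit rather than leaving it in people’s heads. The fragment below is a hypothetical sketch of how such a policy might be encoded; the tier names and procedure details are assumptions for illustration, not a recommended standard.

```python
# Illustrative two-tier administration policy: the change-control rigour
# applied depends on whether the affected records touch GxP data.
# Tier names and procedure values are invented for this example.

TIER_RULES = {
    "gxp":     {"change_approval": "QA",       "testing": "full validation"},
    "non_gxp": {"change_approval": "lab lead", "testing": "smoke test"},
}

def admin_procedure(contains_gxp_data: bool) -> dict:
    """Return the administration procedure tier for a proposed change."""
    return TIER_RULES["gxp" if contains_gxp_data else "non_gxp"]
```

The point of writing the policy down like this is that the risk assessment and QA conversation then have a concrete artefact to review and sign off.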
Finally, before making the decision, it’s worth considering whether there are shared services or functions involved. For example, if the GxP and non-GxP work uses the same inventory management, it might be complex to get the inventory system interfacing and updating two systems simultaneously.
Hopefully, we have illustrated the importance of being clear about what your requirements are before answering these key questions about lab informatics systems. Each case is unique, and your decision will usually be based on a wide range of influencing factors. We help organisations to consider all of the options and roll out their chosen model.
Stay tuned for part 2 of this blog series, where we will look at the key question of how you can prepare your data for AI and machine learning.
Industry leader interviews: Pascale Charbonnel

In February and March 2023, Scimcon is hosting panel discussions at both SmartLab Exchange Europe and SmartLab Exchange US. The events, taking place in Amsterdam, The Netherlands and San Diego, USA, are held annually as a forum for scientists in the modern lab to interact, form new connections, and learn more about the evolving technology that is disrupting the lab. During the conferences, attendees and speakers will debate themes including Lab of the Future, data, digitalisation, quality management and standardisation.
As a sponsor and panel chair in 2023, Scimcon opened with the panel discussion ‘What Is The Future For Human Scientists as AI & ML Deliver the Promised Step Change in Laboratory Practice?’, exploring the future of human input in the lab, and how artificial intelligence (AI) and machine learning (ML) could impact the structures and processes in place.
Following introductions by Birthe Nielsen of the Pistoia Alliance, the session discussions are led by Geoff Parker, co-founder of Scimcon. The Amsterdam panel took place on Wednesday 22nd February 2023 and featured key opinion leaders including Edith Gardenier from Genmab, and Andy Phillips and Robin Brouwer from AstraZeneca. The San Diego panel is scheduled for Wednesday 22nd March 2023, and panel participants include Robert Pluim from Genmab, Miu-Ling Lau from Merck, and Scott Stanley from the University of Kentucky.
AI and ML are everywhere we look – in the news, on our phones and other smart devices, and are increasingly making their way into other areas of our daily lives. In transport, we’re seeing steps being made towards self-driving vehicles. But what will happen to those engaged with the transport sector when human input is no longer required?
The same questions can be asked about the lab. We have seen similar disruptions in the past, and many scientists will still remember the days of cutting out chromatograms to weigh them and calculate peak areas – a task which now is fully automated. Through the employment of similar automated technologies – from sample prep, to HTS, and sophisticated instrumentation – we have been able to give more time back to scientists, to allow them to spend longer on the science that matters.
Our panel at SmartLab Exchange Europe and US will dig deeper into AI and ML, and how they will impact the role played by human scientists in years to come.
The panellists will debate the big questions facing scientists on the topics of AI and ML during the sessions, including:
Following the SmartLab Exchange, Scimcon will summarise topics of key interest to the audiences in a future blog.
To join the discussion and hear more about how AI/ML will impact laboratories and scientific operations, contact our team for more information.
Industry leader interviews: Jana Fischer

2020 has been a difficult year for most industries, not least for event and tradeshow providers. Luke Gibson, Founding Director of Open Pharma Research and Lab of the Future, shares his experience of running events in the laboratory industry, and what makes Lab of the Future such a unique event.
My name is Luke Gibson, and I am one of the three founding directors of Open Pharma Research. I have 30-plus years of experience in developing and running events, primarily in the financial and trade and commodity sectors. My colleagues Kirianne Marshall and Zahid Tharia bring a similar level of experience to the company.
Kirianne has had many years of experience in managing the commercial side of large congresses, such as Partnering in Clinical Trials, and research and development congresses. Zahid has 30 years of events experience too, particularly in running life science portfolios, and launching congresses and events. Our paths have crossed many times throughout our years working in events, and we eventually hit a point where all three of us had the capacity to try something new – something that was worthwhile, fun, and different to the corporate worlds we had become accustomed to. That was why we created Lab of the Future – with a view to running events in a different way.
I’m not sure if I would describe it as a gap in the market, more an ambition to do things differently. There was a desire from all of us to build an event with a different approach to the one we would take when working for large organisations, because when you’re working on a large portfolio of global events that cover a variety of topics, you and your team are always looking ahead to the next event, and the focus on the longevity of a single event isn’t always there.
We wanted something that we can nurture and grow, something that we can work on year-round without getting distracted by the next thing on our list. It also allows us to stay within this space and build our community, without having to face pressures such as a year-on-year development strategy or diverse P&L. Our desire was to avoid these constraints, and create an event that we can continue to work on for a long time.
We want to be able to live and breathe Lab of the Future, but one of the interesting things about it is that it’s such a broad concept. On the one hand we deal with informatics, but on the other hand, we deal with equipment, technology, and all the connectivity between them – but even that’s just one part of it. We are not an informatics conference; we are not strictly an instrumentation conference; we also look at the innovation side of things.
I think the best way to describe how we see Lab of the Future is as a proxy for how you do science in the future. Everything pertains to more efficient processes; better results; or ways of creating breakthrough innovation, and these are all part of the picture of science in the future. And that is the lab of the future – where the lab is the proxy for the environment where you do the science that matters.
When we started off, we found we received a lot of queries from industry contacts who wanted to get involved, but certain topics they wanted to discuss didn’t necessarily pertain to the physical laboratory itself. But if it was relevant to science, then it was relevant to us. Things like data clouds and outsourced services may not be directly linked to the lab, but they still relate to how you work. So, within that, the scope for the Lab of the Future gets wider still, looking at areas such as how we can create virtual clinical trials, or use real world-data to feed back into R&D.
People are also keen to learn more from their peers and from other areas of the industry. Lab of the Future allows us to host senior speakers and keynotes who can tell us where we’re heading, and show us how the efforts of one area within life science feed into other areas. It presents us with an almost ever-changing jigsaw image, and it’s this strategic element that I think sets us apart from other events.
We attract a real mix of attendees, and that’s what I love about it. You can run a conference for people in a specific job function, such as a data scientist or an R&D manager, but what people really want to know is what the people around them are doing, to almost give them context of the industry as a whole. So, our conference doesn’t just exist to help you do your own job better, but it helps you to develop a concept of where your department is heading in the future, and what you should think about longer term. We aren’t telling scientists how to do their job today; we’re helping them think about their responsibilities for delivery in the future. Lab of the Future is about the delivery of science of the future.
Our sponsors and solution providers that support the conference are also very much part of our community, as they’re all innovating and making waves in this space as well. They’re in a space that’s always evolving to build the Lab of the Future; and they are part of that solution. So, we don’t merely facilitate a conference of buying and selling between providers and services, we offer a space where everyone is evolving together. It’s a real melting pot, and that’s the fun bit really.
Zahid’s background in life sciences definitely gave us a starting point. Further to that, we’ve found that every time we put something out, our community engages, and as a consequence we’re introduced to people we never expected to be introduced to. The fact we’re always talking to people enriches our content – the people we meet and conversations we have change our way of thinking, and shape what we’re doing.
Although I’m in charge of our marketing operations, I have to say I’m not always sure where some of our contacts come from! One thing I’ve found quite surprising is the lack of reliance on a database – there’s a lot of power in word-of-mouth, especially in this space where everyone is working on something – why not share that? As we’re seen as adding value to the conversation, it allows people to find us through their connections and our supporters.
Scimcon is proud to sponsor Lab of the Future, and we can’t wait to see you at the Autumn virtual congress on 26–27th October 2021. Contact us today to learn more about our participation in the event, and stay tuned on our Opinion page for part 2 of our conversation with Luke.
From vision to reality: enabling true digital transformation in the lab

I am Mark Elsley, a Senior Clinical Research / Data Management Executive with 30 years’ experience working within the pharmaceutical sector worldwide for companies including IQVIA, Boehringer Ingelheim, Novo Nordisk and GSK Vaccines. I am skilled in leading multi-disciplinary teams through full project lifecycles to conduct a breadth of clinical studies, including Real World Evidence (RWE) research. My specialist area of expertise is clinical data management, and I have published a book on this topic, “A Guide to GCP for Clinical Data Management”, with Brookwood Global.
Data quality is a passion of mine and now receives a lot of focus from the regulators, especially since the updated requirements for source data in the latest revision of ICH-GCP. It is a concept which is often ill-understood, leading organisations to continue collecting poor-quality data whilst risking its rejection by the regulators.
White and Gonzalez1 created a data quality equation which I think is a really good definition: they suggested that Data Quality = Data Integrity + Data Management. Data integrity is made up of many components. The new version of ICH-GCP states that source data should be attributable, legible, contemporaneous, original, accurate, and complete. The Data Management part of the equation refers to the people who work with the data, the systems they use and the processes they follow. Put simply, staff working with clinical data must be qualified and trained on the systems and processes, processes must be clearly documented in SOPs, and systems must be validated. Everyone working in clinical research must have a data focus… Data management is not just for data managers!
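To make the integrity half of the equation concrete, the sketch below encodes a few of the ICH-GCP source-data criteria as record-level checks. It is an illustrative fragment only – the field names and rules are assumptions invented for this example, not part of any standard or validated system.

```python
# Hypothetical record-level checks inspired by the ICH-GCP source-data
# criteria (attributable, original, complete). Field names are invented.

REQUIRED_FIELDS = {"value", "unit", "recorded_by", "recorded_at", "source"}

def integrity_issues(record: dict) -> list:
    """Return a list of integrity problems found in a single data record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:                                 # complete
        issues.append(f"incomplete: missing {sorted(missing)}")
    if not record.get("recorded_by"):           # attributable
        issues.append("not attributable: no author recorded")
    if record.get("source") != "original":      # original
        issues.append("not original: derived or transcribed copy")
    return issues

good = {"value": 5.2, "unit": "mmol/L", "recorded_by": "jdoe",
        "recorded_at": "2021-03-01T09:15:00Z", "source": "original"}
```

Checks like these cover only the machine-verifiable part of the equation; the people-and-process half of Data Management still has to come from trained staff and documented SOPs.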
By adopting effective strategies to maximise data quality, the variability of the data is reduced. This means study teams can enrol fewer patients while maintaining sufficient statistical power (which also has a knock-on impact on the cost of managing trials).2 Enrolling fewer participants also leads to quicker conclusions, which ultimately allows new therapies to reach patients sooner.
I believe that clinical trial data are vitally important. These assets are the sole basis on which regulators decide whether to approve a marketing authorisation application, which ultimately allows us to improve patient outcomes by getting new, effective drugs to market faster. For a pharmaceutical company, the success of clinical trial data can influence the stock price and hence the value of the company3 by billions of dollars. On average, positive trials lead to a 9.4% increase while negative trials contribute to a 4.5% decrease. The cost of managing clinical trials amounts to a median of US$41,413 per patient4, or US$69 per data point (based on 599 data points per patient)5. In short, clinical data have a huge impact on the economics of the pharmaceutical industry.
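The per-data-point figure follows directly from the two per-patient numbers quoted, which is easy to verify:

```python
# Median trial cost per patient divided by data points per patient
# gives the quoted per-data-point cost.
cost_per_patient = 41_413   # US$, median cost per patient
data_points = 599           # data points per patient

cost_per_data_point = cost_per_patient / data_points
print(round(cost_per_data_point))  # 69
```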
Healthcare organisations generate and use immense amounts of data, and good study data can go on to significantly reduce healthcare costs6, 7. Capturing, sharing, and storing vast amounts of healthcare data and transactions, together with the rapid processing that big data tools enable, have transformed the healthcare industry by improving patient outcomes while reducing costs. Data quality is not just a nice-to-have – high-quality data should be a priority for any healthcare organisation.
However, when data quality is not treated as a top priority in health organisations, the negative impact can be large. For example, Public Health England recently reported that nearly 16,000 coronavirus cases went unreported in England. When outputs such as this are unreliable, guesswork and risk in decision-making are heightened. The better the data quality, the more confidence users will have in the outputs they produce, lowering risk in the outcomes and increasing efficiency.
ICH-GCP8 for interventional studies and GPP9 for non-interventional studies contain many requirements with respect to clinical data, so a thorough understanding of those is essential. It is impossible to achieve 100% data quality, so a risk-based approach will help you decide which areas to focus on. The most important data in a clinical trial are patient safety and primary end point data, so the study team should consider the risks to these data in detail. For example, for adverse event data, one of the risks to consider could include the recall period of the patient if they visit the site infrequently. A patient is unlikely to have a detailed recollection of a minor event that happened a month ago. Collection of symptoms via an electronic diary could significantly decrease the risk and improve the data quality in this example. Risks should be routinely reviewed and updated as needed. By following the guidelines and adopting a risk-based approach to data collection and management, you can be sure that analysis of the key parameters of the study is robust and trustworthy.
Aside from the risk-based approach which I mentioned before, another area which I feel is important is to only collect the data you need; anything more is a waste of money, and results in delays getting drugs to patients. If you over-burden sites and clinical research teams with huge volumes of data this increases the risks of mistakes. I still see many studies where data are collected but are never analysed. It is better to only collect the data you need and dedicate the time saved towards increasing the quality of that smaller dataset.
Did you know that:
In 2016, the FDA published guidance12 for late-stage/post-approval studies, stating that excessive safety data collection may discourage the conduct of these types of trials by increasing the resources needed to perform them, and could be a disincentive to investigator and patient participation in clinical trials.
The guidance also stated that selective safety data collection may facilitate the conduct of larger trials without compromising the integrity and the validity of trial results. It also has the potential to facilitate investigators and patients’ participation in clinical trials and help contain costs by making more-efficient use of clinical trial resources.
Technologies such as Electronic Health Records (EHR), electronic patient reported outcomes (ePRO), drug safety systems and other emerging digital technologies are currently being used in many areas of healthcare. They can increase data quality but simultaneously increase the number of factors involved: they impact costs, involve the management of vendors, and add to the compliance burden, especially in the areas of vendor qualification, system validation, and transfer validation.
I may be biased as my job title includes the word ‘Data’ but I firmly believe that data are the most important assets in clinical research, and I have data to prove it!
Scimcon is proud to support clients around the globe with managing data at its highest quality. For more information, contact us.
1White, Christopher H., and Lizzandra Rivrea González. “The Data Quality Equation—A Pragmatic Approach to Data Integrity.” Www.Ivtnetwork.Com, 17 Aug. 2015, www.ivtnetwork.com/article/data-quality-equation%E2%80%94-pragmatic-approach-data-integrity#:~:text=Data%20quality%20may%20be%20explained. Accessed 25 Sept. 2020.
2Alsumidaie, Moe, and Artem Andrianov. “How Do We Define Clinical Trial Data Quality If No Guidelines Exist?” Applied Clinical Trials Online, 19 May 2015, www.appliedclinicaltrialsonline.com/view/how-do-we-define-clinical-trial-data-quality-if-no-guidelines-exist. Accessed 26 Sept. 2020.
3Rothenstein, Jeffrey & Tomlinson, George & Tannock, Ian & Detsky, Allan. (2011). Company Stock Prices Before and After Public Announcements Related to Oncology Drugs. Journal of the National Cancer Institute. 103. 1507-12. 10.1093/jnci/djr338.
4Moore, T. J., Heyward, J., Anderson, G., & Alexander, G. C. (2020). Variation in the estimated costs of pivotal clinical benefit trials supporting the US approval of new therapeutic agents, 2015-2017: a cross-sectional study. BMJ open, 10(6), e038863. https://doi.org/10.1136/bmjopen-2020-038863
5O’Leary E, Seow H, Julian J, Levine M, Pond GR. Data collection in cancer clinical trials: Too much of a good thing? Clin Trials. 2013 Aug;10(4):624-32. doi: 10.1177/1740774513491337. PMID: 23785066.
6Khunti K, Alsifri S, Aronson R, et al. Rates and predictors of hypoglycaemia in 27 585 people from 24 countries with insulin-treated type 1 and type 2 diabetes: the global HAT study. Diabetes Obes Metab. 2016;18(9):907-915. doi:10.1111/dom.12689
7Evans M, Moes RGJ, Pedersen KS, Gundgaard J, Pieber TR. Cost-Effectiveness of Insulin Degludec Versus Insulin Glargine U300 in the Netherlands: Evidence From a Randomised Controlled Trial. Adv Ther. 2020;37(5):2413-2426. doi:10.1007/s12325-020-01332-y
8Ema.europa.eu. 2016. Guideline for good clinical practice E6(R2). [online] Available at: https://www.ema.europa.eu/en/documents/scientific-guideline/ich-e-6-r2-guideline-good-clinical-practice-step-5_en.pdf [Accessed 10 May 2021].
9Pharmacoepi.org. 2020. Guidelines For Good Pharmacoepidemiology Practices (GPP) – International Society For Pharmacoepidemiology. [online] Available at: https://www.pharmacoepi.org/resources/policies/guidelines-08027/ [Accessed 31 October 2020].
10Medical Device Innovation Consortium. Medical Device Innovation Consortium Project Report: Excessive Data Collection in Medical Device Clinical Trials. 19 Aug. 2016. https://mdic.org/wp-content/uploads/2016/06/MDIC-Excessive-Data-Collection-in-Clinical-Trials-report.pdf
11O’Leary E, Seow H, Julian J, Levine M, Pond GR. Data collection in cancer clinical trials: Too much of a good thing? Clin Trials. 2013 Aug;10(4):624-32. doi: 10.1177/1740774513491337. PMID: 23785066.
12FDA. Determining the Extent of Safety Data Collection Needed in Late-Stage Premarket and Postapproval Clinical Investigations Guidance for Industry. Feb. 2016.
The role of AI and ML in the future of lab informatics

As a leader in a pharmaceutical or life sciences organisation, getting the most out of your team and resources is always a top priority. After making the decision to proceed with a critical investment in consulting services, there may even be more pressure to find the optimal use of these time-limited external resources. So, how can you make sure you are using these resources to their full potential? In this blog, our industry expert Micah Rimer will show you how.
During Micah’s 20 years working at big pharma and vaccine corporations, including Bayer, Chiron, Novartis and GSK, he has successfully deployed consultancy groups within lab informatics and clinical projects. Micah has worked with Scimcon to support his teams on high-profile critical projects.
As with any business situation, it is important that there is a common goal that everyone is aligned around.
It is essential that you do not waste valuable time revisiting the same conversations. Ask yourself: “Is it obvious what problem we are trying to solve?” Often, issues can arise when people are arguing about implementing a solution, whilst losing sight of the challenge at hand.
Take the example of Remote Clinical Monitoring: You might decide that it would be beneficial to have your Clinical Research Associates (CRAs) track and monitor the progress of a clinical study without traveling to clinical sites. That sounds like it could be very promising, but what is the problem that needs to be solved?
Without clear goals on what you want to accomplish with Remote Clinical Monitoring, it will be difficult to declare an implementation a success. In addition, if you and your organisation do not know what you are trying to achieve with a particular technical solution, it will be impossible to give your informatics consultants a clear set of deliverables.
So, first things first, agree on the problem statement!
One of the first times I hired Scimcon to support me with an informatics project, I had recently joined a pharma company and found myself in the middle of conflicting department objectives, with what seemed to be no clear path out of the mess I had inherited. The organisation had purchased an expensive new software system that had already become a failed implementation. After spending a year continuously configuring and programming it, it was no closer to meeting the business needs than when the project had started. There were two loud criticisms to address on that point:
This also highlighted a far wider range of issues, such as people feeling their skills were not being properly utilised while problems went unsolved, and doubts over whether the bioinformatics department had the right goals to begin with.
To solve this challenge, we sat down with Scimcon to identify all the different problems associated with the inherited project, and to clarify what we needed to do to turn it into a success. In taking time to review the situation and without too much effort, we were able to come up with four key areas to address:
With the help of Scimcon, we were able to define these problems and then focus on finding answers to each of the questions. In the end it turned out to be one of our most successful engagements together – award-winning, even. By simply asking senior management what their biggest challenge was, we found their overriding priority was to have an overview of all the R&D projects going on. And while the new software was not particularly well suited to the bioinformatics problem it had been acquired for, it could easily be used to map out the R&D process for portfolio tracking. We then turned our attention to the bioinformatics problem, which was easily solved by a bit of custom code from one of the bioinformatics programmers who had previously felt his skills were not being properly utilised.
Once we knew where we were, and where we wanted to get to, all we had to do was get there one challenge at a time.
Once you have identified and agreed on the problem that you want to solve, the next step is making sure the organisation is ready to work with your consultants. As with all relationships, business or otherwise, a crucial step is to make sure that everyone has the same expectations, and that all the relevant stakeholders are on the same page.
People have many different perspectives on why consultants are brought in.
As there can be so many different roles and perspectives on the use of consultants, you need to make sure that you address all the different stakeholder perspectives. It is important to establish a positive situation, as you want the consultants to be able to work with your teams without unnecessary tension.
When I was just starting out with my first LIMS (Laboratory Information Management System) implementation, I remember being impressed that you could hire someone with the specific experience and expertise to guide you on something they had done before but that was new to you. I wondered: why was that not done all the time? Why do so many implementation projects fail when you can bring in people who have solved that particular problem before? When I asked Russell Hall, the Scimcon consultant on that first project, he said that not everyone is comfortable admitting they need help. As my career has progressed, I have come to value that feedback more and more. There are many people who are highly competent and effective in their jobs, but are not comfortable with the appearance that they are not sufficient on their own. It is always important to manage for those situations, rather than assuming that everyone will welcome external help.
Lastly, it is also critical to manage expectations regarding the use of consultants. Your boss may need to defend the budget, or be prepared to stand behind recommendations or conclusions that are delivered by people outside of the organisation. It should also be considered that management might not readily accept something that seems obvious to employees working at a different level. By liaising with senior leaders from the outset, you can make sure both parties are aligned on how the consultants will interact with people in the company, and what their role will be. This is important both to achieve what you want internally, and to make sure the consultants have a proper expectation of how their efforts will be utilised.
While it can be very tempting to feel that you can leave the majority of the project to the experts, the reality is things rarely go as smoothly as planned. As the life science business and information management have advanced over the last few decades, the amount of complexity and details has grown tremendously. It is more and more difficult for a single person to maintain an overview of all the relevant facts. The only way to be successful is to communicate and make sure that the right people have the right information at the right time. Your consultants are no different.
Many organisations have challenges in terms of taking decisions and communicating them effectively. For your consultants, who do not typically have the same access and networks in the organisation that internal staff do, it is imperative that you make sure they are kept up to date. You want to avoid them spending valuable time focusing on areas and deliverables that have shifted to being less important. Finding ways to keep consultants informed on all the latest developments is absolutely necessary for them to be able to deliver successfully. Figure out what makes sense by considering the organisation culture and the consulting engagement setup. Whether it is by use of frequent check-ins or online collaboration, be prepared to put in additional effort to make sure that the information gets to where it needs to go.
As well as good communication, organisations have to be able to adjust as needed. Occasionally everything does work out according to plan, but that is more the exception than the rule when it comes to complex life science informatics projects. While timelines and commitments are critical, it is important to view any project as a collaboration. There will be unexpected software issues. There will be unplanned organisational changes and problems. People get sick, life happens. By having open and continuous dialogue, you can be best prepared to make the adjustments needed to find solutions together to unexpected problems.
Consultants can be hugely valuable to you and your organisation.
But you have to set up the right conditions for everything to work out well.
Working together, you can get to where you need to go.
If you’re interested in working with Scimcon on your upcoming informatics project, contact us today for a no-commitment chat about how we can help you succeed.