The evolution of pharmacovigilance?

Jamie, please tell us a bit about yourself and your background 

My name is Jamie Portnoff, and I am the founder and principal consultant at JMP Consulting. JMP Consulting assists clients in the pharmaceutical industry to achieve and sustain compliance and improve overall performance in pharmacovigilance (PV) and related functions like quality, medical information and regulatory affairs. Before founding JMP Consulting, I worked in the pharmaceutical industry. Not many management consultants working in PV have hands-on, real-world PV experience; this experience means I understand the realities of day-to-day work in and around PV, and how challenging it can be to deliver against requirements and expectations. In my earliest days in industry, I especially enjoyed working with people and on projects, and I soon realised that I wanted to marry my problem-solving and analytical skills with my practical industry knowledge. After a few years of working with big consultancy companies, I decided to start JMP Consulting.

Big changes are coming in PV, but before we look at the future, we need to understand the past

Let us look at the last three decades.

In the 1990s there were basic PV safety database systems, such as ArisG, ArisLite and ClinTrace. Fax machines were a huge part of the technology that enabled PV processes, with a high volume of incoming and outgoing data sent by fax. Processes were extremely paper-intensive and were designed to accommodate transactional work, such as processing cases and putting aggregate reports together; everything was very compliance-focused. Consequently, there was demand for full-time roles dedicated to paper management, typing up documents and data entry. Teams were typically regionalized, and everything was done “onshore”.

In the 2000s, PV technology became more sophisticated and more globally oriented. There were advances in what the technology could do, and consolidation of major tech players due to M&A activity. Paper-based processes began to give way to more digitization and electronic workflow management. Analytics tools became more prevalent and more user-friendly. However, a typical PV department was still very paper-intensive. Some of the regionalized models began to consolidate to one system, one process and one organization, particularly between the US and Europe.

Throughout this decade, more stringent regulatory requirements were continually being introduced, such as the Risk Evaluation and Mitigation Strategy (REMS), as well as Volume 9a. Consequently, the bar was raised for the calibre of work, and quality management expectations increased. We saw more focused teams dedicated to signal detection and risk management, and specialized teams emerged to manage growing business system needs as the regulatory requirements led to increasingly complex systems. Dedicated vendor oversight teams were also required as companies began to work offshore with vendors.

Over the last decade, good pharmacovigilance practices (GVP) were introduced in the European Union (EU). The Qualified Person for Pharmacovigilance (QPPV) is not a new requirement, but it became clear that this person needs a whole team around them to support them and help shoulder the workload.

Offshore work has grown in magnitude, partnerships between companies have become an integral part of how business is done, and next generation technology is rolling out to improve efficiency and consistency. Safety systems have become truly global, enabling a scalable end-to-end safety process within a single system.

Figure: Illustrated examples of the way the world of PV has evolved.

What will happen with the advent of next-generation technology?

Big changes are coming with PV technology, which will drive major shifts in the way we think about how PV work gets done. We have seen evolution in PV technology before, but this time around it seems it will be more impactful than anything from the past 20 years.

With the advent of next-generation technology, new hard skills will be required, such as an understanding of machine learning, natural language processing and artificial intelligence. Organizations need to be able to manage transformation of the PV business effectively and regularly, and to leverage advanced analytical tools to derive meaningful insights from various data sets. Additional ‘soft’ skills will also be needed, such as adaptability, flexibility and open-mindedness, as well as the ability to ‘think outside the box’ to drive improvements through innovative thinking.

New roles within the organisation will emerge, with specific roles dedicated to:

Meanwhile, other roles will fade out and teams of people (in-house or outsourced) performing transactional activities will become a thing of the past.

What does it mean to be future-ready in pharmacovigilance?

From a process perspective – Processes must be highly scalable to accommodate growth in volume and complexity, and a blend of proven and cutting-edge technology is needed to support and enable this. A future-ready process has metrics to enable continuous improvement; it can efficiently evolve and adapt to simultaneously accommodate new regulations, innovative products and evolving stakeholder expectations.

From a technology perspective – Highly agile, flexible and robust, technology needs to be business-led with strong IS support and should be woven into an organisation’s processes, not vice versa.

From a people perspective – People in the organisation must accept increasing automation of processes – you can have the best technology in the world, but if the people in the team are rejecting it, it is not going to be successful. Well-managed resource models are also hugely important.  The organisational structure must be designed around the business’ needs, not vice versa. Employees should offer more than one skillset and in return, they must have a pathway to develop professionally. It is critical that a team can approach things from different angles and can adapt to change – these days excelling in just one area is often not enough.

The challenges of implementing ePRO – part one?

Benefits of Implementing ePRO 

When it comes to documenting the advantages of using ePRO over paper in clinical trials, the benefits are clear. 

With all the advantages to using ePRO over paper it seems to be a no-brainer to use ePRO whenever possible. However, it’s important to be mindful of certain considerations and challenges that come with the implementation of ePRO within your organization before jumping in.   

Challenges and Considerations 

Historically, the implementation of electronic clinical systems in general has been challenging. In the majority of cases it requires moving from a paper-based process to an electronic system in an environment where the reliance has always been on paper, hindering the adoption of computer systems that are seen as alien. Taking EDC as an example, in an international survey 46% of respondents identified inertia or concern about changing the current process, and 40% identified resistance from investigative sites, as the major causes of adoption delays [2].

ePRO is not immune to these challenges. In fact, it could be argued that ePRO is even more susceptible. While ePRO suffers from the traditional technical issues and user acceptance problems that EDC experiences, ePRO is also placed in the hands of potentially thousands of study participants, many of whom may have little technical understanding. Additionally, ePRO relies on hardware (a mobile device or tablet), cell network or WiFi connectivity, translation into the participant’s local language, multiple user bases (study teams, investigators and participants) and local helpdesk support, each of which comes with its own set of challenges and associated costs, and few, if any, of which are encountered with EDC.

ePRO is one of the few electronic systems that directly collects source data and, as a result, comes under increased scrutiny from a data integrity and quality perspective, especially when used for primary or secondary endpoint data collection. The system must always be available so that subjects can be activated on ePRO devices. If a participant leaves a clinical site without an active device, this can result in missed data, which can be construed as a serious quality issue and may even put subject safety at risk.

Before I go into the more detailed challenges associated with ePRO, let’s first consider the financial costs. 

Cost   

On the surface, it would appear that implementing ePRO is significantly more costly than paper. The expense of the devices, associated logistics and data usage (monthly SIM costs), the licenses, helpdesk and translations all contribute to costs that range from hundreds of thousands of dollars to multi-million-dollar contracts per study.

When making a business case for ePRO it is important to take into account the hidden costs associated with paper in order to compare the two.  

  1. The additional eCRFs that must be created in your EDC to house the data transcribed from the paper PRO.
  2. The time spent by site staff transcribing data from the paper PRO into the EDC, and the associated monitoring required to source-data-verify the transcribed data; this activity requires onsite visits by the CRA.
  3. The data cleaning at the end of the study by the Data Management team, which takes time and effort, with multiple communications back and forth to the investigator. ePRO can significantly reduce the timeline associated with data cleaning, because these data are electronic source data.

When a full assessment is conducted, the gap between the cost of implementing ePRO and paper narrows significantly. ePRO vendors have produced worked examples in which paper diaries actually contribute more cost to a study budget than ePRO.

The business case for implementing ePRO should not be based solely on raw cost; that will likely result in failure to get agreement at the leadership level. You will find it easier to gain acceptance if you can show that ePRO costs are comparable to paper while also concentrating on the intangible benefits, as, in the case of ePRO, these are the real reasons for its consideration. Increasing the quality of your data collection gives more confidence in that data, which in turn reduces the likelihood of rejection when it is submitted to the regulators (predominantly for primary and secondary endpoint data). Receiving the data in real time and reducing the need for data cleaning can help get a product to market more quickly by shortening the timelines to close the study, which in turn results in cost avoidance.

Many ePRO vendors will provide a cost calculator: a spreadsheet where the sponsor can plug in parameters associated with their study to produce an estimate of costs before engaging the vendor. Only a small number of parameters are required for a good estimate, the most important being the length of the study in months and the number of participants. The length of the study drives the helpdesk, data usage and project management (PM) costs, whereas the number of participants drives the device, logistics and shipping costs. There are other costs associated with the configuration of the system, translation, shipping, number of sites, etc., but these are often negligible in comparison for larger studies.
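The shape of such a calculator can be sketched in a few lines of code. All rates below are hypothetical placeholders invented for illustration, not real vendor pricing; only the structure (study length drives recurring costs, participant count drives device costs) reflects the description above.

```python
# Minimal sketch of an ePRO cost estimate in the spirit of a vendor's
# cost calculator. Every rate here is a hypothetical placeholder.
import math

def estimate_epro_cost(study_months, participants,
                       helpdesk_per_month=4_000,
                       data_usage_per_month=1_500,
                       pm_per_month=8_000,
                       device_cost=300,
                       logistics_per_device=60,
                       overage_rate=0.15):
    """Study length drives helpdesk, data usage and PM costs;
    participant count drives device and logistics costs."""
    duration_costs = study_months * (helpdesk_per_month
                                     + data_usage_per_month
                                     + pm_per_month)
    # Purchase a sensible overage of devices so sites never run out
    devices_needed = math.ceil(participants * (1 + overage_rate))
    device_costs = devices_needed * (device_cost + logistics_per_device)
    return duration_costs + device_costs

# A hypothetical 24-month study with 500 participants
print(estimate_epro_cost(24, 500))
```

Plugging in different study lengths and enrolment figures quickly shows which lever dominates the budget for a given study design.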

In summary, it is important to build a business case for ePRO within your organization in order to gain acceptance at leadership level. The business case should include areas of efficiency over paper, together with examples of ePRO costs using the cost calculators provided by the vendors, as well as emphasizing the other benefits of ePRO, such as subject safety, compared to paper solutions.

Software 

In the past, ePRO implementations were customized pretty much from the ground up, with the study specifics coded into the vendor’s study builder toolkit. This meant a huge effort was required to validate the system to ensure errors and bugs were caught before studies went live. Inevitably, despite all this testing, some issues did make it through to the live study, causing frustration for participants, investigators and study teams.

Over the past decade the systems have become more sophisticated. Less code is required during the implementation phase, having been replaced with configuration. Vendors have also introduced library functionality which allows sponsors to define questionnaires up front that can be reused across studies. As the questionnaire is not rebuilt every time, there is less opportunity to introduce errors. Additionally, reusing questionnaires from a library results in less work by the vendor per study and less validation on behalf of the sponsor, and can reduce the time and costs during the implementation phase.

It may also be possible to standardize other areas of functionality, perhaps the workflow as to when questionnaires are made available to the participants, or the alerting system, or the visit schedule. It may not be possible to standardize across therapeutic areas, but within a therapeutic area where multiple studies collect the same data this approach can result in substantially reduced timelines during the implementation phase, while reducing the risk of software errors on the studies.     

Hardware (mobile device or tablet) 

ePRO can be implemented in a number of different modalities. In this blog, we are concentrating on provisioned devices, which are provided by the ePRO vendor at a cost to the study sponsor, and “bring your own device” (BYOD), where a subject’s own device is used as the ePRO instrument. It should be noted that all studies require provisioned devices to a certain degree, to cater for cases where a subject does not own a compatible mobile device or does not own a mobile device at all.

When provisioning devices, ePRO vendors are responsible for the associated logistics such as software installation and shipping. Vendors are generally very knowledgeable when it comes to the customs regulations in many countries, including the average timelines required to get a shipment to a site. 

In scenarios where competitive recruitment between sites is employed, it is particularly important to plan ahead. As it may not be possible to predict the number of subjects that will be recruited at a specific site, and therefore the number of devices required there, it is necessary to purchase a sensible overage of provisioned devices. Although costly, this ensures sites will not run out of devices.

With an increasing proportion of the world’s population now owning smartphones, BYOD was seen as the natural progression for ePRO. It reduces the costs and burden of acquiring devices and the associated logistics, and also reduces the monthly costs for data usage. These costs do not completely disappear, as a certain level of provisioning is required for those cases where participants don’t own a compatible smartphone. BYOD also reduces the risks associated with not having enough devices on site, especially, as mentioned above, with competitive recruitment. However, BYOD does come with its own set of unique challenges, mainly associated with data integrity and privacy. Some considerations might be:

  1. How do you ensure equivalence between participants’ devices, to guarantee the questionnaire is displayed to each participant without introducing bias?
  2. As participants’ devices are not locked down, participants can turn off alarms/alerts, change the date and time, etc. How can this be managed?
  3. Who pays for the data usage on the participant’s SIM card?
  4. Participants are not required to carry multiple devices (a study device and their own smartphone); this is seen as a benefit and can increase compliance further.
  5. As the participant uses their own personal smartphone, which is not dedicated to the study, they may have concerns over the security of their non-study-related private information.

There are clearly a lot of benefits to using BYOD over provisioned devices, and more and more sponsors feel comfortable moving into this space; however, it is important to consider the implications before doing so.

In the next instalment of this blog, we will discuss some other challenges of implementing ePRO in your organisation, such as connectivity, translations, and end user acceptance testing. Keep an eye on our Opinion page for part two of the series, coming soon. 

References 

[1] ‘Guidance for Industry – Electronic Source Data in Clinical Investigations’, FDA, 2013. http://www.fda.gov/downloads/drugs/guidancecomplianceregulatoryinformation/guidances/ucm328691.pdf

[2] Welker JA. ‘Implementation of electronic data capture systems: barriers and solutions.’ Contemp Clin Trials. 2007 May;28(3):329-36. doi: 10.1016/j.cct.2007.01.001. Epub 2007 Jan 11. PMID: 17287151.

Planning for successful User Acceptance Testing in a lab or clinical setting?

What is User Acceptance Testing?

User Acceptance Testing (UAT) is one of the latter stages of a software implementation project. UAT fits in the project timeline between the completion of configuration / customisation of the system and go live. Within a regulated lab or clinical setting UAT can be informal testing prior to validation, or more often forms the Performance Qualification (PQ).

Whether UAT is performed in a non-regulated or regulated environment it is important to note that UAT exists to ensure that business processes are correctly reflected within the software. In short, does the new software function correctly for your ways of working?

Identifying and managing your requirements

You would never go into any project without clear objectives, and software implementations are no exception. It is important to understand exactly how you need software workflows and processes to operate.

To clarify your needs, it is essential to have a set of requirements outlining the intended outcomes of the processes. How do you want each workflow to perform? How will you use this system? What functionality do you need and how do you need the results presented? These are all questions that must be considered before going ahead with a software implementation project.

Creating detailed requirements will highlight areas of the business processes that will need to be tested within the software by the team leading the User Acceptance Testing.

Requirements, like the applications they describe, have a lifecycle and they are normally defined early in the purchase phase of a project. These ‘pre-purchase’ requirements will be product independent and will evolve multiple times as the application is selected, and implementation decisions are made.

While it is good practice to constantly revise the requirements list as the project proceeds, it is often the case that they are not well maintained. This can be due to a variety of reasons, but regardless of the reason you should ensure the system requirements are up to date before designing your plan for UAT.

Assessing your requirements

A common mistake for inexperienced testing teams is to test too many items or outcomes. It may seem like a good idea to test as much as possible, but this invariably means all requirements, from the critical to the inconsequential, are tested to the same low level.

Requirements are often prioritised during the product selection and implementation phases according to MoSCoW analysis. This divides requirements into Must-have, Should-have, Could-have and Won’t-have, and is a great tool for assessing requirements in these earlier phases.

During the UAT phase these classifications are less useful. For example, there may be requirements for a complex calculation within a LIMS, ELN or ePRO system. These calculations may be classified as ‘Could-have’, or low priority, because there are other options for performing them outside of the system. However, if these calculations are added to the system during implementation, they are most likely, due to their complexity, a high priority for testing.

To avoid this, the requirements, or more precisely their priorities, need to be re-assessed as part of the initial UAT phase.

A simple but effective way to set priority is to assess each requirement against the risk criteria and assign a testing score. The following criteria are often used together to assess risk:

Once the priority of the requirements has been classified the UAT team can then agree how to address the requirements in each category.

A low score could mean the requirement is not tested or included in a simple checklist.

A medium score could mean the requirement is included in a test script with several requirements.

A high score could mean the requirement is the subject of a dedicated test script.

Planning UAT

A key question often asked of our team is how many test scripts will be needed, and in what order they should be executed. These questions can be answered by creating a Critical Test Plan (CTP). The CTP approach requires that you first rise above the requirements and identify the key business workflows you are replicating in the system. For a LIMS these would include:

Sample Creation, Sample Receipt, Sample Prep, Testing, Result Review, Approval and Final Reporting.

Next the test titles required for each key workflow are added in a logical order to a CTP diagram, which assists in clarifying the relationship between each test. The CTP is also a great tool to communicate the planned testing and helps to visualise any workflows that may have been overlooked.

Now that the test titles have been decided upon, requirements can be assigned to a test title and we are ready to start authoring the scripts.
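The CTP described above can be sketched as a simple data structure: workflows in execution order, each with its test titles, and requirements assigned to those titles. The workflow names follow the LIMS example above; the test titles and requirement IDs are invented placeholders.

```python
# Sketch of a Critical Test Plan: key LIMS workflows mapped to test
# titles in execution order. Test titles and URS IDs are placeholders.

critical_test_plan = {
    "Sample Creation": ["TS-01 Register new sample"],
    "Sample Receipt":  ["TS-02 Receive and log sample"],
    "Sample Prep":     ["TS-03 Prepare sample for testing"],
    "Testing":         ["TS-04 Execute test method",
                        "TS-05 In-system calculation"],
    "Result Review":   ["TS-06 Review and verify results"],
    "Approval":        ["TS-07 Approve results"],
    "Final Reporting": ["TS-08 Generate final report"],
}

# Assign requirements to test titles (placeholder IDs)
assignments = {
    "TS-05 In-system calculation": ["URS-031", "URS-032"],
    "TS-07 Approve results":       ["URS-045"],
}

# The execution order falls straight out of the workflow order
for workflow, tests in critical_test_plan.items():
    for test in tests:
        print(workflow, "->", test)
```

Even a sketch like this makes gaps visible: a workflow with no test titles, or a test title with no requirements assigned, stands out immediately.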

Choosing the right test script format

There are several different approaches to test script formats. These range from simple checklists, through ‘objective-based’ scripts where an overview of the areas to test is given but not the specifics of how to test them, to very prescriptive step-by-step instruction-based scripts.

When testing a system within the regulated space, you generally have little choice but to use the step-by-step approach.

Test scripts containing step-by-step instructions should include a number of elements for each step:

A typical example is given below.

[Figure: example of a step-by-step test script]

However, when using the step-by-step format for test scripts, there are still pragmatic steps that can be taken to ensure efficient testing.

Data Setup – Often it is necessary to create system objects to test within a script. In an ELN this could be an experiment, reagent or instrument; in ePRO, a subject or site. If you are not directly testing the creation of these system objects in the test script, their creation should be detailed in a separate data setup section outside of the step-by-step instructions. This saves time during script writing, and any mistakes made in the data setup will not be classified as script errors and can be quickly corrected without impacting test execution.

Low Risk Requirements – If you have decided to test low risk requirements, then consider the most appropriate way to demonstrate that they are functioning correctly. A method we have used successfully is to add low risk requirements to a table outside of the step-by-step instructions. The table acts as a checklist, with script executors marking off each requirement they see working correctly while executing the main body of step-by-step instructions. This avoids adding the low risk requirements into the main body of the test script but still ensures they are tested.

Test Script Length – A common mistake made during script writing is to make scripts too long. If a step fails while executing a script, one of the resulting actions could be to re-run the script. This is onerous enough when you are on page 14 of a 15-page script; it is significantly more time-consuming if you are on page 99 of 100. While there is no hard and fast rule on the number of steps or pages to have within a script, it is best to keep them to a reasonable length. An alternative way to deal with longer scripts is to separate them into sections, which allows the option of restarting the current block of instructions within a script instead of the whole script.

Are all the requirements covered?

An important task when co-ordinating UAT is to be fully transparent about which requirements are to be tested and in which scripts. We recommend adding this detail against each requirement in the User Requirements Specification (URS). This appended URS is often referred to as a Requirements Trace Matrix. For additional clarity, we normally add a section to each test script that details all the requirements tested in the script, as well as adding the individual requirement identifiers to the steps in the scripts that test them.
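The coverage check behind a Requirements Trace Matrix can be expressed in a few lines: given the URS and the requirements each script claims to cover, flag anything untested. The requirement and script identifiers below are placeholders for illustration.

```python
# Sketch of a Requirements Trace Matrix coverage check: flag any URS
# requirement not traced to at least one test script. IDs are placeholders.

urs = ["URS-001", "URS-002", "URS-003", "URS-004"]

# Which requirements each test script covers
script_coverage = {
    "TS-01": ["URS-001", "URS-002"],
    "TS-02": ["URS-003"],
}

tested = {req for reqs in script_coverage.values() for req in reqs}
untested = [req for req in urs if req not in tested]

print(untested)  # requirements with no traced test script
```

Running a check like this before formal execution catches coverage gaps while they are still cheap to fix.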

What comes next?

UAT is an essential phase in implementing new software, and for inexperienced users it can become time-consuming and difficult to progress. However, following the above steps from our team of experts will assist in authoring appropriate test scripts and lead to the overall success of a UAT project. In a future blog we will look at dry running scripts and formal test execution, so keep an eye on our Opinion page for further updates.

Industry leader interviews – Ajit Nagral. An insight into the world of scientific entrepreneurship?

Ajit, please introduce yourself. 

My career history is uncomplicated because straight out of college I started working for myself, and it has been that way for my entire career. I graduated in the US with a background in computing science and I decided I did not want to take the traditional path of finding a job and building a career that way. I always liked to be doing things differently. Coming out of college with very little money and a limited skillset, the reasonable thing to do was to get into software consulting because that did not require a whole lot of capital. Since then, I have founded four companies in the life science sector.

What led you to the science industry?

I ended up in the pharma and life science industry very early on, by chance. After I graduated, I was in Boston with database and computing skills, and I started a small consulting company called Megaware. I found out there was a large life science vendor in Massachusetts which had some opportunities around a new life science product they were building. After a lot of persistence, the CIO reluctantly gave me 15 minutes to speak with him. I told him about my background and what I was attempting to do; he said they didn’t have anything for me within his organisation, but he would connect me to his counterpart at their analytical instrument division. I then got a contract to help this analytical instrument company build a part of their (then) new product, the first database-driven instrument software, and that was my entry into the pharma world. That may seem simple, but it was down to good fortune and persistence more than anything else.

Can you give us a potted history of each of your companies? 

My first company was Megaware. Back in those days labs were making the transition from VAX/VMS-based systems to PC-based systems. Enterprises had huge investments in VAX/VMS systems and in HP printers. We produced a product that could print from a VAX/VMS system onto an HP printer. It seems straightforward (but it was not!) because you are printing to a Windows-based printer from a non-Windows system. We built a whole system to solve this problem. I ran and built Megaware for four to five years, and it was then sold to a boutique consulting company.

NuGenesis was my second company. We identified the issue that labs had several different instruments, from different vendors, but they did not talk to each other. You had large pharma companies printing reams of paper, spending millions of dollars on running labs across the globe, and eventually all of that intellectual property ended up on paper! In those times, when they submitted a new drug application, those applications ran to thousands of pieces of paper, which were carried to the regulatory agencies on trucks for review! It made no sense that something started out electronically and ended up on paper to be read by somebody manually. We were able to intercept print streams and capture a lot of information to make the data live. It is remarkable that 20 years later it is still being used; that says a lot about the value and sustainability of NuGenesis. I sold the business to Waters in 2004.

After I sold NuGenesis I was clear I wanted to stay in life sciences but do something different. I went back to many of my clients and asked what problems they were facing; that is when I landed in outsourcing in the area of drug safety, clinical and regulatory.

What was new and different was an area called pharmacovigilance. If you recall, there were a couple of landmark cases related to drugs on the market that had caused deaths. That is when the regulatory agencies realised they did not have a handle on adverse events. They approve drugs, the drugs come to market, and years later you start seeing adverse reactions that you did not see during the trial period. The regulatory agencies started mandating reporting of all adverse effects. With the visibility and potential liability, the biopharma industry sprang into action and the floodgates opened for drug safety outsourcing. This was when we launched Sciformix, a scientific knowledge-based outsourcing provider for the life science industry. In any given year, once we reached maturity, we were handling around one million cases. We did everything from cancer medication to sunscreen lotion. Our success at Sciformix was due to our ability to combine enough science with a very good process. Again, the company grew very rapidly, to 1,300–1,400 people globally, and it came to a point where it felt like the time was right to divest in 2018. Sciformix was acquired by LabCorp/Covance, a top-three CRO (and currently a leader in COVID-19 diagnostic testing).

What was the motivation behind the launch of Scitara? 

Having done tech, services and global delivery, I thought I should combine these skills and focus on my finale!

Our core team believes Scitara is more than just a business, it is a goal of ours to solve a major problem that still exists in the scientific laboratory: data connectivity. We are pioneering a new digital revolution when it comes to lab data connectivity. We have invented a platform called Scitara DLX (data lab exchange), and our goal with this platform is to connect your instrument, application, or anything else you use in the lab to our platform and we guarantee they can talk to each other.

Our goal is that science labs can log into any system that they are currently using and access data from any other system in the lab. We have a mantra of ‘no application or instrument left behind’. For us to achieve this goal we need cooperation from the industry, which is why I am calling this a finale. It will require all the connections we have made over the years and the reputation we have built to reach out to everyone in the ecosystem. Companies are making significant headway in their digital transformation initiatives, except they do not know how to get their lab data onto their digital platforms, and that is where we come in.

How did you find your entrepreneurial drive?

I am very driven to be independent. I am useless when it comes to working for someone else and fortunately, I have never had to. My personality drives me to try new things and dive into uncertainty and this has always pushed me into something completely new.

Building my companies is what motivates me; what excites me is building from the ground up. Each time, the building gets easier, but the expectations are higher. I do not build to divest – I build to create value, disrupt, and hopefully deliver a meaningful impact, and the rest takes care of itself.

If anyone comes to me for advice or mentoring, I ask them: why? Why do you want to do it? Why you? What is the motivation? That tells you a lot, very early on, about a person's chances of success. It does not guarantee success, but a good understanding of the 'why' takes you a long way. Beyond that, I'd say it is important to find a mentor from the industry – people need to recognise that investments happen in teams, not necessarily ideas. Do not latch onto an idea too much, because things can change.

Create a loyal fanbase. People often think I have 500+ clients, but it is not the number that counts; it is whether you have a handful of loyal clients who make a lot of noise and reopen doors. That becomes exceedingly important.

What would you say makes you a successful entrepreneur? 

We do not rely on big sales engines in our industry; it is about building solid connections and networks. When clients learn that I created the concept behind several successful companies, they admire that. There is no better way to connect with a client than through something they are fond of and I am proud of.

I have learnt the hard way that you build the best partnerships in tough times. When things do not go right, how you react defines not only the relationship but your career as an entrepreneur. I have sold to the same clients across multiple companies. With most of those clients I have had difficult moments, and that has made our relationships much more resilient.

Having a non-scientific viewpoint has also really helped, particularly when it comes to products. To be able to look at the consumer world, or industrial world or finance world and understand how technology has evolved there and bring those learnings into the scientific world is invaluable.

What does the future hold for Ajit Nagral? 

This is the first time, after having done this for 20+ years, that I have the liberty and luxury to ask: if this part of my journey were to end, what would my new journey be? It is the first time I have thought about it, and I think that comes with experience and the safety net I have built for myself and my family. I am eternally grateful to my customers, employees and investors for putting me in this position.

Hopefully Scitara is my last company, as an operating founder. There are many other things I want to do. In addition to being a tech guy I am also a musician. There are things I am doing in music production that I have started already – hopefully in a few years once I am done with Scitara, that is where I will end up!

Industry leader interviews – Andrew Miles. The role of pre-sales demos in successful vendor selection events?

Andrew, please introduce yourself.

I am Andrew Miles and I have worked with commercial vendors in informatics pre-sales for over 20 years. I originally trained for seven years as a biomedical scientist in haematology and blood transfusion science, however, I made the decision to move into an IT role, and enjoyed my responsibilities implementing IT systems into the lab.

Tell us more about your role in the lab software sector

I was in my first IT role for seven years, training scientists on how to roll out lab software. As my role progressed, I started to work on pre-sales, and I really enjoyed this aspect of the job. I then started looking for a pre-sales-focused role and landed a position at LabVantage Solutions in 2000, which was a relatively small vendor at the time. Most of my experience comes from there, as it was a hugely diverse role working with customers from every area of lab testing imaginable.

In 2016 I moved to the healthcare sector, working on diagnostics solutions at InterSystems, before making the decision to retire in April 2020.

Do all Lab Informatics vendors have pre-sales specialists?

I would say the majority of vendors do have specialists in this field. You can split the sector into big industry players which offer a broad range of solutions, and then smaller companies which may offer simpler or more niche solutions, or only operate in one field. These big vendors would certainly have a dedicated team, and this team will be made up of individuals (like myself) with lab experience, as an understanding of how labs operate is vital.

In addition to pre-sales and sales, some labs involve consultancies like Scimcon to help lead their projects and advise. This is how I met and built a relationship with the Scimcon team – Scimcon was consulting on a project with one of our prospective clients and was impressed with our demo, and we got to know each other quite well.

How long is the process?

In terms of timing, it really varies. I worked on one project which ran for about five to six weeks in total. I have also worked on a project which spanned over 18 months until it was completed. It depends on the complexity of the project being undertaken, and the size. A small project in the UK covering only one or two labs will take much less time than a global project encompassing multiple sites.

If a pre-sales demo is successful, there is an official handover from the pre-sales and sales teams to the implementation teams once the contract has been signed. However, it is not a clean-cut finish point, and pre-sales and sales normally remain involved in some way, as the implementation teams will have questions as the project moves forward. For example, if we had offered bespoke functionality, we would need to be held accountable and help the implementation teams understand what was offered and why. This means that in some cases we would be involved in the process for much longer.

How are customer expectations managed?

When it comes to customers, it is important to remember that labs have a clearer idea of where they are heading in the future than software vendors do, so it’s about doing your research and making sure you understand your customer as well as they understand what they need to achieve. It is difficult to create a product which ticks every single box, but most vendors are now aiming to create a product that works straight out of the box and can be configured to meet requirements. This is particularly helpful in demos, as you can offer a wide range of features within the product as it comes, but you can still offer configuration if a particular feature is needed.

What best practice have you observed in how potential clients prepare internally and brief vendors during selection processes?

I think it is particularly important that multiple people from the lab are involved in the demo. Not just the decision makers, but the lab technicians, the managers and IT departments who can understand the tech. Everyone who will be impacted by new software and products at all levels should be involved in the process, and in the past, we have found it really unusual to give a demo to only one or two people.

It also helps when clients have a clear idea of what they are looking for, and what they want to get out of their product. With this kind of information, it makes it much easier for pre-sales teams to advise further and offer a solution that is most appropriate for the client’s need. However, at the same time, a client does need to be prepared to compromise. It is difficult to create a one-size-fits-all product, and pre-sales teams have to be prepared to walk away if they are unable to meet the customers’ requirements without creating a hugely complex product. If a client is willing to negotiate, and is happy to cover additional costs, then technology vendors could then consider a more bespoke approach and solution.

How can you tell when a demo has gone well?

As I said before, a demo should be reaching the whole room, not just the decision makers. A sale is obviously a clear sign you have presented a successful demo, but you can normally grasp the general feeling in the room about halfway through. You will find that people start leaning forward and engaging more through questions and comments, which shows you have piqued their interest. It also helps if you build allies in the room, as you will find they are able to counter any negative opinions, and interest in the product spreads around the room.

Were there ever any times when you believe a demo could have gone better?

Of course, we have not always got it right, and those are the times that taught us how important market knowledge is in securing a prospective client. One instance I recall was when a member of the sales team and I were asked to do a last-minute demo, and my colleague said the client was in the mineral mining industry. The first thing the coordinator asked us during the demo was whether we had any experience with customers in the prospecting and diamond mining industry, which we did not. We managed to recover with some relevant content, but even so, it is a situation we would definitely want to avoid. If you do not show relevance or credibility as a vendor to prospective clients, you really set yourself up for a fall.

What is the outlook for the Lab Informatics industry, where do you see it going?

I have already seen a lot of change in my 20 years in pre-sales. Technological changes are definitely the biggest area, but more recently we’re seeing surges in specific scientific areas such as genomics and the use of biorepositories for specimen storage, donor consent and retrieval.

As technology has advanced, we are also seeing more intelligent technology come into the field. As artificial intelligence algorithms are programmed into systems such as LIMS, we can expect another surge towards extended automation, which will save both vendors and clients money in the long term.

Do you believe the current pandemic situation will have lasting effect on technology vendors?

I do not believe the current situation will have a long-lasting impact on vendors. In the short term we may see issues with the implementation of systems, as current restrictions only allow necessary work to be carried out while adhering to social distancing measures. In the long term, implementation activities will return to normal, and there may be additional opportunities for vendors to provide data handling systems, as some labs have geared up specifically for COVID-19 testing. I am confident that the LIMS market will eventually return to something like normal.

Laboratory IS strategies promise a step change?

Do you need improved laboratory informatics systems?

Systems such as LIMS, ELN, LES, SDMS and CDS can contribute significantly to increasing efficiency in the lab, releasing time for core science activities and enabling a wider audience to utilise valuable scientific data.

These systems, of course, come in a variety of formats suited for different industries and processes, from the earliest stages of cutting-edge research to the defined workflows in QC laboratories.

There is a myriad of drivers for adding or upgrading such systems; however, these drivers are nearly always linked to delivering faster, more accurate decisions made with increased confidence.


Where should we invest next?

With a plethora of informatics systems all competing for your organisation's attention, deciding your next move can be complex. The existing systems landscape within a laboratory adds to this conundrum, as few labs are greenfield these days. Decisions on information systems made several years ago contribute to a systems entanglement that influences today's direction.

Developing a laboratory information system strategy that defines the desired target state of systems can vastly assist in making decisions in the short and medium term that deliver real impact without tying your hands in the longer term.

A great example of long-term goals affecting current decisions is a project we worked on twenty years ago. We worked with a Fast-Moving Consumer Goods (FMCG) company to implement a LIMS, configured to manage the day-to-day activities of new formulation research. A key requirement was to output the formulation candidate results to what would have been called a data warehouse back in the day. This 'pool' of successful and unsuccessful formulation data was to be used by statistics applications to predict interesting formulation areas, the intended future state being that this data analysis would reduce the number of formulations tested to produce a new product. Crucial in such a competitive space, this would ultimately reduce time to market. Interestingly, over the years the LIMS used by this global company has changed several times, but the project to predict formulations is still a keystone of their information systems strategy.

This example was not chosen at random: the challenge of balancing the provision of the correct tools to support 'bench science' with enabling the repurposing of data for 'desk-based science' is of great significance to the life sciences industry.

Laboratory Information Systems Strategies

Information system strategies start by clearly defining the laboratory objectives required to deliver the organisation's overall business plan. Investigation workshops can then be structured with a cross-section of laboratory personnel, centred on both the laboratory objectives and the requirements of current and potential systems. Systems currently in place should also be evaluated, typically by assessing their Strengths, Weaknesses, Opportunities and Threats (SWOT).

The information gathered in the discovery phase can then be used to document the current state, develop the future state and importantly create a prioritised blueprint of how to move to the future state.

The diagram shows a typical laboratory information systems landscape.


The diagram is of course generic and simplistic; some systems will be relevant to some organisations and not to others. Even with a well-defined, high-level systems plan, there are still many details to resolve: which processes and data do we include in the ELN versus the LIMS? Is one vendor for multiple systems better than multiple best-of-breed vendors? And how master data are created and shared can derail the strategy if not approached with care.

Maintaining Relevance

Keeping the IS strategy alive and aligned to the organisation’s plans is critical to achieving long term benefits.

As the picture of the new landscape starts to become clearer, now is the time to consider the governance and maintenance of the IS strategy.

  • What will the composition of the steering group be?
  • How are new projects formalised and submitted to the steering group?
  • What size / scope of projects fall under the remit of the steering group?

In addition to ongoing governance, it is important to keep a watching brief on new technology. I recently heard an industry insider decry the future of ELN and LES: 'they will wane in popularity and disappear'. While I don't think this will happen, I can imagine a future where ELNs and LES are driven by voice commands and VR tools, such as Microsoft HoloLens, which could replace existing keyboard, mouse and monitor interfaces.

Another good example is the use of vendor-neutral instrument data formats. These have been touted as the future of archiving and data reuse for a considerable number of years. However, with the ever-increasing emphasis on 'desk-based science' and the growing interest in AI and ML, this could be the vendor-neutral data format's time in the spotlight.

Final Thoughts

IS strategies provide a structured approach to IS projects, focus your budget on achievable business objectives, and promise a step change in utility for scientists.

But until more companies start to adopt best practice approaches to IS strategy, they will struggle to get the most out of their investments, not to mention the knowledge, data and resources trapped in their organisations.

Digital transformation: Revolutionising the labs of the future?

Scimcon has worked with many lab-based clients throughout our 20 years in the industry, across a vast range of projects. Here we discuss the current challenges that labs are facing in 2020, and the work that needs to be done through digital transformation to ensure that labs in the future can streamline and manage their data.


The limitations of the current laboratory information systems landscape

Today's labs are facing challenges similar to those of camera companies. Camera manufacturers such as Nikon and Canon must now sell to a new generation of budding photographers, most of whom have grown up with increasingly high-quality smartphone cameras. Having grown accustomed to technology designed for ease of use, this generation of users finds itself progressively more frustrated with the traditional technology and methods required to operate today's 'real' cameras. Where smartphones offer instant uploads to online services, amazing results that leverage computational photography, and synchronicity between multiple devices, traditional cameras appear complicated, difficult to control and impractical. Camera companies therefore face the challenge of building usability, such as that found in smartphone cameras, into their existing products; otherwise they risk losing a whole demographic of potential customers.

Modern labs face a similar problem. As new generations of scientists join laboratory settings, many find the lack of synchronicity and usability in information management systems increasingly frustrating. Why can't we check instruments remotely whenever we want? Why can't data be easily transferred between devices or colleagues? Why isn't all this information seamless? Limitations such as these can be hugely time-consuming, as well as resulting in reduced productivity and security risks for data with minimal protection. Like the camera makers, we risk losing the best new talent to other areas of science. Digital transformation addresses these challenges head on, with the potential to make your lab more intuitive and efficient.

What is digital transformation and how can it enhance your current lab setup?

Digital transformation involves the integration of new technology and methods into existing lab technology. Although this advancement is a relatively new development within the laboratory setting, lab managers have been quick to realise that digital transformation is essential to optimising workflows and productivity. In 2018, 70% of labs were reported to have a digital lab strategy in place or to be working towards one¹ – a number that we can only expect to have increased significantly since then.

Significant effort has taken place in laboratories over the past two decades or more, which has delivered substantial benefits. This effort has been focused on the key lab workflows and the matching informatics systems such as CDS, LIMS, ELN, LES and SDMS, to mention a few. The next decade needs to build on this success to create a true digital laboratory.

Digital labs of the future: what can we expect?

Digital lab transformation is more than just implementing informatics systems; it involves taking these systems and pushing them a step further. For example, a lab could connect instruments bi-directionally to LIMS or ELN, but digital lab transformation would also facilitate online monitoring of instrument status, automatic ordering of consumables, reserving instrument time, auto-tracking utilisation and the use of telemetry data to predict faults before they happen.
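
As a rough illustration of the telemetry-driven monitoring idea above, the sketch below flags instruments for preventative attention when a metric drifts past a limit. This is a hypothetical example: the instrument names, field names and thresholds are invented for illustration and are not taken from any real LIMS or instrument API.

```python
from dataclasses import dataclass

# Hypothetical telemetry reading from a lab instrument; the field names
# are illustrative only, not a real instrument's data model.
@dataclass
class TelemetryReading:
    instrument_id: str
    pump_pressure_bar: float   # e.g. HPLC pump back-pressure
    lamp_hours: float          # detector lamp usage to date

# Invented maintenance thresholds for the purposes of the sketch.
MAX_PRESSURE_BAR = 400.0
MAX_LAMP_HOURS = 2000.0

def maintenance_alerts(readings):
    """Return (instrument_id, reason) pairs for instruments needing attention."""
    alerts = []
    for r in readings:
        if r.pump_pressure_bar > MAX_PRESSURE_BAR:
            alerts.append((r.instrument_id, "pump pressure above limit"))
        if r.lamp_hours > MAX_LAMP_HOURS:
            alerts.append((r.instrument_id, "detector lamp nearing end of life"))
    return alerts

readings = [
    TelemetryReading("HPLC-01", 385.0, 2150.0),
    TelemetryReading("HPLC-02", 412.5, 900.0),
]
print(maintenance_alerts(readings))
```

In a real digitally transformed lab this kind of check would run continuously against live telemetry feeds, feeding alerts into the same systems that reserve instrument time and order consumables.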

A digital lab may also utilise a feature rich LIMS, ELN or LES that enables collation and review of all results for an experiment, but a digitally transformed lab would also be able to collate results across potentially several LIMS and ELNs throughout an organisation. This would allow the promotion of internal and external collaborations, enabling the ‘science later’ paradigm of cross team, cross technique and cross experiment data mining. This, in turn, will progress artificial intelligence and machine learning.  

Overall, a digital transformation is more than just providing scientists with the means to spend more time on actual science. It provides the complete toolset of a lab wherever a scientist may be, whether that is in the lab itself, in an off-site office, in a café or even at the kitchen table.

At present, even top laboratories face problems with a lack of modernisation, and this is a problem that is slowly trickling down to smaller labs that are starting to face similar challenges. If we continue to drive forward with the help of innovative technology, we could expect to see many labs becoming more efficient, more supportive of science and more reliable than ever before.

However, to do this, it is up to laboratory leaders to have a clear vision of where they see their lab going. It is hard to transform any business by only doing little bits, so it is up to the higher levels of lab personnel to decide what steps to take to ensure that their labs are working at optimal capacity and potential. This is where Scimcon can help.

How can Scimcon help to revolutionise your lab?

Scimcon is proud to offer a range of digital lab services to assist in digitising a lab, many of which are outlined in our introductory blog. We are also able to help labs go that step further, with our collective wealth of experience in the lab, both as scientists and project leaders. Whether it is the development of the strategy, the running of the programme, or providing resources and leadership for your projects, Scimcon can help you understand what you want to achieve, and how to reach it.

To find out more about types of projects we support, and how we can help you to transform your lab, get in touch.

Reference:

1 ‘Despite steady growth in digital transformation initiatives, companies face budget and buy-in challenges’, https://www.zdnet.com/article/survey-despite-steady-growth-in-digital-transformation-initiatives-companies-face-budget-and-buy-in/

ePRO works better in lockdown: How ePRO serves to keep clinical trials on track, from a distance?

Demonstrating the need for ePRO: COVID-19 makes the point

Outbreaks in recent years, such as SARS, avian flu, and Ebola, in retrospect seem to have been a testing ground for the current COVID-19 lockdown.

During those outbreaks, study sponsors experienced the challenges of managing clinical trials in traditional ways, and many therefore pushed forward their adoption of eClinical platforms to ensure they could still manage their trials in remote locations, while reducing the impact on their project timelines, ensuring their investigators and monitors remained safe, and enabling their subjects to demonstrate compliance.

Why ePRO works so well in lockdown  

The traditional process of conducting clinical trials involves face-to-face interaction with subjects, which is proving difficult in the current lockdown. Subjects still need to visit clinical study sites to meet healthcare professionals; however, with ePRO the need for lengthy reviews of their paper diaries is removed. By adopting ePRO for subject reporting, it is possible to significantly reduce close face-to-face interaction with subjects and speed up the collection of quality data. In addition, study sponsors benefit from a huge reduction in travel for onsite monitoring teams, as ePRO increases the ability to conduct remote study monitoring.

With ePRO, the reporting and audit trail is also improved, since it is possible to prove that patients respond daily in accordance with the study Protocol (not possible with manual records).

BYOD (bring your own device) is another consideration. Scimcon recognises that BYOD reduces the hardware challenges associated with shipping devices but, more importantly in the current situation, it removes the need for devices to be passed from human to human, both within vendors' logistics departments and between investigators and subjects (ePRO devices are often reused between subjects on an individual study). An upcoming post will cover the BYOD topic in more detail.

Moving from paper: better reporting and ALCOA principles

The regulators' requirements for Attributable, Legible, Contemporaneous, Original and Accurate (ALCOA) data are achievable with ePRO platforms, whereas it has always proved difficult to remove doubt with paper-based patient reported outcomes. In paper records, for instance, it is far more difficult to prove when information was recorded, whereas eClinical platforms automatically record when and by whom data is entered, giving more accurate insight. Legibility issues also become a thing of the past with ePRO.
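
The attributable and contemporaneous aspects of ALCOA can be illustrated with a minimal sketch: each diary entry automatically captures who recorded it and when, which is exactly what a paper diary cannot prove. This is an invented example for illustration only, not the data model of any real ePRO platform.

```python
from datetime import datetime, timezone

class DiaryAuditTrail:
    """Minimal sketch of an attributable, contemporaneous record store."""

    def __init__(self):
        self.entries = []

    def record(self, subject_id, user, response):
        # The timestamp and user identity are attached automatically at the
        # moment of entry, so 'when' and 'by whom' cannot be backfilled later.
        self.entries.append({
            "subject_id": subject_id,
            "entered_by": user,
            "entered_at_utc": datetime.now(timezone.utc).isoformat(),
            "response": response,
        })

trail = DiaryAuditTrail()
trail.record("SUBJ-001", "subj001-device", {"pain_score": 3})
print(trail.entries[0]["entered_by"], trail.entries[0]["response"])
```

A production ePRO system would of course add authentication, tamper-evident storage and full audit-trail reporting on top of this basic idea.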

The clinical trials community has been debating the benefits of ePRO for many years, and with the advent of COVID-19, where face-to-face interactions need to be limited, its adoption seems prescient. ePRO is the obvious choice for current and future trials, both for data quality and to reduce unnecessary social interactions. Scimcon has hands-on experience with global clinical trials projects and proven expertise with ePRO platforms, doing what we do best: serving the science community with skilled project teams that manage data projects globally.

Read more about ePRO:

https://www.ncbi.nlm.nih.gov/pubmed/25300613  and http://www.fda.gov/downloads/drugs/guidancecomplianceregulatoryinformation/guidances/ucm328691.pdf

What we have learned from decades of providing remote support to clients in life sciences?

At Scimcon our aim is to show our customers how best to manage ongoing projects successfully by employing a flexible approach to project definition and implementation.

Our teams work both on and off site, with our consultants spending much of their time at our clients' premises. Depending on the project, we can work anywhere between 20% and 100% on site; however, COVID-19 has compelled our team to embrace and adapt to remote working full time.
We are in a fortunate position, as a global project partner, in that the team is already accustomed to working remotely, so Scimcon is still able to move projects forward despite the current climate.

How has our usual Modus Operandi impacted the ability to navigate lockdowns and still deliver consultancy to our clients?

It is imperative for Scimcon to continue to provide a high-quality service without disrupting client projects or relationships. We have a global customer base, a global project team and customer sites all over the world with multiple projects in analytical and clinical applications on the go simultaneously. This means that we have needed to be flexible, adjustable and efficient for years, so we are in a fortunate position where our staff can continue to work in this new way with negligible disruption.

Because we work as an outsourced supplier on complex information management projects for pharma and biopharma, we are already well accustomed to having customer laptops with virtual private networks that connect through to the organisation, so we do not appear any different from anyone else. With this remote working already in place, Scimcon is not suffering from a great change, which puts us in a strong position to remain a constant despite the ever-changing environment.

We have adapted our conventional processes for the proportion of projects which would usually require our onsite support, ensuring the project is not impacted and the deliverables remain the same. The pandemic has certainly clarified what successful and sustainable remote working requires from our team, and we are ensuring we go the extra mile so our clients are still meeting their deadlines, despite the challenges we may face in the process.

How has Scimcon embraced and adapted to remote working since the Covid-19 pandemic?

We are still doing 99% of what we would originally have been doing; we just need to modify certain elements of our workflow to ensure deliverables remain consistent. The use of whiteboards in face-to-face meetings is a tool we have always used extensively for both discovery and brainstorming. We are actively seeking alternative routes to mirror this approach virtually so our project teams can continue to work at full capacity, and we are currently evaluating multiple graphics tablets, iPads and digitising camera setups to deliver this.

Things may have changed in the lab, but projects and deadlines don't, and this is where Scimcon's capabilities come to the fore. For example, one current client project involves a LIMS (laboratory information management system) validation in North America. The reality is that the customer needs to meet an organizational deadline to get the LIMS live and operational, and we cannot be on site to run tests. We have dedicated ourselves to this project to ensure it still moves forward and deadlines are met. We log samples in remotely from our system in the UK; our contact in the US then accesses the printout and sends a scan of the printed barcode back to us in the UK, which we test. This is a small example of a simple workaround showing how our process can still work even when we are not on site.

We have actually noticed that having people on the client side more available at this time means we are shortening turnaround times for project deliverables. We are receiving more feedback, more quickly, meaning more issues are being resolved and projects are moving forward much faster than before.

What are the challenges our clients have faced from remote working and how has Scimcon adapted to address these?

The key challenge for us and our customers is, of course, travel restrictions. These have prevented us from meeting face-to-face and visiting clients' premises, and there is a good chance the freedom to travel will not return to the way it was for a long time to come. Personal interaction is a key part of our business, as our personality as a company and our personal nature as individuals help us to drive project success. We understand that our clients may be less adapted to remote working than we are, especially in laboratories, so we collaborate to ensure our expertise in remote support fills this gap. We are remote, but we are always in contact.

We realise there is an environmental awareness and cost to meeting face-to-face, and we recognise as a company that we need to adapt the way we work, as the new reality probably will not be the same as the old one. We are making a conscious effort to engage with and help our customers at this difficult time, and we have found that using video cameras on web calls compensates for the loss of face-to-face interaction on both sides. Turning on the camera is surprisingly more effective than previously thought! Communication, including body language, tone and intonation, is essential in understanding issues and requirements, sensing when something is important, and knowing when to stop pushing a point. Video therefore helps us to maintain and nurture our client relationships – these tools fill the social-interaction gap, ensuring we can continue to cement and strengthen relationships with our customers.

How do we continue to build on client relationships despite the current climate?

When we are leading a project, we need to involve our customers as much as possible, as they understand how their lab works and can explain the details clearly. With the right people on the team we can build support and momentum, despite challenging circumstances. It is therefore our priority during this difficult time to communicate and sustain good working relationships with our clients to ensure we continue to deliver successful projects.
Together, we have learnt that a common struggle makes interactions easier. We are all riding the same wave, which means we can understand and empathise with one another. This shared, worldwide experience has helped to build, strengthen and sustain our client relationships.

Contact us to learn about our tips for successful project leadership of analytical and clinical information management

Welcome to Scimcon, the Scientific Information Management Consultancy

Digital Laboratory Transformation

With 91 million results in a Google search for the term ‘digital laboratory transformation’, this area seems to be the buzzword of the 2020s. Scimcon has long been a stalwart in this field, having provided global information management partnership to big pharma, biotech and clinical organizations for over two decades.

The combined experience of our team spans more than 200 years of hands-on project roles in life sciences! Whatever an Information Management project needs, Scimcon has delivered it: we have seen every type of project from every angle, giving us a unique perspective on how to ensure success.

And digital laboratory transformation is our lifeblood.

So with the establishment of our new website, we are launching a blog to address the subject of the digital lab and eClinical systems, and try to tackle some of the challenges, whilst also dispelling some of the myths.

Analytical projects, including Information System Strategies, LIMS, ELN & LES, SDMS, CDS, DMS, Stability Management and Instrument Integration, to mention a few 

Scimcon’s original pedigree lies in the field of Information Management projects in the analytical laboratory. The company and its project consultants have extensive experience in:

  • LIMS (Laboratory Information Management Systems)
  • ELNs (Electronic Lab Notebooks) & LES (Lab Execution Systems)
  • SDMS (Scientific Data Management Systems)
  • CDS (Chromatography Data Systems)
  • Integration of laboratory instruments
  • Document Management Systems
  • Sample and Freezer Management Systems
  • Biobank Management Systems
  • Stability Management Systems
  • Information System Strategies
  • Project leadership
  • Business analysis
  • Technical specialists
  • User requirements gathering
  • Technical Audits
  • Gap analysis
  • Project requirement scoping
  • Tender and RFP (Request for Proposal) management
  • Information systems implementation management
  • Systems validation
  • Vendor audits and vendor selection
  • Regulatory compliance including ISO 17025
  • E-Signatures and compliance with 21 CFR Part 11
  • Designing and delivering systems training

Clinical projects including eCOA, ePRO, eDiaries, Drug Safety Systems & EDC 

With such extensive knowledge of Information Management in laboratories and life sciences, Scimcon has more recently been invited to supply similar services to the clinical trials industry, especially in the move from paper to digital records. The development and adoption of new Drug Development Tools (DDTs) lies at the intersection of regulation, science, academia and pharmaceutical drug innovation. Paper is generally considered a poor format for patient compliance in diaries, and the adoption of electronic data capture improves both compliance and record-keeping in clinical trials. With this relevant experience, Scimcon has applied its successful project leadership to the areas of:

  • ePRO (electronic patient reported outcomes)
  • eCOA (Electronic Clinical Outcome Assessments)
  • eConsent
  • BYOD (Bring Your Own Device)
  • Electronic patient diaries
  • Drug Safety Systems
  • EDC (Electronic Data Capture)

Top tips for partnering on projects

Our blogs will help you to navigate the world of outsourcing project leadership for either your analytical or clinical Information Management project.

The first step is to recognize why partnering is helpful:

Successful projects happen when you can trust the partner who is helping you. You do not need to relinquish control of your Information Management projects; but to take them forward successfully, a trusted supplier who is vendor-neutral, and who is not afraid to challenge, can bring a great deal to the table.

In the largest and the smallest organization alike, knowledge of the organization rests with its employees. This leaves a natural gap: access to best practice, external and competitor knowledge, and learning from others.

Scimcon fills that gap. Our combination of deep hands-on experience, information systems skills, extensive knowledge of scientific systems and organizations, and a commitment to project success means we consistently bring projects over the line.

We know the shortcuts, we understand the limitations of vendor capabilities, we recognize the scope of your existing systems, and we know how to get the best from your teams. And our year-in, year-out experience as a dedicated project partner means that we are always working, always able to move projects forward, even when faced with organizational challenges.

Talk to us now, or contact us to discuss your projects, past and future alike.
