
Crawl Before You Run: Steps to Increase in vivo Data Efficiency

Improving how researchers capture data and make decisions through adoption of digital technologies.
Life sciences companies have only in recent years begun to fully seize the opportunity to apply their data systematically to a wide range of drug development and patient care challenges. While they increasingly see the advantages of treating data as a strategic asset in clinical development, the one exception has been discovery, where many labs still rely on paper, spreadsheets, or legacy systems. This makes it difficult to leverage in vivo data across the entire R&D lifecycle and ultimately hinders the pace of decision-making needed to get therapies to market faster.

That data is also seen as a path to new innovations. But in order to apply advanced analytics and artificial intelligence (AI) to complex in vivo datasets, companies must first adopt modern data management practices; until they do, the notion of applying AI is premature. During a recent webinar hosted by RockStep, one of the panelists offered an apt analogy: “Before we ask labs to run, we need to help them crawl, then walk.” The webinar, titled Separating the Science from the Fiction: Bringing the Lab of the Future to In Vivo Studies, brought together leading experts. Joining me were Jason Davis of Charles River Laboratories, Discovery Division; Andrew Buchanan, PhD, FRSC, Principal Scientist at AstraZeneca; and Ian Levine of ModernVivo.

What Makes in vivo Data Different

What many of us in the in vivo space recognize is that the design, execution, and management of in vivo studies differ from the rest of the development space. While human clinical trials are highly structured, an in vivo study can be run in hundreds of different ways, and the resulting information is often highly unstructured, which makes it difficult to introduce AI tools or any type of advanced analytics.
As an example of how manual data practices remain today: several years ago, a respected scientific institute came to us after a flood in its vivarium, during which it had lost all of its data because the data lived on color-coded Post-it® notes ruined by the humidity from the flood. The lab wanted help preventing a recurrence and asked for IoT sensors that could predict whether a flood was likely to occur. What it really needed first was to store its data on something more reliable, so that IoT sensors could then provide those alerts. The first step, therefore, is to put the data into a digital format that can support the framework for technologies such as AI.


The Lab of the Future Should be the Lab of Today

While paper – and, remarkably, even Post-it notes – are still used, the in vivo space is starting to see more document automation as scientists seek to free up their creative space. “Some labs are moving toward generating structured machine-readable data, which requires the use of appropriate automation tools,” Buchanan said. “So, I think there’s agreement that AI is not some magical force … It’s just another tool – an impressive tool, but one that needs to be designed, developed, deployed, and reoptimized in a way that works for the particular use of a particular group.”
The next step, according to the panelists, will be to overlay IoT sensors that run machine learning algorithms, bringing in more predictive models that can support better research. The purpose of the technology should be to allow scientists and other end users to use what works best for them in a study. The best way to get labs away from archaic processes, such as spreadsheets or paper, is to provide usable solutions: understandably, there is very little tolerance for technology that is slower or less efficient than what people in the lab are currently using. The goal should always be to let scientists spend more time running experiments and less on inefficient and, frankly, unreliable processes.

Connecting the Dots

While the move to technology has to support the user, it is important that the in vivo space catch up with the rest of R&D. You would be hard-pressed to find many people on the clinical side still using paper, and that gap presents a challenge, because continuity of data through the pipeline is essential.
The good news is that we are starting to see the in vivo space adopt digital tools, or at least establish a framework to enable data to be captured in a way that it can be reliably reused. Ultimately, what matters is not the whizz-bang technology but understanding the issue that needs to be solved and solving that problem, with the support of technology.


