Translational medicine is an area of life-sciences research that connects early-stage research and development with downstream clinical outcomes, so that the real-world effects of drugs and therapies on patients can be better understood. The field has been considered a ‘holy grail’ of life sciences for decades, because getting it right would make it far easier to move new medicines from the laboratory into clinical delivery.
Achieving this requires drawing on a plethora of data from different research and clinical areas. These data sources are typically spread across organizations, stored in incompatible formats and inconsistently labeled, yet translational medicine can only be effective when this wide variety of sources is brought together.
Traditionally, software development in clinical research has meant that large amounts of heterogeneous data had to be migrated, transformed and mapped into a common repository before complex searches and analytics could be performed.
The first stage of this data migration normally involves building a pipeline through which the data are extracted, transformed and loaded (ETL’d) into the new system, a process that can take months, or even years, to complete.
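To make the contrast concrete, the sketch below shows what a single, traditional ETL step might look like in Python. Every table name, column and unit conversion in it is hypothetical and stands in for the many hand-built mappings a real pipeline would need.

```python
import sqlite3

# Hypothetical ETL step: extract lab results from a source system, convert
# codes and units to a common model, and load them into a central repository.
# All table and column names here are illustrative, not from a real project.

def extract(source):
    """Pull raw rows from the source schema."""
    return source.execute(
        "SELECT subject_id, test_code, value_mg_dl FROM lab_results"
    ).fetchall()

def transform(rows):
    """Map source test codes onto a common vocabulary and convert
    glucose values from mg/dL to mmol/L."""
    code_map = {"GLU": "glucose", "CHOL": "cholesterol"}
    harmonized = []
    for subject_id, code, value in rows:
        analyte = code_map.get(code, code)
        value_si = value * 0.0555 if code == "GLU" else value
        harmonized.append((subject_id, analyte, value_si))
    return harmonized

def load(target, rows):
    """Insert harmonized rows into the central repository table."""
    target.executemany(
        "INSERT INTO harmonized_labs (subject_id, analyte, value_si) VALUES (?, ?, ?)",
        rows,
    )
    target.commit()

if __name__ == "__main__":
    # Toy in-memory source and target, standing in for real systems.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE lab_results (subject_id TEXT, test_code TEXT, value_mg_dl REAL)")
    src.executemany("INSERT INTO lab_results VALUES (?, ?, ?)",
                    [("P001", "GLU", 95.0), ("P001", "CHOL", 180.0)])
    dst = sqlite3.connect(":memory:")
    dst.execute("CREATE TABLE harmonized_labs (subject_id TEXT, analyte TEXT, value_si REAL)")
    load(dst, transform(extract(src)))
    print(dst.execute("SELECT * FROM harmonized_labs").fetchall())
```

Multiply that by dozens of sources, schemas and code systems, and the months-to-years timescale becomes easy to see.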
Once built, these pipelines are often rigid and brittle: they connect data in specific ways for specific purposes. When the research questions change, the ETL pipeline must be rebuilt or extended again and again. That makes it hard to maintain a flexible, ad hoc system that can keep pace with the scientific demands of translational medicine, where new topics and patterns of interest are constantly emerging as potential research avenues.
At LeapAnalysis (LA), we have built a new type of technology for use in translational medicine, one that allows for ad hoc queries and analytics to be run by scientists directly. Using LA, researchers can focus on medical outcomes rather than data wrangling. By removing the need for ETL pipelines and the need to move or copy the data, LA provides a straightforward means to access data directly from the source via intelligent data connectors.
These connectors are driven by semantic metadata (data models representing the most basic classes, attributes and relationships in the data) and machine learning (which scans, reads and presents the original data schemas directly to LA’s engine automatically).
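To illustrate the general idea behind metadata-driven virtual access (not LA’s actual implementation, which is proprietary), the Python sketch below shows how a semantic mapping can translate a shared concept into each source’s native schema at query time, so the data can be queried in place rather than copied. All class, field and source names here are invented for illustration.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class SemanticMapping:
    """Links a concept in a common data model to one source's native schema."""
    source_name: str
    concept_to_field: dict[str, str]              # e.g. {"Patient.age": "AGE_YRS"}
    fetch: Callable[[str], list[dict[str, Any]]]  # runs a native query in place

class VirtualConnector:
    """Answers model-level questions by querying each source directly."""
    def __init__(self, mappings: list[SemanticMapping]):
        self.mappings = mappings

    def query(self, concept: str) -> list[dict[str, Any]]:
        results = []
        for m in self.mappings:
            field = m.concept_to_field.get(concept)
            if field is None:
                continue  # this source does not carry the concept
            for row in m.fetch(field):
                # Present the source's native field under the shared concept name.
                results.append({"source": m.source_name, concept: row.get(field)})
        return results

# Toy usage: two "sources" with different schemas; nothing is copied or moved.
ehr = SemanticMapping(
    "ehr_db", {"Patient.age": "AGE_YRS"},
    fetch=lambda field: [{"AGE_YRS": 54}, {"AGE_YRS": 61}],
)
trial = SemanticMapping(
    "trial_csv", {"Patient.age": "age_at_enrollment"},
    fetch=lambda field: [{"age_at_enrollment": 47}],
)
print(VirtualConnector([ehr, trial]).query("Patient.age"))
```

In a real deployment the fetch step would issue a native query against each live source; the point of the sketch is simply that the mapping, not a copy of the data, is what gets maintained.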
Users connect directly to data sources through a cloud-based portal, regardless of where the data physically reside. LA provides near-immediate access to a wide variety of sources and formats, dramatically cutting the time required by search and analytics systems that depend on expensive data pipelines and complicated mapping strategies.
LA is changing the way that translational medicine gets done by:
- Decreasing the time needed to find and integrate data (weeks/months become minutes/seconds).
- Significantly increasing data availability, accuracy and overall quality through both semantic and statistical data enrichment.
- Improving search and analytics results by bringing together a wide range of data types (flat files, relational data, text data, image data, graph data, open-source data, etc.) whenever the user needs them.
- Cutting training time on new systems, since users can keep working in their existing business intelligence or analytics dashboards as well as LA’s own interfaces; LA acts as middleware and serves up the data on the backend of these tools.
LA recently worked with a large pharma customer to deploy its data solutions and successfully enhance the customer’s enterprise IT environment. LA delivered high-speed search and analytics even though the customer’s data remained physically separate, connected only virtually via metadata.
Want to hear directly from our customer and dive deeper into solutions that accelerate your ability to integrate data in translational medicine? Sign up to attend the live webinar, How pharmaceutical companies and biotechs can accelerate translational medicine with virtualized search and analytics, on July 15.
I will present a demo of LA’s technology alongside my co-host from a big pharma company to give you the chance to see the technology in action. We will demonstrate a new way to overcome the challenges surrounding translational medicine, and we look forward to welcoming you. Come and learn how “Data Stays Separated, But Answers Come Together” with LA.