Smarter Data Analysis in Precision Medicine is a Bridge to Clinical Utility

Biocom

Dr. Corina Shtir, head of precision medicine for Thermo Fisher, recently presented to a group of San Diego-based executives at All About Us: Getting Personal with Precision Medicine & Oncology. She was one of the featured presenters during the one-day event put on by Biocom, a 1,000-member organization comprising research institutions and life science companies.

Dr. Shtir entitled her presentation “Data Integration and Utility for Precision Medicine,” and in it she described the opportunities and challenges of integrating massive amounts of precision medicine-related data.

Starting with the opportunities, Dr. Shtir described a historic convergence of technology, funding, data and clinical expertise that is now aligning to improve health, better diagnose and treat disease, and accelerate the discovery and use of new medicines. The key is “smart integration,” she said, where “multiomic data are combined with clinical observations to deliver true clinical utility.” To be feasible, however, newer approaches to diagnosis and therapy must be not only accurate but also scalable and cost-efficient.

For integration to be “smart,” we must get better at data analysis and interpretation. Dr. Shtir touted the importance of “optimization models” that yield more analytical insight without requiring more sample material, which is often infeasible to obtain. In other words, improving the ways we use data – instead of simply collecting more of it – will bring efficiencies that accelerate drug discovery and clinical trials, as well as support faster and more accurate diagnosis and disease management at the point of care.

Building on her call for optimization, Dr. Shtir suggested that enabling “volume, velocity and variety” is key to providing the right information about a patient at the right time. To illustrate her point, she described the Oncomine Dx Target Test, a qualitative in vitro diagnostic test that uses targeted high-throughput, parallel-sequencing technology to detect sequence variations in 23 genes. Today, a single run of the test can be used to recommend one of three treatments for non-small cell lung cancer, based on three known genetic variants. But the panel can detect 20 more gene variants that could someday correspond to new FDA-approved therapies, so stakeholders have an incentive to collaborate toward an end goal of higher volume, velocity and variety for this diagnostic tool.

The Oncomine Dx Target Test is an example of what’s possible when multiple stakeholders, including researchers, pharma, clinicians and device manufacturers, collaborate. Dr. Shtir asked the audience to consider all that’s required to get tests and therapies to the point of care, from analysis of petabytes of data generated across multiple lab sites to clinical validation and reimbursement. 

Dr. Shtir believes the creation of an optimal funnel, from “p-value to clinical utility,” is only possible if data are gathered smartly, standardized and consistently applied through the lens of clinical utility. And this optimization will become increasingly important as more data enter the picture from powerful technologies such as mass spectrometry and electron microscopy. If we continue to focus on smart data integration with an eye toward clinical utility, Dr. Shtir told the audience, we’ll not only accelerate precision medicine, but also be well on our way toward precision health.