“The president gets up every morning and thinks about ISIS, the economy and the Precision Medicine Initiative,” said Kathy Hudson, deputy director for science, outreach and policy at the National Institutes of Health (NIH), during her keynote address at the Big Data in Biomedicine (BDBM) conference last month. Hudson also co-chairs the Precision Medicine Initiative, which was announced by President Obama himself in his State of the Union address in January of this year.
Precision medicine allows us to apply innovative technologies to understand the genetics, environment and lifestyle of an individual to deliver the right treatment to the right patient at the right time. One of the main aims of the Precision Medicine Initiative is to create a research cohort with biological, environmental and lifestyle data from at least one million U.S. volunteers. Needless to say, this is going to be very big data! How do we scale systems to meet the growing big data demands of precision medicine?
We’ll start with best practices from other industries. The banking industry is an important one, with stringent requirements for security and privacy. Take, for example, fraud alerts from your credit card company. If your credit card company notices any behavior that is abnormal, it will usually deny the transaction and/or contact you immediately to verify the validity of that transaction.
Dr. Euan Ashley, associate professor of medicine (cardiovascular) and of genetics, and co-director of Stanford’s Clinical Genomics Service, gave an example of this type of scenario during his talk at the BDBM conference. As Euan rightly said: “If my credit card company can monitor my behavior the whole time, and decide what is normal and not normal for me, then surely my doctor should be able to do that too.” What keeps us from transferring concepts from the banking industry into healthcare?
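To make the analogy concrete, here is a toy sketch of the underlying idea — flagging a new reading that deviates sharply from a patient’s own baseline, the way a card issuer flags an out-of-pattern transaction. This is purely illustrative (a simple z-score check on hypothetical heart-rate data), not a description of any vendor’s actual fraud or monitoring system:

```python
from statistics import mean, stdev

def is_abnormal(history, new_reading, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations
    away from this individual's own historical baseline."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_reading != mu
    return abs(new_reading - mu) / sigma > threshold

# Hypothetical resting heart-rate history for one patient
baseline = [62, 64, 61, 63, 65, 62, 64, 63]

print(is_abnormal(baseline, 63))   # -> False: within this patient's norm
print(is_abnormal(baseline, 110))  # -> True: flagged, like a fraud alert
```

Real clinical monitoring would of course need far richer models and context, but the core pattern — learn what is normal *for this person*, then alert on deviations — is the same one the banking example describes.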
With the increasing adoption of Electronic Health Records (EHR), we are finally getting to a state where health records are digitized. But how do we enable quick retrieval and analysis of all the health record data sitting in one hospital? How do we enable health record analyses across all hospitals in a network?
There are a number of challenges I haven’t even mentioned. For example, most health data is siloed, distributed across different systems. Most of this data is unstructured and carries serious security and privacy concerns. Regardless, there is a pressing need for high-performance, flexible data management systems.
The whole industry is moving toward the creation of a “learning healthcare system.” To me, a learning healthcare system can mean many different things. From a technology standpoint, it could refer to the use of a high-performance data management system to integrate and extract meaningful insights from electronic health records (and other patient-related datasets, e.g. from wearable tracking devices). Or, it could refer to the aggregation of such data combined with the use of cognitive computing to ultimately derive some treatment decision support, e.g. a ranked list of treatment recommendations based on the aggregated health data.
In the former approach, using a high-performance in-memory data management system enables real-time analyses of health data aggregated from different sources, including multiple EHR systems. This approach would allow clinicians, nurses and other healthcare professionals to come to their own conclusions (and potentially a treatment decision) by slicing and dicing the data in whatever way they choose. This could make the life of the healthcare professional easier.
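The “slicing and dicing” step can be sketched in a few lines. Below is a minimal, hypothetical example in plain Python standing in for an actual in-memory database such as SAP HANA: records from two hospital EHR systems are pooled in memory, then grouped and summarized on the fly. All record contents and field names here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical, simplified records from two EHR systems
hospital_a = [
    {"patient": "P1", "diagnosis": "T2 diabetes", "hba1c": 8.1},
    {"patient": "P2", "diagnosis": "hypertension", "hba1c": 5.4},
]
hospital_b = [
    {"patient": "P3", "diagnosis": "T2 diabetes", "hba1c": 7.2},
]

# Aggregate both sources into one in-memory collection
records = hospital_a + hospital_b

# Slice by diagnosis, then compute a per-group average lab value
by_diagnosis = defaultdict(list)
for r in records:
    by_diagnosis[r["diagnosis"]].append(r["hba1c"])

averages = {dx: round(sum(vals) / len(vals), 2)
            for dx, vals in by_diagnosis.items()}
print(averages)  # {'T2 diabetes': 7.65, 'hypertension': 5.4}
```

A real system would run such aggregations over millions of rows in memory, across arbitrary dimensions (diagnosis, age, medication, hospital), fast enough for a clinician to explore the data interactively.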
However, I always ask myself this question: “Do I want a computer to decide what treatment is best for me or do I want my clinician to decide?” In my opinion, technology can help to synthesize the information for the healthcare professionals, but ultimately the treatment decision should be left in the clinician’s hands.
We should provide systems and tools to integrate, analyze and visualize all this data. Treatment decision support is a natural result of such a system, but the clinician should be the one to rank the treatments, rather than a computer. On the other hand, maybe the best way forward is to combine the capabilities of a high-performance database with some cognitive computing and the experience of the clinician treating the patient.
Regardless of the approach to a learning healthcare system, there remains one universal truth: public-private, academic-commercial and multi-disciplinary partnerships are the only way to move forward to deliver better quality healthcare for everyone. One such example is ASCO’s CancerLinQ, a platform that will integrate EHR data from all participating hospitals and clinics and provide real-time clinical insights for cancer patients. Here are a few other partnership examples:
- Google Genomics & Broad Institute: Google Genomics will now offer Broad Institute’s Genome Analysis Toolkit (GATK) on the Google Cloud Platform service.
- IBM Watson Health, Epic & Mayo Clinic: IBM will apply Watson’s cloud-based cognitive computing capabilities to EHRs to enable personalized healthcare.
- SAP & Roche Diagnostics: The partnership has created a new preventative care package for high-risk Type 2 diabetes patients, which includes a blood glucose monitor, a wearable fitness tracker and an app developed by SAP on the SAP HANA Cloud Platform.
I am confident that more and more of these types of collaborations will be established. In order for precision medicine to become a reality, we have to work in multidisciplinary teams across many industries. I’m truly excited to witness (and participate in) the innovations that technology can bring to the healthcare and life sciences industries.
Image credit: Doctor with screens © Luis Louro