Digital Health is Underfunded

Overall venture capital funding declined sharply in the last two quarters amid worries (justifiable or not) of a bear market and a funding bubble in technology investments. In contrast to the broader tech market, however, digital health funding continues to grow at a record pace. According to Rock Health, $4.5B was invested in digital health in 2015 (up from $4.3B in 2014), and $981 million has already been invested in the first quarter of this year. It seems on pace to be another stellar year, which is remarkable considering what is going on in other sectors.

Many are skeptical about the investment potential of healthcare technology and have been wary of entering the market (perhaps especially so given all the negative media attention that companies like Theranos and Zenefits have attracted). Additionally, regulatory barriers and the longer timelines of healthcare innovation tend to scare potential investors away. But anyone familiar with the sad state of technology in healthcare can see, even with the record-breaking investments thus far, that there continues to be an enormous untapped opportunity in healthcare–greater, I believe, than in any other sector.

Digital health is vastly underfunded.

Technology is taking over most of our personal and professional lives with indispensable apps, wearables, and other connected devices and software. At home, we have smart appliances, lighting, thermostats, security systems, media systems, and even smart cars. And we have Siri, Cortana, and Alexa doing our bidding. But in healthcare, we’re still in the Stone Age technologically. Communication via fax, for example, is still common between hospitals and doctors’ offices.

There are small glimmers of hope, such as patient portals, higher-functionality EMR systems, and telehealth services, but the fact is that we are still a far cry from the ideal vision for healthcare: a seamless cloud-based network of devices and software that can track and record a vast spectrum of patient information, with the ultimate goal of using computational technology to help prevent, predict, diagnose, and yes, even treat disease. Ultimately, collecting information on large populations of patients could have a profound impact through public health measures that prevent disease and thereby reduce healthcare costs. This can only be accomplished with a widespread network of software and devices that includes electronic health records, wearables, devices in the hospital, the office, and the home, and telehealth capabilities. Yet there are too few companies working to collect, store, manage, and interpret health data.

There is still a lot that needs to be done.

According to MarketResearch.com, the healthcare “internet of things” (IoT) is expected to reach $117B by the year 2020. The fact is, the full potential of digital health won’t be seen until every hospital and doctor’s office and home is connected via cloud-based devices and software and with the development of machine learning platforms that can make sense of the reams of health information.

It is a little challenging to think of all of this in the abstract, so here are a few examples of the potential of the healthcare IoT. Imagine that a spike in certain population health data (like temperature) is detected in a region of the country, alerting public health officials early to a disease outbreak that can then be contained to prevent an epidemic. Imagine that a change in an individual’s biometric data prompts that person to seek medical care, detecting a life-threatening disease, like cancer, early and improving the chances of cure. Imagine that chronic health conditions like diabetes are monitored routinely and continuously with real-time blood glucose levels, allowing doctors to adjust insulin dosages immediately, thereby preventing hospitalizations due to uncontrolled diabetes as well as long-term diabetic complications, such as kidney disease.

These are only a few examples.  There are countless other opportunities in healthcare.

In addition to the opportunity to improve healthcare delivery, there is the opportunity to improve the quality of care through tools that give patients greater communication and transparency of information and improve care coordination among their providers. And by shifting the focus of medical care toward prevention and early diagnosis of disease, there is the opportunity to decrease the outrageous cost of healthcare as well, by reducing the need for excessive medication, surgery, unnecessary visits, and hospitalizations. According to the Commonwealth Fund, the US spends an outsized proportion of its GDP on healthcare: 17%, versus 8.8%–11.6% in other developed countries, a difference related in part to their better-connected health IT networks.

It’s hard to fathom how much digital health tech is needed to serve a US population of 318 million and a global population of 7 billion, but one thing is certain: the market is huge.  We should stay bullish on health tech investments now, and probably for a long while to come.

 

GV’s Approach to Healthcare Investing: An Interview with Dr. Krishna Yeshwant


Please note:  This article was originally published on TechCrunch.com.

Healthcare investments — in particular, investments in digital health — are booming, and don’t seem to be slowing down. According to CB Insights, digital health funding hit nearly $5.8 billion in venture funding last year, surpassing the previous record of $4.3 billion in 2014.

One of the top venture firms, GV (previously known as Google Ventures), recently came out with their year in review, revealing that more than one-third of their investments are in the life sciences and healthcare. (They currently have $2.4 billion under management.) “I can think of no more important mission than to improve human health and global quality of life,” CEO Bill Maris said in a recent announcement.

One of the strengths of the GV life science and health investment team is having a diverse mix of PhDs and MDs as investors, including general partner Dr. Krishna Yeshwant. Yeshwant continues to practice internal medicine part-time at Brigham and Women’s Hospital in Boston, and credits that with helping to keep him in touch with the challenges facing healthcare.

I recently sat down with Yeshwant to talk about GV’s investment strategy.

Yeshwant started his career, interestingly, studying computer science at Stanford. From there, he helped found two tech companies, which were eventually acquired by Hewlett-Packard and Symantec. He could have successfully continued on his path in tech, but decided instead to go to medical school after his father became ill and needed a cardiac bypass. “I remember just being in the hospital thinking this is just messed up. There are so many areas for improvement,” he said.

He went on to pursue an MD-MBA at Harvard. During this time, he became involved in a lot of medical-device work, and even started a diagnostics company. This work eventually led him to work with Bill Maris at Google Ventures.

Thus far, one of GV’s largest investments has been in Flatiron Health, an oncology-focused technology company based in New York City. According to Yeshwant, the concept was developed by two former Google employees who received support from GV. “Flatiron is basically integrating EMRs (electronic medical records) in the outpatient and hospital setting,” said Yeshwant, “and it provides data back to physicians as well as aggregating data to aid with discovery and help with regulatory processes.”

Others have also recognized Flatiron’s enormous potential. Flatiron recently announced they received $175 million in Series C funding from Roche Pharmaceuticals. In addition to the funding, Roche plans to be a subscriber to Flatiron’s software platform. Their hope is to use the platform to identify and bring innovative treatments to market faster.

Yeshwant strongly believes in the need for more tech solutions in healthcare like Flatiron Health. “There’s a fundamental need for infrastructure. A single disease type of lung cancer is actually lots of diseases. Other more complex diseases are going to need more data sets, multisite trials, and we need to create infrastructure for that,” he said.

It’s hard to argue with him on that point. Massive amounts of biometric data are being collected in healthcare right now, but there aren’t nearly enough tools for storage, communication and analysis of that data. There’s a great deal of opportunity for healthcare startups that can specialize in data management and analysis.

Three companies in which GV has invested in this space are Metabiota, which provides risk analytics to prevent and reduce epidemics; Zephyr Health, which uses global health data and machine learning to provide treatment insights to pharma and medical device companies; and DNAnexus, which helps organizations store their genetic information.

“Once you’re in a world where you can scale up and down your computational analysis, you can ask lots of simultaneous questions of your aggregated data sets and that’s well suited to the cloud environment,” said Yeshwant. “We invest heavily in those spaces.”

Besides software-based companies, GV is investing in a diverse range of other types of companies in healthcare and the life sciences. One such area is the genomics space. Thus far, GV has made major investments in Editas, a CRISPR gene-editing company; 23andMe, which offers chromosomal analysis to consumers; and Foundation Medicine, a company that offers genomic analysis of various cancers.

Yeshwant also feels one of the biggest challenges (and opportunities) in healthcare is helping healthcare organizations shift from fee-for-service to fee-for-value. “That’s the direction we’re going,” he said. “How do we migrate big systems in that direction? That’s the fundamental question.”

GV therefore has made some significant investments in companies that are shaking up the traditional provider model, including the telemedicine company Doctor on Demand and the innovative primary care provider, One Medical Group. “Anything you can do to move healthcare from a high cost setting to a low cost setting is generally going to be successful in that model,” said Yeshwant. “Telemedicine is a good example of that. We have a company called Spruce Health which is essentially asynchronous care. Value based care is a big area for us.” (Spruce Health is a platform for dermatologic care.)

Yeshwant hinted that future projects may be in the areas of population health and chronic disease management, investment in companies that engage consumers directly and possibly even some work in women’s health. One thing’s for sure: We can expect more exciting things to come in 2016 and beyond for GV.

 

 

Cool Startup: GenoSpace


Healthcare is drowning in a deluge of data. Decision-makers must somehow make sense of a heterogeneous array of information — demographic, clinical, patient-generated, treatment and outcomes data. The latest waves of information also include data from mHealth and genomic sources. It’s not hard to imagine that many in the healthcare industry suffer from information overload and struggle with a bit of ‘analysis paralysis.’ How can organizations make sense of all this big data and actually harness it to improve healthcare and outcomes?

One company helping to answer this question is GenoSpace, an ambitious genomic and health data management startup based in Cambridge, Mass. Its current chairman, John Quackenbush, and CEO, Mick Correll, worked together in the Center for Cancer Computational Biology at Dana-Farber before co-founding the company in 2012. Contracts with notable customers like the Multiple Myeloma Research Foundation (MMRF) and PathGroup funded GenoSpace before its first round of outside funding in 2014.

It was around that time that GenoSpace hired Daniel Meyer, an entrepreneur with a background in venture capital, as Chief Operating Officer. According to him, it was GenoSpace’s ability to attract high-quality customers early on (a rarity for most early-stage companies in life sciences) that convinced him to join. Recently, we sat down with Meyer to learn more about how GenoSpace helps healthcare organizations make sense of all the big data.

Tell us about what you do at GenoSpace.

When you’re dealing with genomics and other biomedical data, there are a variety of different users and reasons for their use. So you could have an institution that has users engaged in research, clinical development, lab medicine and clinical care. They have different software application needs that cut across the same or similar data sets. One of the things we try to do is develop the tools, the interfaces and the experience that will enable all of those different people to get the most from the data.

Could you go over your major offerings?  

We have three primary categories of offerings: first, analysis and interpretation of a single assay result together with phenotypic and other clinical data; second, interactive analysis of data from many individuals as a group, such as from a large observational study (where we really excel is when a customer has integrated demographic, clinical, genomic, treatment, outcomes and other data); and third, enabling patients to directly report and interact with their data. We’ve created software applications and web-based sites for patients to upload their data, track their results and better understand their condition. Although we have a core competency in genomic data, we do not deal only with genomic data. Research and clinical care rarely rely on a single data type.

Now that Obama has announced the Precision Medicine Initiative expanding genomic study, do you also expect your work to expand?

We think it’s a fascinating announcement and those are the types of initiatives we support. One of the interesting things is that we have customers right now solving many of the problems that the initiative will face. For example, we have been working with Inova, a healthcare system based in Northern Virginia that serves more than two million people per year in the metro DC area. They have been collecting a rich set of whole-genome sequencing data together with structured clinical data on thousands of people. Their data management and analysis needs map directly to those of precision medicine initiatives like the one announced by the White House.

I’d imagine that you’d have greater demand on the private side.  

We have spent most of our time there. Our first clinical lab customer, PathGroup, is delivering industry-leading molecular profiling across a wide geographic footprint, including both big cities and smaller cities and towns in their coverage area. Our ability to help them bring academic-quality medicine to community oncology has a huge impact. Roughly 85% of oncology patients are treated in a community setting. If you’re only deploying in major cities with academic medical centers, you’re missing out.

What are your next plans? Any new projects or goals?

We are looking to expand to different customer use cases. That can be in terms of the therapeutic indication, such as rare diseases, neurologic or cardiac disease. But it can also mean integrating different kinds of data. We have a lot of experience working with demographic, diagnostic, treatment and outcomes data together with genomic results, and there are more opportunities to expand.

Are you also working on using machine learning to do predictive analytics?

We think about that a little differently. There’s supervised analysis, the user asking questions and getting answers about the data, and there’s unsupervised analysis. For many of our customers, they’re not looking for a black box. Our goal is not to replace molecular pathologists, but to work hand in hand with them to make sure their work is better, more operationally efficient and more sustainable, particularly if it’s a commercial entity.

That last piece is underappreciated by a lot of folks. We do a lot of work in genomics and in precision medicine and there’s a lot of science and advanced technology. All that work is lost in most settings if you don’t deliver it properly. You have to understand the science and the innovation, but also how to get it in the hands of people who can impact patients. That’s a big part of what we do.

Any final thoughts?

One of the fun things about being here is we have folks with a lot of different capabilities—in software engineering, interactive design, data science, etc. For a lot of the interesting problems that people are trying to solve in medicine, it takes that interdisciplinary team approach as opposed to a whole bunch of people with the same type of experience.

To learn more about GenoSpace, visit their website at genospace.com or follow them on Twitter at @GENOSPACE.

This article was originally published on MedTech Boston.

Cool Startup: twoXAR

Andrew M. Radin (left) with friend and twoXAR business partner Andrew A. Radin.


It’s not every day that you meet someone with the same name as you. And it’s even less likely this person will have similar interests and be someone with whom you might want to start a business.

But that’s exactly the story of the two Andrew Radins, founders of twoXAR.

Chief Business Officer Andrew M. Radin met his co-founder and Chief Executive Officer Andrew A. Radin battling over a domain name–you guessed it–andrewradin.com. About six or seven years ago, the former asked the latter, who owned the domain, if he could buy it, and was told (in not so many words) to get lost.

Somehow, this exchange sparked a friendship, first on Facebook, then through commonalities such as travel to China, work in science and tech, and their independent entrepreneurial pursuits. A little over a year ago, as Andrew A. Radin completed work on a computational method to enhance drug and treatment discovery, he naturally thought of joining forces with his namesake and friend, Andrew M. Radin.

For Andrew M., who was just completing his MBA at MIT Sloan, the timing was right and the discovery compelling enough to turn down other appealing job offers and join Andrew A. in forming the aptly named twoXAR (pronounced TWO-czar). Based in Silicon Valley, the company predicts the efficacy of drug candidates by applying statistical algorithms to various data sets. We caught up with Andrew M. Radin recently to hear about their exciting new venture and their progress.

Tell us about what you do at twoXAR.

We take large, diverse, independent data sets, including biological, chemical, and clinical data–some subsets include gene expression assays, RNA-seq, protein binding profiles, chemical structures, and drug libraries (tens of thousands of drugs), whatever we can get our hands on–and use statistical algorithms to predict the efficacy of drug candidates in a human across therapeutic areas. The raw output from our technology (the DUMA Drug Discovery Platform) is the probability that a given drug will treat a given disease. It all takes only a matter of minutes.

Where do you get your data sets?  Are they from clinical trials?

Some of our data comes from clinical trials, but we pride ourselves on using data sets that are largely independent of each other and come from a variety of sources along the biomedical R&D chain–as early as basic research and as late as clinical data from drugs that have been on the market for 30 years. All of these data sets are extremely noisy, but we specialize in identifying the signal in this noise, then seeking overlapping evidence from radically different data sets to strengthen that signal.

These data come from proprietary and public sources. The more data we have, the better results DUMA delivers.
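Radin’s idea of strengthening a weak signal with overlapping evidence from independent data sets can be illustrated with a toy sketch. twoXAR has not published DUMA’s internals, so the method below (Stouffer’s combination of per-source z-scores) and all of its numbers are purely illustrative of the general statistical principle, not their actual pipeline:

```python
import math

def combine_z(z_scores):
    """Stouffer's method: combine independent z-scores into one.

    Several independent, individually weak signals that point the
    same way yield a stronger combined signal than any one source.
    """
    return sum(z_scores) / math.sqrt(len(z_scores))

# Hypothetical per-source scores for one drug-disease pair, e.g. from
# gene expression, protein binding, and chemical-structure evidence:
per_source = [1.2, 1.5, 1.1]

combined = combine_z(per_source)
# combined is about 2.19, larger than any single source's score
```

Real evidence-integration pipelines are far more elaborate (weighting sources, correcting for correlation among them), but the core idea is the same: agreement across radically different, noisy data sets is itself evidence.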

Could you give an example of how you could use this tool in pharmacologic research?

Our technology allows us to better characterize the attributes of a disease beyond just gene expression. We can examine how a drug might relate to a myriad of evidence streams, allowing a researcher to build more confidence in a prediction of drug efficacy.

Let’s take Parkinson’s Disease as an example. Existing treatments focus on managing the symptoms. The real societal win would be to stop, and possibly reverse, the progression of the disease altogether. This is what we are focusing on.

For Parkinson’s disease, we’ve acquired gene expression data on over 200 Parkinson’s patients, sourced from the NIH, examined over 25,000 drug candidates, and found a handful of promising candidates across a variety of mechanisms of action.

So you can “test out” a drug before actually running a clinical trial?

That’s the idea. Using proprietary data mining techniques coupled with machine learning, we’ve developed DUMA, an in silico drug discovery platform that takes a drug library and predicts the probability that each of those drugs will treat the disease in question in a human body. We can plug in different drug libraries (small molecules, biologics, etc.) and different disease data sets as desired.

At this stage we are taking our in silico predictions to in vivo preclinical studies before moving to the clinic. Over time we aim to demonstrate that computational models can be more predictive of efficacy in humans than animal models are.

It seems, intuitively, that this would be really valuable, but I would imagine that your clients would want to see proof that this model works.  How do you prove that you have something worthwhile here?

Validation is critical, and we are working on a number of programs to demonstrate the effectiveness of our platform. First, we internally validate the model by putting known treatments for a disease into DUMA while blinding the system to their current use. If the known treatments are concentrated at the top of our results list, we know it’s working. Second, we take the drug candidates near the top of the list that are not yet known treatments and conduct preclinical studies with clear endpoints to demonstrate efficacy in the physical world. We are currently conducting studies with labs that have experience with these animal models, with the aim of publishing our methods in peer-reviewed journals.
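The blinded validation Radin describes is essentially an enrichment check: hide the known treatments from the system, rank the whole library, and see whether those treatments cluster near the top more than chance alone would explain. Here is a minimal sketch of that check; the drug names, list size, and cutoff are all made up for illustration, not twoXAR’s data:

```python
def top_k_enrichment(ranked_drugs, known_treatments, k):
    """Count known (blinded) treatments in the top k of a ranked
    prediction list, alongside the count expected by chance alone."""
    known = set(known_treatments)
    observed = sum(1 for d in ranked_drugs[:k] if d in known)
    expected = k * len(known) / len(ranked_drugs)
    return observed, expected

# Illustrative 10-drug library ranked by predicted efficacy, with two
# known treatments ("d3", "d7") blinded during scoring:
ranked = ["d3", "d7", "d1", "d9", "d2", "d5", "d8", "d4", "d6", "d0"]
obs, exp = top_k_enrichment(ranked, {"d3", "d7"}, k=3)
# obs == 2 hits in the top 3, versus 0.6 expected by chance
```

In practice one would attach a significance value to that excess (e.g. with a hypergeometric test) rather than eyeball it, but a ranked list whose known treatments sit at the top is exactly the “it’s working” signal described above.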

You have a really advanced tool to come up with potentially great treatments, but what’s to say that’s better than what’s going on out there now?  How do you prove it’s better or faster? 

If you look at drug industry trends, the top drug companies have moved out of R&D and become marketing houses–shifting the R&D risk to startups and small and medium drug companies. Drug prospecting is recognized to be extremely risky and established methods have produced exciting results in the past but have, over time, become less effective in striking the motherlode. Meanwhile, the drug industry suffers from the same big data woes as many industries–they can produce and collect petabytes and petabytes of data, but that goldmine is near-worthless if you don’t have the tools to interpret it and extract the gold. Advances in data science enable twoXAR to analyze, interpret, and produce actionable results with this data orders of magnitude faster than the industry has in the past.

It seems that this could be scaled up to have many different applications.  How do you see twoXAR transforming the industry? 

In regards to scale, not only can computational platforms look at more data faster than humans without bias, much smaller teams can accomplish more. At twoXAR, we have a handful of people in a garage and we can essentially do the work of many wet lab teams spanning multiple disease states. Investors, researchers, and patient advocacy groups are very interested in what we are doing because they see the disruptive potential of our technology and how it will augment the discovery of new life-saving treatments for our families and will completely recast the drug R&D space. One of the things I learned at MIT from professors Brynjolfsson and Little is that the increasingly exponential growth of technological progress often takes us by surprise. I predict that tectonic shifts in the drug industry will be coming much quicker than many folks expect.

To learn more about twoXAR, visit their website and blog.

This article was originally published on MedTech Boston.

Books: The Patient Will See You Now


Einstein once said, “The most important decision we make is whether we believe we live in a friendly or a hostile universe.”  I suspect that readers of Eric Topol’s latest book, The Patient Will See You Now, might be polarized along these lines as well.  The ideas expressed might fill some with fright, while others, with delight.

Dr. Eric Topol, a cardiologist, geneticist, and researcher at the Scripps Research Institute and editor-in-chief of Medscape, attempts to get readers up to speed on the rapidly developing innovations in healthcare. That’s a daunting task, considering how many individuals and companies are experimenting in this space and how quickly progress is being made.

He begins the book by making a convincing argument that the digital age, with the cell phone as its best emissary, is bringing about a revolution in society and healthcare akin to the one sparked by the invention of the Gutenberg press, which made reading accessible to the masses. Just as education was democratized, he argues, so is healthcare becoming democratized. In the not-so-distant future, patients will have all healthcare knowledge at their fingertips, and as a society, we will move away from the paternalistic paradigm of traditional medicine.

Increasing access to health information is fostering greater patient engagement and helping to drive innovation forward. He devotes a chapter to the influential role that Angelina Jolie has played as a change agent in shaping the patient’s role in the future of healthcare, with her courageous decision to have a preemptive bilateral mastectomy to prevent breast cancer after finding out that she carries a BRCA1 mutation. His point was driven home by the actress’ announcement this week that she has now also undergone a prophylactic bilateral salpingo-oophorectomy (removal of the fallopian tubes and ovaries) to prevent ovarian cancer.

In this information age, patients are more knowledgeable about their personal health, the medical research out there, and their options. In Dr. Topol’s opinion, healthcare organizations need to move toward transparency at all levels with regard to clinical information, medical knowledge, and research, and (with consent and precautions to protect privacy) this information should be aggregated to create what he calls “massive open online medicine” (MOOM). Aggregating and applying machine learning to this data has enormous potential to prevent and cure disease and to cut healthcare costs.

Particularly interesting was his description of an individual’s “geographic information system” (GIS), which is a comprehensive health “map” of a patient, looking not just at their genome, but other “-omes”, such as demographics (phenome), vital signs (physiome), anatomy (anatome), DNA/RNA (transcriptome), proteins and metabolites (proteome and metabolome), microbes (microbiome), DNA packaging (epigenome), and the environment (exposome).  Treatments of the future are likely to be very individualized based on a patient’s unique GIS.

He discusses some other fascinating innovations (too numerous to list), including the plethora of smartphone apps and wearable devices, a variety of diagnostic tests being done via mobile devices, the rise of telehealth services, genetic testing and research, the quest for transparent, fully interoperable, and secure medical records, and new tools to decrease morbidity and mortality related to medical errors.

This was an information-dense, engaging read.  For those interested in the healthcare innovation space, it’s a must-read.

What I most appreciated was his optimistic view of the future of medicine in the digital age.  Clearly, Dr. Topol believes we live in a hopeful, friendly universe.

10 Genius Ideas to Improve Healthcare

Photo courtesy Anna Shaynurova


The MIT Sloan School of Management held its 11th annual Bioinnovations Conference at the Boston Marriott Cambridge Hotel on September 20th, featuring influential speakers from the healthcare, life sciences, research, and regulatory sectors. This year’s theme was “Value in Healthcare” and brought an impressive turnout of over 350 attendees.

“Our goal for the conference was to bring together industry leaders across business, science and medicine to discuss some of the most pressing issues in healthcare,” said conference organizer Anita Kalathil. “MIT and Sloan are passionate about how to improve healthcare, whether at the molecular or systems level, and we know that any solutions are going to have to be cross functional. Our goal was to make the MIT Sloan Bioinnovations conference the connecting point for these different groups.”

There were many great takeaways from this conference, but here are 10 of the most noteworthy:

1. Delivering true value in healthcare.

Neel Shah, founder and executive director of Costs of Care, was the conference’s opening speaker. “There’s a misperception that considering cost is not aligned with patient interests,” he said. Cost consideration is becoming ever more important in healthcare, as policymakers demand greater accountability and patients demand greater transparency in pricing.

2. Refocusing the future of research & development.

Mark Fishman, President of the Novartis Institutes for BioMedical Research, shared in his opening keynote that aging, cancer, brain disease, and genetic therapies hold the greatest promise for future research. He also shared his unique approach to R&D, which is to focus less on cost-benefit analysis and more on areas with the greatest patient need and solid scientific knowledge.

3. Putting Big Data to good use.

There was a lively discussion during the Big Data, Policy, and Personalized Medicine panel, highlighting the need for better ways of collecting, analyzing and interpreting the huge amounts of data being generated from various sources, including medical records, diagnostics, genomics, and sensor data from patient devices. The panel members represented a number of impressive companies (twoXAR, Privacy Analytics and GenoSpace) that are attempting to do just that.

4. Researching therapies (and prevention).

In his keynote address, Gary Kelloff, Special Advisor to the National Cancer Institute and the National Institutes of Health, shared that the present approach in cancer research involves discovering and developing therapies targeted to biomarkers of cancer. While acknowledging the importance of this research, Dr. Kelloff also urged participants to invest in researching the prevention of disease.

5. Improving health IT.

In his keynote, John Halamka, CIO at Beth Israel Deaconess, discussed the ongoing challenges in health information technology that need to be addressed: lack of interoperability, providing transparency while also ensuring privacy, harnessing HIT and Big Data to improve quality of care, and facing the ongoing threat of accelerating security incidents.

6. Considering a team approach.

During a panel about medical device development, Ramesh Raskar, Associate Professor at MIT Media Lab & Head of Camera Culture Research Group, shared that he felt the sciences needed to move away from independent research (which can be slow to produce innovations) and toward a culture that allows individuals to work more collaboratively in teams (which can be faster). He also shared a memorable quote: “The innovator may or may not be an entrepreneur,” which again highlighted the advantage of a diverse team approach.

7. Incorporating patient-centered design.

Kristian Olson, Medical Director at the Consortium for Affordable Medical Technology, recommended that “patients be in the room” when designing medical innovations. And Elizabeth Johansen, Director of Product Design and Implementation at Diagnostics for All, shared her techniques for creating user-friendly devices. Particularly helpful was her advice to observe how patients interact with their devices in their own surroundings.

8. Overhauling healthcare delivery.

According to Mikki Nasch, co-founder of The Activity Exchange, “Your zip code is still a better predictor of your health than your genetic code.” Social and environmental factors are huge determinants of health, and older models of healthcare delivery don’t address this issue. Healthcare urgently needs to transition away from traditional paradigms toward newer models of care, such as ACOs, that better address these social factors.

9. Finding collaboration between payers and pharma.

There was a lively debate during one of the panels about specialty drug pricing. Panel members suggested that payers and pharma need to come together at a systems level to help advance the development of treatments and cures. Dr. Winton from Biogen Idec Market Access suggested new payer-pharma models and shared-risk plans.

10. Driving innovation with patients at the wheel.

The final keynote of the day was given by Jamie Heywood, Co-Founder and Chairman of PatientsLikeMe, an online platform that allows patients to share information about their medical conditions and treatments and connect with others with similar conditions. Not only does this novel website help patients, but the open platform also allows healthcare and industry professionals to better understand patients’ experiences and conditions and may help to accelerate the development of new treatments. Conference attendee Dimple Mirchandani was impressed with Heywood’s emphasis on continuous learning to better understand diseases and their treatments, and by his inspiring vision for caregivers and patients to use “data for good.”

This article was originally posted on MedTech Boston.

Electronic Health Records: Opportunities

As discussed in the previous article in this series, the broad adoption of electronic health records (EHRs) has presented healthcare professionals with numerous challenges. It’s not surprising that many of us are left wondering: Will all of this effort to rapidly adopt EHRs even be worth it in the end?

Where We Are Now

To better understand this, it’s helpful to first take a closer look at the current state of the U.S. healthcare system. In 2000, the Institute of Medicine released its landmark report, To Err Is Human, which exposed the alarming number of deaths that occur as a result of medical errors in the U.S. This was a big shock to many who assumed that the American healthcare system was the best in the world. To add insult to injury, we also discovered around that same time that healthcare costs were skyrocketing; in fact, they had doubled from 1993 to 2004.

According to the Commonwealth Fund, the United States today has the most expensive healthcare system in the world, spending about $8,500 per capita, or nearly 18% of our GDP, while also consistently ranking dead last in overall performance and quality compared to other industrialized nations. One can’t help but wonder: What are other countries doing that we’re not? Two things in particular stand out: unlike most of our international counterparts, we lack (1) universal healthcare coverage and (2) high-functioning, fully integrated health information systems. It turns out that other nations have surpassed us in providing high-quality, affordable, and accessible healthcare, and one of the key elements of their success has been harnessing health IT.
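The two figures cited above are internally consistent; here is a quick back-of-envelope check (the arithmetic is mine, not the Commonwealth Fund’s):

```python
# Sanity check on the cited figures: $8,500 per capita at ~18% of GDP
# implies a GDP per capita of roughly $47,000, which is in line with
# U.S. GDP per capita around the period the report covers.
spend_per_capita = 8_500
share_of_gdp = 0.18

implied_gdp_per_capita = spend_per_capita / share_of_gdp
print(f"Implied GDP per capita: ${implied_gdp_per_capita:,.0f}")
# → Implied GDP per capita: $47,222
```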

Opportunities to Consider

Considering all of this, it’s no surprise that we have had bipartisan support for the expansion of EHRs from both Presidents Bush and Obama and that we continue to invest in creating a fully interoperable, nationwide network for health information. If EHRs can be harnessed properly, they promise to deliver lowered healthcare costs, improved quality, increased access, and improved population health. Let’s take another look at those challenges presented in the last part of this series. Where are the opportunities in these challenges?

1. Cost

Despite the high costs of implementing new EHR systems, numerous studies report that high-functioning EHR systems can help decrease costs in the long run. One study found a 12.9% to 14.7% reduction in duplicative testing with the use of computerized provider order entry (CPOE) and clinical decision support (CDS) in an outpatient setting. Overhead costs may also be decreased through fewer chart pulls; reduced paper, supply, and storage costs; and decreased transcription costs. Efficiencies can also be gained in billing through improved and more complete documentation, better charge capture, and fewer billing errors. A study from Massachusetts found that paid malpractice claims may also be reduced with the use of EHRs versus paper records (6.1% vs. 10.8% of claims paid).

It remains to be seen if EHR-induced savings will be favorable versus the cost expenditures required to operate them. But these studies show that there’s reason to be hopeful.

2. Quality & Communication

Investing in a high-quality EHR system has also been shown, in some studies, to result in higher overall quality, improvements in safety, and decreases in delayed medical decision-making. A study of hospitals in Florida found that those with greater investments in health information technology scored higher in quality measures. A similar study found that those hospitals with greater investments had lower patient complications and lower mortality rates as well. Other research has demonstrated that high-quality EHR systems improved prescribing patterns, too. In these ways, EHRs may support improved outcomes and thereby reduce malpractice and liability risks.

Highly interoperable EHR systems have also been credited with improving the communication and coordination of care between providers, and with decreasing delays in medical decision-making that can result from having to wait for the transfer of medical records. A strong health IT system can also enhance communication between providers and patients and help to foster increased patient engagement through the use of applications such as patient portals and interfaces with radiology, laboratory, and medical devices. Patients may be more apt to become actively involved in managing their health and participating in shared decision-making as a result of having easier access to their health information. 

3. Access 

Another advantage of EHRs is that they can help provide convenient and timely access to a patient’s health record. We’re still a long way from a fully transparent nationwide (or global) healthcare network, but these advances are coming. In addition, as the telehealth and mHealth market grows, and as we see better integration of other platforms with EHRs, we will likely see a huge revolution in access to personal health information. This need is especially urgent in light of the dire shortage of primary care physicians. The telehealth capabilities of EHRs may very well be the solution to providing access to medical care for patients in underserved or remote regions.

4. Population Health

As we succeed in integrating systems and improving interoperability, we will have the ability to aggregate huge amounts of health data for entire populations of patients. This “big data” can be used to conduct population health research, which can help identify patterns such as risk factors for diseases. With this, physicians will be better able to recommend preventive measures and evidence-based best practices. This information can also be harnessed to change practice patterns and, hopefully, to effect positive healthcare outcomes on a broader scale. EHRs can also enhance reporting capabilities, which may help identify potentially dangerous outbreaks or treatment-related risks quickly, so that they can be managed in a more timely and effective manner.

***

The Bottom Line

EHRs hold a great deal of promise to truly transform our ailing healthcare system. How well we succeed will depend in large part on how we can overcome and manage key challenges affecting cost, interoperability, safety, and patient-centered care. It remains to be seen if the cost-to-benefit will be ultimately favorable, but these preliminary findings and evidence of international success give us reasons to be hopeful.

This article was originally published on MedTech Boston.

Precision Medicine: Pros & Cons

23 chromosomes (image from Scientific American)

This past week, President Obama announced a $215 million proposed genetic research plan, called the Precision Medicine Initiative.  According to the plan,  the NIH would receive $130 million towards a project to map the DNA of 1 million people, the National Cancer Institute would receive $70 million to research the genetic causes of cancer, the FDA would receive $10 million to evaluate new diagnostic drugs and devices, and finally, $5 million would be spent on tech infrastructure to analyze and safely store this data.
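The announced allocations add up to the proposed total, which can be checked in a couple of lines (category labels are my own shorthand for the items above):

```python
# Proposed Precision Medicine Initiative allocations (millions of USD),
# as described in the announcement: NIH, NCI, FDA, and data infrastructure.
allocations = {
    "NIH (map DNA of 1 million people)": 130,
    "National Cancer Institute (genetic causes of cancer)": 70,
    "FDA (evaluate new diagnostics)": 10,
    "Tech infrastructure (analyze and store data)": 5,
}

total = sum(allocations.values())
print(f"Total proposed: ${total} million")
# → Total proposed: $215 million
```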

Not surprisingly, this announcement sparked some online controversy.  If internet pundits are to be believed, this plan will prevent you from ever finding a mate, an employer, or health insurance; turn us all into subjects of a giant genetic experiment to tailor human beings; and put us into crippling debt while lining the pockets of Big Pharma.  I’m not even sure I covered it all… The complaints ranged from reasonable to ridiculous.  The most amusing were the conspiracy theorists who are certain that Obama must be plotting a genetic apocalypse.

But, in all seriousness, I have to admit I have concerns as well, despite being mostly optimistic about this news.

Here are some of the exciting positives offered by the precision medicine plan:

  • New diagnoses:  We may finally be able to identify genetic causes of diseases that were previously unknown.
  • Prevention vs. disease management:  Knowing genetic risks ahead of time can help us to focus more on preventing disease rather than reacting after-the-fact, once the disease occurs.
  • Early diagnosis:  We may be able to detect diseases earlier and at a more treatable stage.
  • Protective genes:  Some people have certain genes that protect them against diseases or prevent them from “expressing” their bad genes.  Studying these differences may help us to learn how to protect ourselves against those diseases.
  • Drug development:  Therapies can be developed in a faster and more efficient way by targeting certain genetic problems, rather than using the traditional trial-and-error method.
  • Personalized treatments:  Treatments can be tailored to a patient’s unique genetic aberration and we can avoid giving treatments to patients that we know may cause adverse reactions or that will fail to work.
  • Population health:  We can study genetic patterns in populations of patients to find out causes of diseases, develop treatments, and find ways to prevent disease.
  • Healthcare costs:  There’s a potential to reduce healthcare costs if focus changes to prevention rather than treatment of disease and also if we can streamline drug development.

But, let’s also look at the potential downsides:

  • Data storage:  We already know that gene sequencing of an individual produces MASSIVE amounts of data.  The sequencing of a million people is going to produce unimaginable amounts of data.  How will we store all this big data and analyze it to make any sense of it?
  • Privacy/Security:  Is there anything more personal and vulnerable to cyber-attack than your genetic information?  I wonder if the $5 million allotted to this effort will really be enough.
  • Data relevance:  According to Obama, the data will be collected from 1 million volunteers.  That’s not a random cross-section of people in the US and may not represent the population adequately in order to make population health recommendations.  I’d argue that only certain types of people would sign up and other types won’t.  Would we miss certain disorders? Would we see too much of another disorder in a population of volunteers for this project?
  • Culture:  How do we prevent people from abusing this information, using it to screen potential partners, deny insurance coverage, or deny jobs?  How will this affect culture?  Will we be cultivating a different kind of racism, on a genetic basis?  Are we on the path to a real-life version of the movie Gattaca?
  • Ownership:  Who will claim ownership of this data?  Will it be the government?  I’d argue that this data should be owned by the individuals from whom it comes, but the experience of the genetic sequencing (now genetic ancestry) company 23andMe is worrisome.  For the time being, the FDA has blocked the company from giving individuals access to their own genetic information.  Will this change as part of the new initiative or not?
  • Drug/device industry:  Genetic research and the development of treatments have been very promising and productive in the private sector.  How will government involvement affect research?  Will our governmental agencies work cooperatively with the private sector or competitively?  Again, if the experience of 23andMe is any indication, this is a real concern.
  • Healthcare Costs:  Yes, there’s potential to decrease costs, but there’s also potential in greatly increasing costs.  It’s no small feat to genetically map a population, analyze the information, store it safely and securely, and develop recommendations and treatments.
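To give a sense of the data storage concern raised above, here is a rough estimate. The per-genome figure is an assumption of mine, not from the announcement: raw sequencing data for one whole genome at typical coverage is often ballparked at around 100 GB, and actual figures vary widely.

```python
# Back-of-envelope estimate of raw storage for a 1-million-person cohort.
# Assumption (not from the article): ~100 GB of raw sequence data per
# whole genome; real-world sizes depend on coverage depth and file format.
GB_PER_GENOME = 100
num_people = 1_000_000

total_gb = GB_PER_GENOME * num_people
total_pb = total_gb / 1_000_000  # 1 PB = 1,000,000 GB (decimal units)
print(f"~{total_pb:.0f} PB of raw data")
# → ~100 PB of raw data
```

Even under these conservative assumptions, the initiative implies storing on the order of a hundred petabytes, which puts the $5 million infrastructure allocation in perspective.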

Part of me is excited about the potential, and I think it probably does take a huge governmental initiative to tackle population health at this scale. But another part of me is concerned about government invading a space that is so personal and private, and I wonder if it could slow progress in developing life-saving therapies in the private sector.

What do you think?  Are you excited or nervous about President Obama’s Precision Medicine Initiative?