Join top executives in San Francisco on July 11-12, to hear how leaders are integrating and optimizing AI investments for success. Learn More
Health care applications of machine learning and AI have been in the news more than usual lately, coinciding with the Healthcare Information and Management Systems Society (HIMSS) conference in Las Vegas. HIMSS is a 45,000+ attendee conference dedicated to health care IT. Surprising no one, AI was a major theme at this year’s event. There was a whole sub-conference focused on ML and AI, plus a ton of AI-focused sessions in the regular conference and a good number of announcements by industry leaders and startups alike.
I’ve only done a couple of health care-focused shows on my podcast so far, but I’m planning to dive into this area more deeply this year. Health care is arguably one of the most promising — not to mention important — areas of AI application. Researchers are making progress across a good many areas, including the following.
Diagnostics

Image-based diagnostics like radiology lend themselves to deep learning. There are large amounts of labeled image data to work from and a degree of uniformity that’s unmatched in many other vision applications. There’s been a raft of research papers on the application of deep learning to radiology and a lot of speculation about AI eventually replacing radiologists, but also strong arguments against this ever happening.
Radiology aside, machine learning and AI have the potential to help doctors make better diagnostic calls. One company that’s been active in this space is the startup Enlitic. The company, which was at one time led by Fast.ai founder and former Kaggle president Jeremy Howard, wants to use deep learning to help make diagnostic calls, manage patient triage and screening programs, and identify high-level population health trends. Google Brain, Google DeepMind, and IBM Watson are all very active in this area as well, among others; indeed, Google Brain recently published interesting research on using deep learning to predict cardiovascular disease from retinal images, as opposed to more invasive blood tests.
Consumer health monitoring

Machine learning is also driving health diagnostics and monitoring into the hands of consumers. Last year Apple unveiled a research study app that uses the Apple Watch’s built-in heart rate monitor to collect data on irregular heartbeats and alert patients who may be experiencing atrial fibrillation. FirstBeat, a supplier to other fitness watch makers, uses machine learning to predict wearers’ stress levels and recovery times. I spoke with Ilkka Korhonen, the company’s vice president of technology, about physiology-based models for fitness and training earlier this year.
Personalized medicine

Personalized, or precision, medicine seeks to tailor medical interventions to the predicted response of the patient based on their genetics or other factors. Applications include selecting the best medicines for each patient and developing custom medications that target pathways based on an individual patient’s genetics. My interview with Brendan Frey of Toronto-based Deep Genomics explored a few of the opportunities in this space. Deep Genomics is working on “biologically accurate” artificial intelligence for developing new therapies.
Electronic health records
The major electronic health record (EHR) vendors — including Allscripts, Athenahealth, Cerner, eClinicalWorks, and Epic — all made announcements at HIMSS about ways they would be incorporating AI into their products. Allscripts announced a partnership with Microsoft to develop an Azure-powered EHR, while Epic unveiled a partnership to integrate Nuance’s AI-powered virtual assistant into the Epic EHR workflow. Trump Administration advisor Jared Kushner even made an appearance advocating for greater EHR interoperability as a step towards applying AI, machine learning, and big data.
Surgery

Researchers are beginning to incorporate AI into the planning and execution of a variety of surgical procedures. Groups have explored a variety of surgical scenarios, including burn care, limb transplants, craniofacial surgery, cancer treatment, and aesthetic (plastic) surgery.
Of course, significant obstacles remain before we see AI fully integrated into health care delivery. Naturally, the bar for releasing new products in health care is much higher than in other industries, since even small mistakes can have life-threatening consequences for patients. Researchers must create more robust techniques, a clear chain of accountability must be established, and there must be a clear justification for why and how physicians and assisting technologies make care decisions.
Improving robustness and performance will require time, a lot of data, and many rounds of testing. Increasing trust will further require new tools and techniques for explaining opaque algorithms like deep learning (the aforementioned Google Brain research using retinal images provides a good example of this).
We won’t see the autonomous robo-doctors of science fiction anytime soon, but machine learning and AI will undoubtedly play a significant role in the experience of health care consumers and providers in the years to come.
This story originally appeared in the This Week in Machine Learning & AI newsletter. Copyright 2018.
Sam Charrington is host of the podcast This Week in Machine Learning & AI (TWiML & AI) and founder of CloudPulse Strategies.