But Will the Algorithms Have Empathy?

How soon will it be before smart machines perform complex, multifaceted services such as looking out for our health?


Every day, we hear about smart machines with new capabilities: computers that can outplay chess masters or are capable of processing natural language to answer increasingly complex questions; new cars that alert us when the driver in front of us hits the brakes, when we drift out of our designated lanes, or when a pedestrian suddenly steps off the curb. But how soon will it be before smart machines perform complex, multifaceted services such as looking out for our health?

In a recent article in The New Yorker, “A.I. Versus M.D.,” Siddhartha Mukherjee, a hematologist and oncologist at Columbia University Medical Center, describes the increasingly nuanced role computers are playing in cancer screening. Twenty years ago, Mukherjee notes, computers were used by diagnosticians to help identify suspicious patterns or waveforms and, later, to confirm a hypothesis. However, he writes, while the rate of biopsies increased, detections didn’t, and there was a jump in false positives.

More recent intelligent systems have used a computing strategy modeled after the brain, known as a “neural network,” which can “learn” how to diagnose illnesses. Mukherjee describes a 2015 study by Sebastian Thrun of Stanford University in which a smart machine was asked to classify 14,000 images that dermatologists had found to have abnormalities (either benign or cancerous). The system correctly diagnosed the problems 72% of the time, compared with 66% for two board-certified dermatologists. Then, in a related study, 21 dermatologists were asked to review a set of about 2,000 images for skin cancers. In all but a few cases, the machine did better at spotting cases of melanoma than the doctors did; what’s more, for reasons that aren’t clear, it learned to differentiate moles from cancers.
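
For readers curious about what "learning" means mechanically here, the sketch below shows, in simplified form, how such an image classifier is typically built with a modern deep learning library. It is a generic illustration, not the system from the Stanford study; the network architecture, image size, and training details are assumptions chosen for demonstration.

```python
# A minimal, illustrative sketch (not Thrun's actual system) of a neural network
# that "learns" to label skin-lesion images as benign or malignant from examples.
# The layer sizes, optimizer, and training step below are generic choices.

import torch
import torch.nn as nn

class LesionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Stacked convolutions extract visual features (edges, textures, shapes).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A linear "head" turns those features into two scores: benign vs. malignant.
        self.head = nn.Linear(32 * 16 * 16, 2)

    def forward(self, x):  # x: a batch of 64x64 RGB images
        f = self.features(x)
        return self.head(f.flatten(1))

model = LesionClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a hypothetical batch of labeled images.
# In practice, the labels would come from biopsy-confirmed diagnoses.
images = torch.randn(8, 3, 64, 64)   # stand-in for 8 dermatology photos
labels = torch.randint(0, 2, (8,))   # 0 = benign, 1 = malignant

logits = model(images)
loss = loss_fn(logits, labels)
optimizer.zero_grad()
loss.backward()    # "learning": adjust the network's weights to reduce error
optimizer.step()
```

Shown many thousands of labeled examples, this same loop gradually tunes the network's internal weights until its scores track the expert labels; that, in essence, is all the "learning" in such systems amounts to.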

Applying similar capabilities to detect other illnesses early and accurately may not be far away. By monitoring a person's speech patterns through a cellphone, for example, it may be possible to detect early signs of Alzheimer's disease. Steering wheels with sensors that detect hesitations and tremors might identify potential cases of Parkinson's disease. Similarly, researchers say, algorithms tracking patients' heartbeats may identify cardiac issues before they show up in other ways. And patients concerned about skin lesions will be able to send images from their iPhones to robots, which over time will become more and more skilled at diagnosis.


Comment (1)
Kyle Tran
What does this article have to do with AI's capacity for empathy? Doctors do not spend as much time with patients as they used to. A 2017 survey indicates that about 56% of primary care doctors spend less than 16 minutes with each patient, and that figure says nothing about how much of the visit is spent actually communicating with the patient.
The amount of "empathy" patients get in the hospital from nurses and doctors is even worse. The people who spend the most time with patients are the direct care nurses, who are unfortunately paid the worst. So don't talk to me about empathy in health care; talk about its rampant fraud, unnecessary procedures, and overbilling, and about ways for the average patient to find cost-effective treatment instead.
I can assure you, the insurance industry will be the first to impose AI on the health care system, and it may have the right to.