Artificial Intelligence to Reduce Medical Errors

Originally published by Michael Hunter, MD, in the Artificial Intelligence publication on Medium


Artificial intelligence protocols to reduce medical errors

Artificial intelligence can save lives. Let’s look at a study recently published in the journal Nature. Stanford researchers report AI and sensor-based “ambient intelligence” protocols to prevent medical errors and improve outcomes. These systems can:

  • alert clinicians and visitors when they fail to sanitize their hands before entering a hospital room;
  • monitor the elderly for behavioral clues of impending health crises;
  • prompt caregivers, remote clinicians, and patients to make life-saving interventions.

By building technologies into health care delivery spaces, we can reduce the rate of fatal errors. As bedside care becomes ever more complex, such an approach will become increasingly valuable. The study authors’ examples include attempting to reduce the number of patients in the intensive care unit who experience nosocomial (originating in the hospital) infections.


In one experiment, researchers placed a tablet near the door that displayed a solid green screen, transitioning to red when a hygiene failure occurred. I am also intrigued by a thermal sensor above a hospital bed that could detect writhing or twitching and alert clinical team members.
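The door-tablet behavior amounts to a simple rule: green by default, red if the person entering has no recent sanitizing event. A minimal sketch of that logic, assuming an illustrative 60-second compliance window (the function name and threshold are my own, not from the study):

```python
GREEN, RED = "green", "red"

def screen_color(entry_time, last_sanitize_time, window_seconds=60):
    """Return the tablet color for a room entry.

    A sanitizing event within `window_seconds` before entry counts
    as compliant (green); anything older, or no event at all, turns
    the screen red.
    """
    if last_sanitize_time is not None and 0 <= entry_time - last_sanitize_time <= window_seconds:
        return GREEN
    return RED
```

In a real deployment, the entry and sanitizing events would come from the room’s sensors rather than being passed in directly.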

Could sensors send an alert to a clinician when frail older individuals begin to move more slowly or stop regular eating?

Medical care is becoming extraordinarily complex. One of the study investigators observes that clinicians in a neonatal ICU performed 600 bedside actions per patient each day. Even the best of us caregivers are at risk of committing an error without the aid of technology tools to reduce risk.

More ambient intelligence uses

Next, we turn to more examples of intuitive assistants. You probably know about Alexa, but nascent ambient products are a next-generation approach.

  1. A company called Neura builds ambient apps that learn the daily habits and medical needs of older adults. The Neura products can remind you which medicine to take at a given time. They communicate with wearable devices such as rings or watches and even monitor vital signs to gauge a medicine’s efficacy.
  2. MirageTable is an augmented reality curved table that allows for a virtual 3D model, interactive gaming with real and virtual things, and a 3D teleconferencing experience.
  3. SmartSofa is a sofa equipped with a variety of sensors. Force-sensitive resistors and load sensors, connected to an Arduino microcontroller, are embedded in the sofa’s back and under its bottom pillows. These sensors detect the user’s presence in the room, as well as posture (leaning forward or backward) and position (left, right, middle, etc.). Two sensors in the sofa’s side arms provide an invisible input control, letting individuals manipulate the interactive environment through mid-air hand gestures.
  4. Startup AIsense is using ambient intelligence to create voice technology that transcribes conversations. A device records all conference participants’ voices and makes a full transcript after the meeting. Users can then perform an in-depth search of the material.
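To make the SmartSofa idea (item 3 above) concrete, here is a hedged sketch of how presence, position, and posture might be inferred from raw pressure readings. The sensor layout, names, and thresholds are illustrative assumptions on my part, not the actual SmartSofa firmware:

```python
def classify_sitter(seat_left, seat_mid, seat_right, back_pressure,
                    presence_threshold=50):
    """Infer presence, seat position, and posture from pressure readings.

    All inputs are raw force-sensitive-resistor values; the threshold
    is an assumed calibration constant, not a published figure.
    """
    total_seat = seat_left + seat_mid + seat_right
    if total_seat < presence_threshold:
        return {"present": False, "position": None, "posture": None}

    # Position: whichever seat zone carries the most weight.
    position = max(
        [("left", seat_left), ("middle", seat_mid), ("right", seat_right)],
        key=lambda zone: zone[1],
    )[0]

    # Posture: little pressure on the backrest suggests leaning forward.
    posture = "leaning back" if back_pressure > presence_threshold else "leaning forward"
    return {"present": True, "position": position, "posture": posture}
```

The real system would presumably smooth readings over time before classifying, but the core logic is a comparison like this.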

The perils of ambient intelligence

I would be remiss if I did not pair my excitement about ambient intelligence with its perils. With the proliferation of ambient applications in our lives, we need to prioritize privacy and data sharing. These ambient intelligence (AmI) tools will often invade our lives silently.

In the health arena, are the ethical guidelines laid out in the Hippocratic Oath nearly 2,500 years ago about to collide with artificial intelligence (AI)? You can learn more about medical ethics in a well-written Forbes piece on the subject.

Thank you for joining me for this brief piece about AmI. I am curious what you think about ambient intelligence, both the promise and the peril. I’m Dr. Michael Hunter.

References

Benko, H., Jota, R., & Wilson, A. (2012). MirageTable: Freehand interaction on a projected augmented reality tabletop. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 199–208). Austin, TX, USA, 5–10 May 2012. New York, NY, USA: ACM.