How Robots are Diagnosing Disorders

When we talk to someone, we don’t always stop to scrutinise the length of pauses between words, the ratio of pronouns to proper nouns used, or the subtle variations in frequency and amplitude.

A two-foot-tall humanoid robot called Ludwig, however, can analyse these individual aspects of speech to instantly predict whether a speaker is suffering from Alzheimer’s disease, to an accuracy of 82 percent - and rising.

Algorithmic diagnoses

Using complex AI algorithms trained on databases of speech patterns via deep learning, Winterlight Labs’ Ludwig can initiate a conversation with a patient and process their responses to predict the severity of the disease.

While healthy patients link words into sentences with very few pauses and use specific descriptions of people and things, Alzheimer’s sufferers leave longer pauses and replace nouns with pronouns more frequently as they progress through the disease.
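The two linguistic markers described above - pause length and the substitution of pronouns for nouns - are straightforward to quantify from a timestamped transcript. The following is a minimal, hypothetical sketch of that idea; the function names, the toy transcript and the tiny pronoun list are illustrative, not Winterlight Labs’ actual feature set.

```python
# Hypothetical sketch: two of the speech markers described above,
# computed from a toy word-level transcript. Real systems use far
# richer feature sets and trained models.

PRONOUNS = {"he", "she", "it", "they", "this", "that", "something"}

def speech_markers(words):
    """words: list of (token, start_sec, end_sec) tuples, in order."""
    # Mean pause length between consecutive words: longer pauses
    # are one of the markers associated with disease progression.
    pauses = [b[1] - a[2] for a, b in zip(words, words[1:])]
    mean_pause = sum(pauses) / len(pauses) if pauses else 0.0

    # Ratio of pronouns to all tokens: rises as specific nouns
    # are replaced with vaguer stand-ins.
    tokens = [w[0].lower() for w in words]
    pronoun_ratio = sum(t in PRONOUNS for t in tokens) / len(tokens)
    return mean_pause, pronoun_ratio

transcript = [
    ("she", 0.0, 0.3), ("put", 0.9, 1.1), ("it", 2.0, 2.2),
    ("on", 2.3, 2.4), ("that", 3.1, 3.4), ("thing", 3.5, 3.8),
]
mean_pause, pronoun_ratio = speech_markers(transcript)
```

In a full pipeline, scores like these would be fed into a trained classifier rather than compared against fixed thresholds.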

While human physicians may be able to pick up on these differences, Ludwig goes one step further, analysing tiny variations in pitch, frequency and volume as the conversation plays out. While to the human ear these differences are all but imperceptible, adding subtly to the general feeling of the conversation, to an objective AI bot they are easily quantifiable.
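One way to make a "volume variation" of this kind quantifiable is to measure how much the signal’s loudness drifts from one short frame to the next. The sketch below is a hypothetical illustration of that principle only - the frame size and function names are assumptions, not Winterlight Labs’ pipeline.

```python
# Hypothetical sketch: quantify amplitude micro-variation that a
# listener would not consciously register, by measuring the spread
# of frame-by-frame RMS loudness across an audio signal.
import math

def rms_per_frame(samples, frame_len=160):
    """Root-mean-square amplitude of each fixed-length frame
    (160 samples = 20 ms at an 8 kHz sampling rate)."""
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len + 1, frame_len)]
    return [math.sqrt(sum(s * s for s in f) / len(f)) for f in frames]

def amplitude_variability(samples):
    """Standard deviation of frame RMS: one crude score for how
    much the speaker's volume fluctuates over time."""
    rms = rms_per_frame(samples)
    mean = sum(rms) / len(rms)
    return math.sqrt(sum((r - mean) ** 2 for r in rms) / len(rms))

# A steady tone versus one whose loudness slowly wobbles: the
# wobbly signal scores higher, even though both would sound like
# the same note to a casual listener.
steady = [math.sin(2 * math.pi * 220 * t / 8000) for t in range(8000)]
wobbly = [(1 + 0.3 * math.sin(2 * math.pi * 3 * t / 8000))
          * math.sin(2 * math.pi * 220 * t / 8000) for t in range(8000)]
```

Comparable statistics can be computed for pitch, giving the kind of objective, repeatable measurements the article describes.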

The future for healthcare AI

While there are complex regulatory issues to overcome before AI can diagnose new patients, Ludwig’s speech analysis technology is already being piloted in retirement homes across North America, Scotland and France. There it assists physicians in pinpointing the severity and progression of the disease, while helping the machines learn a range of languages, dialects and accents.

In the future, researchers hope to use the technology to detect and predict emotions, diagnosing affective disorders such as depression and anxiety. And Ludwig is just one remarkable step in an industry predicted to be worth £5 billion globally by 2021.

This year alone, British digital healthcare company Babylon raised nearly £50 million to build an autonomous AI diagnosis system, which could hypothetically assess symptoms and deliver diagnoses without input from a human physician. While still in its early stages, diagnostic AI could free up doctors’ time and resources for more patient-centric interactions, such as discussing potential therapies and assisting in rehabilitation.

A question of uptake

While many patients may consider AI diagnoses impersonal and uncomfortable in principle, we already rely heavily on algorithmic processing of personal metrics. Smartphone apps and fitness trackers tap into our blood pressure, heart rate, sleep habits and even fertility cycles on a daily basis.

By analysing that same data, increasingly accurate AI such as Ludwig could help researchers and physicians to pinpoint diseases and disorders far sooner, allowing for more effective and individual treatment.

Discover how connected hospitals are changing the way we provide healthcare, and discuss the latest in patient-centric tech with our mdtechnologies experts.