While we are inching closer to a blood test that can predict Alzheimer’s, there is currently no definitive way to spot the disease. Rather than looking inside the body for clues, one group of researchers has turned to speech patterns instead. They say their method is 95% accurate.
“There is an urgent need for simple, inexpensive, non-invasive and easily available diagnostic tools for Alzheimer’s,” said Maria C. Carrillo, Ph.D., Alzheimer’s Association chief science officer. “The possibility of early detection and being able to intervene with treatment before significant damage to the brain from Alzheimer’s disease would be game-changing for individuals, families, and our healthcare system.”
Focused on this critical need, researchers at Stevens Institute of Technology in New Jersey developed an artificial intelligence program that could analyze speech patterns, predict when those patterns might point to the future development of Alzheimer’s, and even explain its conclusions.
When someone is in the early stages of Alzheimer’s, their speech patterns can change ever so slightly.
“People with Alzheimer’s typically replace nouns with pronouns, such as by saying ‘He sat on it’ rather than ‘The boy sat on the chair,’” said a statement about the research. “Patients might also use awkward circumlocutions, saying ‘My stomach feels bad because I haven’t eaten’ instead of simply ‘I’m hungry.’”
By converting participants’ descriptions of a picture into numerical values, the researchers trained their artificial intelligence program to distinguish healthy speech patterns from those tied to early Alzheimer’s. While the AI can’t reveal Alzheimer’s years in advance, it was very good at spotting the disease’s earliest stages and could therefore serve as a useful early warning system.
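To make the idea concrete, here is a minimal sketch of the general approach described above: turning a transcript into numeric values and applying a decision rule. Everything in it — the function names, the pronoun list, and the cutoff — is hypothetical and purely illustrative; the actual Stevens system uses a far more sophisticated trained model.

```python
# Hypothetical sketch: convert a picture-description transcript into
# simple numeric features, then apply a toy decision rule. Illustrative
# only -- not the researchers' actual model.

PRONOUNS = {"he", "she", "it", "they", "him", "her", "them", "this", "that"}

def speech_features(transcript: str) -> dict:
    """Convert a transcript into numeric values."""
    words = transcript.lower().split()
    if not words:
        return {"pronoun_ratio": 0.0, "type_token_ratio": 0.0}
    pronoun_count = sum(1 for w in words if w in PRONOUNS)
    return {
        # Heavy pronoun use is one linguistic marker noted in the article.
        "pronoun_ratio": pronoun_count / len(words),
        # Vocabulary richness: distinct words over total words.
        "type_token_ratio": len(set(words)) / len(words),
    }

def flag_for_review(transcript: str, pronoun_cutoff: float = 0.25) -> bool:
    """Toy threshold rule standing in for the trained classifier."""
    return speech_features(transcript)["pronoun_ratio"] > pronoun_cutoff

print(flag_for_review("he sat on it and then he took it"))       # pronoun-heavy
print(flag_for_review("the boy sat on the chair by the window"))  # noun-rich
```

A real system would learn its decision boundary from labeled recordings rather than using a fixed cutoff; the sketch only shows how speech can become numbers that a model can act on.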
The researchers say their system is designed to grow: as new behavioral markers of Alzheimer’s are discovered, they can be added to the AI architecture. The system could also be expanded to analyze not only speech patterns but other forms of communication as well, including social media posts and emails.
The tool’s creator, K.P. Subbalakshmi, founding director of the Stevens Institute of Artificial Intelligence and professor of electrical and computer engineering at the Charles V. Schaefer, Jr. School of Engineering, is now hoping to expand the AI system to other languages and perhaps to other neurological conditions, such as stroke or depression.
“This method is definitely generalizable to other diseases,” she said. “As we acquire more and better data, we’ll be able to create streamlined, accurate diagnostic tools for many other illnesses too.”
The work was presented earlier this month by Subbalakshmi, along with her doctoral students Mingxuan Chen and Ning Wang, at the 19th International Workshop on Data Mining in Bioinformatics (BioKDD).