This article is part of a limited series about the potential of artificial intelligence to solve everyday problems.
Imagine a test as quick and easy as a temperature or blood pressure measurement that could reliably identify an anxiety disorder or predict impending depressive relapse.
Health care providers have many tools to measure a patient’s physical condition, but no reliable biomarkers (objective indicators of medical states observed from outside the patient) to assess mental health.
But some AI researchers now believe that the sound of your voice could be the key to understanding your mental state, and that AI is perfectly suited to detect such changes, which can be difficult, if not impossible, to perceive otherwise. The result is a suite of apps and online tools designed to track your state of mind, as well as programs that provide real-time mental health assessments to telehealth providers and call centers.
Psychologists have long known that certain mental health problems can be detected by listening not only to what a person says but how they say it, said Maria Espinola, a psychologist and assistant professor at the University of Cincinnati School of Medicine.
With depressed patients, Dr. Espinola said, “their speech is generally more monotonous, flatter and softer. They also have a reduced pitch range and lower volume. They pause more. They stop more often.”
Patients with anxiety feel more tension in their bodies, which can also change the way their voices sound, she said. “They tend to speak faster. They have more difficulty breathing.”
Today, machine learning researchers are harnessing these kinds of vocal features to predict depression and anxiety, as well as other mental illnesses like schizophrenia and post-traumatic stress disorder. Deep learning algorithms can uncover additional patterns and features, captured in short voice recordings, that may not be evident even to trained experts.
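As a concrete, if simplified, illustration: the kinds of markers Dr. Espinola describes (pitch range, loudness, pausing) can be computed from a short recording with open-source audio tools. The sketch below assumes the Python library librosa and a hypothetical file name; it shows the flavor of the feature extraction involved, not any company's actual pipeline.

```python
# A minimal, illustrative sketch: extract a few clinically suggestive
# acoustic features from a short voice recording using librosa.
# The file path and thresholds are hypothetical, not a vendor's pipeline.
import numpy as np
import librosa

y, sr = librosa.load("voice_journal.wav", sr=None)  # hypothetical 30-second clip

# Pitch track (fundamental frequency) via the pYIN estimator;
# unvoiced frames come back as NaN, so use NaN-aware statistics.
f0, voiced, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)
pitch_range = np.nanmax(f0) - np.nanmin(f0)   # a reduced range suggests flatter speech

# Loudness: root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]
mean_loudness = float(rms.mean())             # softer speech -> lower values

# Pausing: fraction of the clip that falls below a silence threshold.
speech_intervals = librosa.effects.split(y, top_db=30)
speech_samples = sum(end - start for start, end in speech_intervals)
pause_ratio = 1 - speech_samples / len(y)     # more pausing -> higher ratio

print(f"pitch range: {pitch_range:.1f} Hz, "
      f"loudness: {mean_loudness:.4f}, pause ratio: {pause_ratio:.2f}")
```

A real system would feed dozens of such features, or the raw audio itself, into a trained model rather than reading them off directly.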
“The technology we’re using now can extract features that may be meaningful that not even the human ear can pick up,” said Kate Bentley, an assistant professor at Harvard Medical School and a clinical psychologist at Massachusetts General Hospital.
“There is a lot of excitement about finding biological or more objective indicators of psychiatric diagnoses that go beyond the more subjective forms of assessment that are traditionally used, such as physician-rated interviews or self-report measures,” she said. Other clues researchers are tracking include changes in activity levels, sleep patterns and social media data.
These technological advances come at a time when the need for mental health care is particularly acute: According to a report by the National Alliance on Mental Illness, one in five adults in the United States experienced mental illness in 2020. And the numbers continue to rise.
Although AI technology can’t address the shortage of qualified mental health care providers (there aren’t enough to meet the country’s needs, Dr. Bentley said), there is hope that it can lower the barriers to receiving a correct diagnosis, help clinicians identify patients who may be hesitant to seek care, and facilitate self-monitoring between visits.
“A lot can happen between appointments, and technology can really offer us the potential to improve monitoring and evaluation on a more ongoing basis,” said Dr. Bentley.
To test this new technology, I started by downloading the Mental Fitness app from Sonde Health, a health technology company, to see if my feelings of discomfort were a sign of something serious or if I was just languishing. Described as “a voice-powered mental fitness tracking and logging product,” the free app invited me to record my first entry, a 30-second verbal journal, which would rate my mental health on a scale of 1 to 100.
A minute later I got my score: a not-so-good 52. “Pay attention,” the app warned.
The app flagged the level of liveliness detected in my voice as remarkably low. Did I sound monotonous simply because I had been trying to keep my voice down? Should I heed the app’s suggestions to improve my mental state by going for a walk or clearing my space? (The first question might point to one of the app’s possible flaws: As a consumer, it can be hard to know why your vocal levels fluctuate.)
Later, feeling nervous between interviews, I tried another voice analysis program, this one focused on detecting anxiety levels. The Stress Wave Test is a free online tool from the health care and insurance conglomerate Cigna, developed in collaboration with the artificial intelligence specialist Ellipsis Health, that assesses stress levels from 60-second recorded voice samples.
“What keeps you up at night?” the website prompted. After I spent a minute recounting my nagging concerns, the program scored my recording and sent me its verdict by email: “Your stress level is moderate.” Unlike the Sonde app, Cigna’s email did not offer helpful self-improvement tips.
Other technologies add a potentially useful layer of human interaction, such as Kintsugi, a company based in Berkeley, California, that raised $20 million in Series A funding earlier this month. Kintsugi is named after the Japanese practice of mending broken pottery with veins of gold.
Founded by Grace Chang and Rima Seiilova-Olson, who bonded over a shared experience of struggling to access mental health care, Kintsugi develops technology for telehealth providers and call centers that can help them identify patients who could benefit from more support.
With Kintsugi’s voice analysis program, a nurse, for example, might be prompted to take an extra minute to ask a harried parent with a colicky baby about her own well-being.
One concern with the development of these kinds of machine learning technologies is the issue of bias: ensuring that programs work equitably for all patients, regardless of age, gender, ethnicity, nationality, and other demographics.
“For machine learning models to work well, you really need to have a very large, diverse, robust data set,” Ms. Chang said, noting that Kintsugi used voice recordings from around the world, in many different languages, to guard against this particular problem.
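As a rough sketch of how such a check might look in practice: one common approach is to score the model separately for each demographic subgroup and compare the results. The example below assumes scikit-learn; the data, group labels and risk scores are entirely synthetic, not Kintsugi's method.

```python
# A hypothetical bias check: compare a screening model's accuracy (ROC AUC)
# across demographic subgroups. All data here is synthetic and illustrative.
import numpy as np
from sklearn.metrics import roc_auc_score

def auc_by_group(y_true, y_score, groups):
    """Return ROC AUC computed separately for each subgroup."""
    return {
        g: roc_auc_score(y_true[groups == g], y_score[groups == g])
        for g in np.unique(groups)
    }

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=200)  # 1 = clinician-confirmed diagnosis
y_score = np.clip(y_true * 0.3 + rng.normal(0.5, 0.25, 200), 0, 1)
groups = np.array(["english"] * 100 + ["spanish"] * 100)

print(auc_by_group(y_true, y_score, groups))
# A large gap between groups would signal that the model under-serves one of them.
```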
Another major concern in this nascent field is privacy, particularly of voice data, which can be used to identify individuals, Dr. Bentley said.
And even when patients agree to be recorded, the question of consent is sometimes twofold: in addition to assessing a patient’s mental health, some speech analysis programs use the recordings to develop and refine their own algorithms.
Another challenge, Dr. Bentley said, is potential consumer mistrust of machine learning and so-called black-box algorithms, which work in ways that even developers themselves can’t fully explain, particularly what features they use to make predictions.
“There is algorithm creation and algorithm understanding,” said Dr. Alexander S. Young, interim director of the Semel Institute for Neuroscience and Human Behavior and chair of psychiatry at the University of California, Los Angeles, echoing concerns that many researchers have about AI and machine learning in general: that little or no human supervision is present during the training phase of the program.
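One widely used, model-agnostic way to probe such a black box is permutation importance: shuffle one input feature at a time and measure how much the model's accuracy drops. Here is a minimal sketch, assuming scikit-learn; the acoustic feature names and the synthetic data are purely illustrative.

```python
# A hypothetical peek inside a "black box": permutation importance measures
# how much accuracy falls when each input feature is randomly shuffled.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
features = ["pitch_variability", "speech_rate", "pause_ratio", "loudness"]

# Synthetic stand-in data: 300 recordings x 4 acoustic features,
# with the label driven mostly by pausing and pitch variability.
X = rng.normal(size=(300, 4))
y = (X[:, 2] + 0.5 * X[:, 0] + rng.normal(scale=0.5, size=300) > 0).astype(int)

model = RandomForestClassifier(random_state=0).fit(X, y)
# For a sketch we score on the training data; a real audit would hold data out.
result = permutation_importance(model, X, y, n_repeats=20, random_state=0)

for name, score in zip(features, result.importances_mean):
    print(f"{name}: {score:.3f}")  # higher = the model leans on this feature more
```

Techniques like this explain which inputs a model relies on, though not why those inputs relate to the outcome, which is part of what keeps researchers cautious.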
For now, Dr. Young remains cautiously optimistic about the potential of voice analysis technologies, especially as tools for patients to monitor themselves.
“I think you can model people’s mental health status or approximate their mental health status in general,” he said. “People like being able to self-monitor their conditions, particularly with chronic illnesses.”
But before automated speech analytics technologies come into widespread use, some are calling for rigorous research into their accuracy.
“We really need more validation of not only voice technology, but also machine learning and artificial intelligence models based on other data streams,” said Dr. Bentley. “And we need to get that validation from large-scale, well-designed representative studies.”
Until then, AI-powered speech analytics technology remains a promising but unproven tool that may eventually be an everyday method of taking the temperature of our mental well-being.