The USC Signal Analysis and Interpretation Lab (SAIL), in a new collaboration with UCLA, has found that artificial intelligence (AI) can detect changes in a patient's clinical state from their speech as accurately as assessments provided by their physicians.
SAIL partnered with researchers at UCLA to analyse voice data from patients being treated for serious mental illnesses, including bipolar disorder, schizophrenia and major depressive disorder. These individuals and their treating clinicians used the MyCoachConnect interactive voice and mobile tool, created and hosted on the Chorus platform at UCLA, to record voice diaries about their mental health states. The SAIL team then applied custom AI software to hundreds of these voicemails to detect changes in patients’ clinical states, and the AI’s assessments matched clinicians’ ratings of their patients.
“Machine learning allowed us to illuminate the various clinically meaningful dimensions of language use and vocal patterns of the patients over time, personalised at the individual level,” said senior author Dr Shri Narayanan, Niki and Max Nikias Chair in Engineering and Director of SAIL at the USC Viterbi School of Engineering.
Tracking changes in clinical states is important, say the researchers, because a change indicating that a patient's condition has improved or worsened may warrant an adjustment in treatment.
The next step in this joint research is to scale up this individualised approach to a larger population over longer periods of observation.