Digital Voice Research in the MGH FTD Unit
Digital recordings of people’s speech (“digital voice”) have emerged as a potentially valuable source of information about patients with brain conditions of various types: not only the characteristics of their speech and language, but also other cognitive functions such as memory, executive function, and even visuospatial skills. We have been digitally recording speech samples in the MGH FTD Unit since 2008; initially, analyzing those recordings or their transcriptions required laborious manual annotation. In recent years, increasingly powerful computational speech and language analysis tools have been developed, and we have been applying them to our database to obtain measures never before possible. This has led to a variety of new insights into Primary Progressive Aphasia and its major variants, as well as other forms of Frontotemporal Degeneration and Alzheimer’s Disease. A brief summary of peer-reviewed publications is listed below with links to the scientific manuscripts. Feel free to reach out to us at MGHFTDUnit@mgh.harvard.edu if you have comments or questions.
2017
Cordella C, Dickerson BC, Quimby M, Yunusova Y, Green JR. Slowed articulation rate is a sensitive diagnostic marker for identifying non-fluent primary progressive aphasia. Aphasiology. 2017;31(2):241-260.
This study looked at how different speech timing features, like how fast someone talks and how often they pause, can help identify subtypes of primary progressive aphasia (PPA). The goal was to see which speech measures best distinguish the non-fluent type (nfvPPA) from the more fluent types (lvPPA and svPPA), and how these compare to expert ratings by clinicians. Using speech recordings from 38 people with PPA and 8 healthy individuals, we found that the speed of speech—especially how fast people speak without including pauses (called articulation rate)—was the most accurate way to identify the non-fluent type. In fact, these speech measures outperformed expert ratings in correctly classifying PPA subtypes. The results suggest that automatic speech analysis, particularly articulation rate, could be a more reliable tool to help diagnose and subtype PPA, especially when trying to detect problems with speech motor control in the non-fluent group.
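For readers curious what "articulation rate" means in practice: it is speaking rate computed over phonation time only, with pauses excluded. The function below is a minimal illustrative sketch assuming a time-aligned transcript (word timings and syllable counts are made-up inputs, not the study's actual pipeline):

```python
def articulation_rate(words):
    """Estimate articulation rate (syllables/second) from time-aligned words.

    words: list of (syllable_count, start_sec, end_sec) tuples.
    Summing only each word's own duration excludes inter-word pauses;
    overall speech rate, by contrast, would divide by total elapsed time.
    """
    syllables = sum(s for s, _, _ in words)
    speaking_time = sum(end - start for _, start, end in words)
    return syllables / speaking_time if speaking_time > 0 else 0.0

# Toy example: three words, 5 syllables, 2.0 s of phonation time
sample = [(2, 0.0, 0.8), (1, 1.5, 1.9), (2, 2.4, 3.2)]
rate = articulation_rate(sample)  # 5 syllables / 2.0 s = 2.5 syll/s
```

Because pauses are excluded, this measure isolates how quickly the speech muscles are moving, which is why it is sensitive to the motor-speech impairment of nfvPPA.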
2019
Cordella C, Quimby M, Touroutoglou A, Brickhouse M, Dickerson BC, Green JR. Quantification of motor speech impairment and its anatomic basis in primary progressive aphasia. Neurology. 2019 Apr 23;92(17):e1992-e2004.
We followed up on our prior study of articulation rate in patients with PPA to see how it relates to changes in the brain measured by MRI and how it changes over time. We studied 64 people with PPA, including 39 who were followed for a year. We found that people with the nonfluent type of PPA (nfvPPA) spoke more slowly than those with other types, even early in the disease. Over a year, this group also showed the biggest drop in speech speed. Brain scans showed that slower speech was linked to shrinkage in parts of the brain known to control movements of the mouth and tongue. The results suggest that tracking how fast someone talks is a useful and reliable way to detect early speech problems, monitor changes over time, and understand what parts of the brain are affected in PPA.
2021
Miller HE, Cordella C, Collins JA, Ezzo R, Quimby M, Hochberg D, Tourville JA, Dickerson BC, Guenther FH. Neural substrates of verbal repetition deficits in primary progressive aphasia. Brain Commun. 2021 Feb 16;3(1):fcab015.
This study examined how shrinkage in certain parts of the brain, measured with MRI, relates to problems with verbal repetition in people with PPA. We used brain scans from 42 people with PPA and tested their ability to repeat words and numbers. We found that poorer performance on these repetition tasks was linked to shrinkage in two specific brain areas on the left side: the supramarginal gyrus and the posterior inferior frontal sulcus. These results support earlier theories about how the brain handles short-term memory for speech.
2022
Rowe HP, Gochyyev P, Lammert AC, Lowit A, Spencer KA, Dickerson BC, Berry JD, Green JR. The efficacy of acoustic-based articulatory phenotyping for characterizing and classifying four divergent neurodegenerative diseases using sequential motion rates. J Neural Transm. 2022 Dec;129(12):1487-1511.
Although brain diseases often affect speech, we still don’t fully understand how to objectively measure these changes. This study tested whether a detailed analysis of speech patterns could help identify different neurological diseases and track how they progress. We recorded speech from people with various brain conditions—ALS, ataxia, Parkinson’s disease, and nonfluent PPA with apraxia of speech—and compared them to healthy individuals. We analyzed the speech using five key areas of motor control: coordination, consistency, speed, precision, and overall speaking rate. The results showed that each disease had a unique speech “fingerprint,” and these patterns could accurately tell most diseases apart—though ALS was harder to classify. These findings suggest that analyzing speech in this detailed way could help doctors diagnose different neurological diseases more accurately and monitor changes over time or responses to treatment.
Cordella C, Gutz SE, Eshghi M, Stipancic KL, Schliep M, Dickerson BC, Green JR. Acoustic and Kinematic Assessment of Motor Speech Impairment in Patients With Suspected Four-Repeat Tauopathies. J Speech Lang Hear Res. 2022 Nov 17;65(11):4112-4132.
This study investigated how speech changes in people with certain brain diseases usually linked to a protein called tau (called 4-repeat-tau or 4RT syndromes), including nonfluent PPA, primary progressive apraxia of speech (PPAOS), corticobasal syndrome (CBS), and progressive supranuclear palsy (PSP). The goal was to see whether people had a speech disorder called apraxia of speech (difficulty planning speech movements) or dysarthria (muscle weakness affecting speech), and how severe it was. We tested 20 patients and 10 healthy people by recording their speech and analyzing both the sounds and movements involved. We grouped speech features into three categories: signs of apraxia, signs of dysarthria, and signs shared by both. Two speech experts also independently rated the participants’ speech without knowing their diagnoses. We found that the computerized speech measures matched well with expert opinions and could tell apart different types of speech problems. The study showed that even within the same disease group, people can have different kinds and severities of speech issues. These speech tests may be helpful for diagnosing and tracking speech problems in people with these tau-related brain diseases, which may be valuable as new treatments are tested in clinical trials.
Rezaii N, Mahowald K, Ryskin R, Dickerson B, Gibson E. A syntax-lexicon trade-off in language production. Proc Natl Acad Sci U S A. 2022 Jun 21;119(25):e2120203119.
When we speak, we choose both the words and sentence structures to express our thoughts. This study looked at how people with PPA use words and grammar in their speech. We found that people who struggle with grammar tend to use more detailed words to get their message across, while those who have trouble with word meaning use more complex grammar. This pattern, known as a “trade-off” between word choice and sentence structure, was also seen in healthy people describing pictures, suggesting it’s a normal part of how we turn ideas into spoken language. This research on patients with language disorders thus identified a previously unreported property of healthy human speech.
2023
Rezaii N, Michaelov J, Josephy-Hernandez S, Ren B, Hochberg D, Quimby M, Dickerson BC. Measuring Sentence Information via Surprisal: Theoretical and Clinical Implications in Nonfluent Aphasia. Ann Neurol. 2023 Oct;94(4):647-657.
People with nonfluent aphasia typically speak in short, simple sentences and use fewer verbs and connecting words. Traditionally, these speech patterns have been seen as signs of a core problem with grammar. But this study offers a new perspective: instead of just being a weakness, some of these word choices may be a way for the brain to compensate for the difficulties. We looked at how much information patients with nonfluent aphasia packed into their sentences compared to healthy people. We found that, despite using simpler grammar, patients used more meaningful and specific words, keeping the overall information content of their speech just as high. This strategy is called lexical condensation. The results suggest that some features of nonfluent speech might actually be helpful adaptations—not just symptoms of damage. This could change how speech therapy is designed for people with this condition.
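For the technically inclined: "surprisal" quantifies how much information a word carries as the negative log probability of that word given its context, and a sentence's information content is the sum over its words. The toy bigram sketch below illustrates the arithmetic only; the probabilities are invented, and the study used a trained language model rather than a hand-built table:

```python
import math

# Invented bigram probabilities for illustration; "<s>" marks sentence start.
BIGRAM_P = {
    ("<s>", "the"): 0.5, ("the", "dog"): 0.1,
    ("dog", "barked"): 0.3, ("<s>", "dog"): 0.05,
}

def sentence_surprisal(words, fallback_p=0.01):
    """Total information in bits: sum of -log2 P(word | previous word)."""
    total = 0.0
    prev = "<s>"
    for w in words:
        p = BIGRAM_P.get((prev, w), fallback_p)
        total += -math.log2(p)
        prev = w
    return total

# A predictable word carries few bits; an unexpected one carries many,
# so a short sentence of specific, low-probability words can match the
# total information of a longer, more predictable one.
```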
Rezaii N, Ren B, Quimby M, Hochberg D, Dickerson BC. Less is more in language production: an information-theoretic analysis of agrammatism in primary progressive aphasia. Brain Commun. 2023 Apr 25;5(3):fcad136.
Agrammatism is a language problem where people speak in short, simple sentences and tend to use more nouns than verbs, often skipping small connecting words. Although these speech patterns have been studied for years, experts haven’t agreed on why they happen. This study suggests that people with agrammatism may be choosing more unusual, less commonly used words to pack more meaning into their short sentences—a smart way of compensating for their trouble with grammar. We analyzed speech from 100 people with different types of PPA and compared it to speech from 65 healthy people describing pictures. We found that the words preferred by people with agrammatism tend to be less frequent but carry more information. When healthy people were asked to speak in short sentences, they showed a similar speech pattern—using more nouns, fewer connecting words, and more specific verbs—just like those with agrammatism.
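The link between a word's frequency and the information it carries can be made concrete: a word occurring with probability p contributes -log2(p) bits, so rarer words carry more bits each. The sketch below is illustrative only, with a toy reference corpus and a simple add-one fallback for unseen words (not the paper's actual method):

```python
import math
from collections import Counter

def mean_word_information(tokens, corpus_freqs, corpus_size):
    """Average information per word in bits, I(w) = -log2 p(w), where
    p(w) is estimated from reference corpus counts (add-one smoothing)."""
    bits = []
    for w in tokens:
        p = (corpus_freqs.get(w, 0) + 1) / (corpus_size + 1)
        bits.append(-math.log2(p))
    return sum(bits) / len(bits)

# Toy reference corpus: frequent function words, rarer content words
freqs = Counter({"the": 500, "boy": 20, "devours": 1})
total = sum(freqs.values())

# A short, noun-and-verb-heavy utterance carries more bits per word
# than one padded with high-frequency function words
dense = mean_word_information(["boy", "devours"], freqs, total)
padded = mean_word_information(["the", "the", "boy"], freqs, total)
```

This is the sense in which a short, telegraphic sentence built from less frequent words can "condense" meaning: fewer words, but more bits per word.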
Josephy-Hernandez S, Rezaii N, Jones A, Loyer E, Hochberg D, Quimby M, Wong B, Dickerson BC. Automated analysis of written language in the three variants of primary progressive aphasia. Brain Commun. 2023 Jul 20;5(4):fcad202.
While writing is an important part of daily life, it’s been understudied in people with PPA. This study looked at how different types of PPA affect written communication, and compared it to how those same people speak. We created a computer program to quickly and accurately measure how much meaningful content people include in their writing or speech when asked to describe a common picture used in language testing.
We found that:
- People with PPA wrote fewer words overall than healthy individuals.
- Those with the logopenic and semantic types of PPA included less meaningful content in their writing.
- People with the nonfluent type used a higher ratio of meaningful content to total words, showing a more condensed or “telegraphic” style.
- Compared to speaking, writing included fewer words, but the drop in meaningful words was smaller—so writing actually packed in relatively more meaning per word, probably because people often include more “filler” words in speech than in writing.
- Across all PPA patients, more severe language or memory problems were tied to producing fewer meaningful words.
The study also introduced a helpful new tool for quickly measuring how much content someone includes in their writing—potentially useful for tracking changes in language over time.
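As a rough intuition for what a content-density measure captures, one crude proxy is the fraction of tokens that are not function words. The sketch below uses a tiny hypothetical stopword list; the actual tool relies on more sophisticated language analysis, not a word list like this:

```python
# Hypothetical function-word list for illustration only
FUNCTION_WORDS = {"the", "a", "an", "is", "are", "was", "and",
                  "of", "to", "in", "on", "it", "there", "with", "from"}

def content_density(text):
    """Fraction of tokens that are content (non-function) words."""
    tokens = [t.strip(".,;!?").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return len(content) / len(tokens) if tokens else 0.0

# A telegraphic description packs more content per word than a
# fluent sentence conveying similar information
telegraphic = content_density("Boy stealing cookies. Sink overflowing.")
fluent = content_density("There is a boy and he is stealing the cookies.")
```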
2024
Bayat S, Sanati M, Mohammad-Panahi M, Khodadadi A, Ghasimi M, Rezaee S, Besharat S, Mahboubi-Fooladi Z, Almasi-Dooghaee M, Sanei-Taheri M, Dickerson BC, Rezaii N. Language abnormalities in Alzheimer’s disease indicate reduced informativeness. Ann Clin Transl Neurol. 2024 Nov;11(11):2946-2957.
This study looked at how Alzheimer’s disease (AD) affects the way people speak, using language samples from both English and Persian speakers. We used computerized analytic tools to develop a new measure called the Language Informativeness Index (LII) to detect how much of a person’s speech lacks meaningful content—also known as “empty speech.” We found that, even though English and Persian are very different languages, people with AD showed similar language problems in both, including speech that was less informative and meaningful. In fact, a system trained to recognize signs of AD in English could also identify it in Persian with 90% accuracy. This suggests that AD affects a core ability to express clear and informative ideas, no matter what language someone speaks. These insights could help improve early diagnosis of Alzheimer’s in many languages, supporting more equal access to care around the world.
Rezaii N, Hochberg D, Quimby M, Wong B, Brickhouse M, Touroutoglou A, Dickerson BC, Wolff P. Artificial intelligence classifies primary progressive aphasia from connected speech. Brain. 2024 Sep 3;147(9):3070-3082.
Highly trained medical specialists usually diagnose different types of primary progressive aphasia (PPA)—a condition that affects language—by listening to and testing for patterns in how people speak and think. But there’s still debate about the best way to classify the three known types of PPA. In this study, we used artificial intelligence (AI) and computerized language analysis tools to study short speech samples from 78 people with PPA. Without any specific training or supervision, AI was able to group these speech samples into three types that closely matched traditional medical diagnoses (almost 89% agreement). These groups also showed brain shrinkage patterns that matched what’s expected in each type. Next, we looked at which parts of speech were most useful for identifying each PPA type. Using the 17 key features we found, we built a tool that could correctly identify the type of PPA or whether the person was healthy with almost 98% accuracy. This work shows that AI can help detect natural speech patterns in PPA and highlights which language features are most important for understanding and diagnosing the condition.
Rezaii N, Hochberg D, Quimby M, Wong B, McGinnis S, Dickerson BC, Putcha D. Language uncovers visuospatial dysfunction in posterior cortical atrophy: a natural language processing approach. Front Neurosci. 2024 Feb 6;18:1342909.
Posterior Cortical Atrophy (PCA) is a condition that mostly affects how people see and process visual information. We compared how 25 people with PCA and a group of healthy older adults spoke during two tasks: 1) describing a picture (which relies on vision), and 2) talking about their job (which doesn’t rely on vision). We looked at how long it took to speak, what kinds of words were used, and how often spatial words (like “above” or “next to”) appeared. We found that people with PCA had trouble only with the picture task. They used simpler, more common words, took longer to speak, and used fewer spatial words. They also had difficulty understanding key parts of the picture. These language issues weren’t present during the job description task. This shows some language problems in PCA are closely tied to their visual issues. It also highlights that certain kinds of language tasks can illuminate problems with other kinds of cognitive functions, such as visual processing.
Touroutoglou A, Katsumi Y, Rezaii N, Paranhos T, Jones A, Hochberg D, Quimby M, Henderson SK, Wong B, Brickhouse M, Camprodon JA, Dickerson BC, Eldaief MC. Transcranial magnetic stimulation improves language and language network functional connectivity in a patient with logopenic primary progressive aphasia. Brain Stimul. 2024 Nov-Dec;17(6):1213-1215.
We explored whether a type of non-invasive brain stimulation called repetitive transcranial magnetic stimulation (rTMS) could help improve language abilities in a person with logopenic variant primary progressive aphasia (lvPPA). We first identified the parts of this patient’s brain involved in language using specialized brain scans. Then, we used rTMS to stimulate a key area involved in speech (the left posterior inferior frontal gyrus) for five days. We also did a comparison session (a “sham” treatment that mimics rTMS but without real stimulation) to test whether any improvements were due to the treatment itself. After the real (active) rTMS—but not after the sham treatment—the patient showed:
- improvements in naming objects and repeating sentences (standard language tests);
- use of more complex sentence structures and somewhat more precise vocabulary in everyday speech;
- stronger brain connectivity on functional MRI between important language areas, especially in parts of the brain known to support speech.
This study suggests that rTMS might temporarily help people with lvPPA speak more clearly and effectively, even if it doesn’t stop the disease from progressing. By personalizing the stimulation to the patient’s unique brain network, the treatment may offer meaningful, if temporary, improvements in communication abilities.