Alzheimer’s disease and frontotemporal dementia often produce nearly identical symptoms, creating a diagnostic nightmare that delays treatment and prolongs uncertainty for patients and families. Now engineers at Florida Atlantic University have developed an artificial intelligence system that can distinguish between the two conditions using only a simple, inexpensive electroencephalogram (EEG). The breakthrough could move complex dementia diagnosis out of specialized imaging centers and into community clinics.
The clinical problem is significant. Alzheimer’s disease, the most common dementia, causes widespread cognitive decline affecting memory and spatial navigation. Frontotemporal dementia, by contrast, strikes earlier (often between ages 40 and 60) and targets specific brain regions governing personality, behavior, and speech. Despite these different pathologies, the resulting symptoms overlap so much that accurate diagnosis often takes years.
Current best practices rely on costly MRI and PET scans that require specialized facilities, putting them out of reach for much of the world. EEG technology, which measures the brain’s electrical activity using surface sensors, has long been the affordable alternative. The problem? Brainwaves are inherently noisy, fluctuating wildly between individuals. Until now, doctors had no reliable way to extract the subtle electrical differences between Alzheimer’s and frontotemporal dementia from that messy signal.
Teaching AI to Hear Disease in the Noise
The FAU team, based in the College of Engineering and Computer Science, developed a two-stage deep learning model that analyzes both frequency bands (wave speed) and temporal changes (how waves evolve over time). This dual approach, published in Biomedical Signal Processing and Control, finally decoded the brain’s unique patterns of decline.
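The two kinds of information the model draws on can be illustrated in isolation. The sketch below is not the authors' code: it extracts conventional EEG band powers (the frequency view) and tracks them across successive windows (the temporal view) for a synthetic signal. The sampling rate, band edges, and window length are illustrative assumptions.

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed for this sketch)
BANDS = {"delta": (0.5, 4), "theta": (4, 8),
         "alpha": (8, 13), "beta": (13, 30)}  # conventional band edges

def band_powers(epoch, fs=FS):
    """Frequency-domain features: mean spectral power in each band."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / len(epoch)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def temporal_profile(epoch, fs=FS, win_s=1.0):
    """Temporal features: band powers in successive windows,
    capturing how the rhythms evolve over the recording."""
    step = int(win_s * fs)
    return [band_powers(epoch[i:i + step], fs)
            for i in range(0, len(epoch) - step + 1, step)]

# Synthetic 4-second epoch: a 2 Hz (delta-band) rhythm plus noise
rng = np.random.default_rng(0)
t = np.arange(4 * FS) / FS
epoch = np.sin(2 * np.pi * 2 * t) + 0.2 * rng.standard_normal(t.size)

bp = band_powers(epoch)
profile = temporal_profile(epoch)
```

For this synthetic epoch, delta power dominates beta power, mirroring the kind of band-level contrast the study reports; a real pipeline would feed such features, per electrode, into the deep learning model.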
The process is straightforward: electrodes placed on the scalp record electrical activity, producing a complex data stream of peaks and troughs tracing billions of neural impulses. The AI doesn’t review static brain structure but listens to a live performance of brain function, searching for the diseased signal buried beneath healthy neural chatter.
Initial results were striking. The model achieved over 90 percent accuracy in separating people with any form of dementia from cognitively healthy individuals. More importantly, it could then distinguish Alzheimer’s from frontotemporal dementia and estimate disease severity.
The AI identified slow delta brain waves as a key biomarker for both conditions, concentrated in frontal and central brain regions. The critical distinction: in Alzheimer’s, disruption was far more extensive, spreading into beta frequency bands and affecting wider brain areas. This widespread disturbance confirms the generalized destruction associated with Alzheimer’s and explains why it’s traditionally been easier to detect.
“What makes our study novel is how we used deep learning to extract both spatial and temporal information from EEG signals, allowing us to detect subtle brainwave patterns.”
Tuan Vo, the study’s lead author and doctoral student, noted that the model doesn’t just deliver a binary diagnosis. Its ability to estimate severity offers a detailed picture of patient condition while aligning with data from costly neuroimaging.
Separating the two dementias proved technically challenging. Initial specificity (the rate at which the model correctly ruled out the target condition) was only 26 percent. The team applied sophisticated feature selection techniques that improved specificity to 65 percent. Their finalized two-step system achieved 84 percent accuracy, positioning it among the best EEG-based diagnostic tools available.
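The figures quoted here (accuracy, specificity) come from a standard confusion matrix. A small self-contained illustration, with invented labels rather than the study's data:

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (true-positive rate) and specificity
    (true-negative rate) for binary labels, with 1 as the positive class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Toy example: 1 = target condition present, 0 = absent
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
metrics = binary_metrics(y_true, y_pred)
```

A model can score high accuracy while its specificity lags, which is why the jump from 26 to 84 percent matters: it reflects fewer false alarms, not just more correct calls overall.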
Making the Black Box Transparent
The model’s architecture weaves together convolutional neural networks and attention-based LSTMs, enabling simultaneous assessment of disease type and progression. Critically, the system includes an explainable AI component called Grad-CAM that visually shows doctors which brain signals drove each conclusion. This transparency transforms the AI from an opaque decision-maker into a trustworthy clinical partner.
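The attention component mentioned above can be sketched on its own: given a sequence of hidden states from a recurrent stage, a score vector weights each time step before pooling them into one summary. This is a minimal numpy sketch under assumed shapes, not the paper's architecture:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden, w):
    """Collapse a (T, D) sequence of hidden states into one (D,) vector.

    Each time step t gets a scalar score hidden[t] @ w; softmax turns
    the scores into weights summing to 1, so the most informative
    time steps dominate the pooled summary."""
    scores = hidden @ w               # shape (T,)
    weights = softmax(scores)         # shape (T,), non-negative, sums to 1
    return weights @ hidden, weights  # weighted sum over time

rng = np.random.default_rng(0)
T, D = 10, 8                  # 10 time steps, 8-dim hidden states (assumed)
hidden = rng.standard_normal((T, D))
w = rng.standard_normal(D)    # score vector; learned in training, random here

summary, weights = attention_pool(hidden, w)
```

In a trained model those weights are exactly what a Grad-CAM-style explanation surfaces: they indicate which moments in the EEG recording the network leaned on for its decision.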
The research mapped how each dementia spreads differently. Alzheimer’s affects broader territory, correlating with lower cognitive test scores. Frontotemporal dementia’s impact remains more localized in frontal and temporal lobes. While this distinction was understood, seeing clear evidence in non-invasive EEG signals provides powerful validation.
“Our findings show that Alzheimer’s disease disrupts brain activity more broadly, especially in the frontal, parietal and temporal regions, while frontotemporal dementia mainly affects the frontal and central areas.”
Co-author Dr. Hanqi Zhuang highlighted the clinical importance of isolating frontotemporal dementia’s subtle signature. Alzheimer’s wide-reaching damage makes it an easier target, but intelligent feature selection can accurately identify the quieter electrical characteristics of frontotemporal dementia. This distinction is vital for FTD patients, whose behavioral symptoms often cause years of delay before correct neurological diagnosis.
The implications extend beyond the lab. By combining detection and severity assessment into a single system, the technology could dramatically reduce stressful evaluation periods. Clinicians gain a real-time tool for tracking disease progression, enabling earlier personalized interventions and modified care plans.
Dr. Stella Batalama, dean of the College of Engineering and Computer Science, placed the work in broader context: “This work demonstrates how merging engineering, AI and neuroscience can transform how we confront major health challenges.” As global populations age, the need for earlier, more accessible dementia care becomes paramount. This breakthrough offers a concrete path to move complex diagnostics into local clinics, providing rapid answers when families need them most.
The collaborative team also included Dr. Ali K. Ibrahim and doctoral student Chiron Bang. Their paper sets a new technical standard for non-invasive brain diagnostics.
Biomedical Signal Processing and Control, DOI: 10.1016/j.bspc.2025.108667