A growing body of research is challenging the long-held assumption that memory loss is the earliest reliable warning sign of dementia. Several peer-reviewed studies now point to something subtler and easier to measure: the ability to sustain attention. Momentary lapses in focus, the kind most people dismiss as ordinary distraction, may predict cognitive decline years before traditional memory tests raise any alarm.
Wandering Minds and Alzheimer’s Biomarkers
A study of 504 older adults with a mean age of about 69.5 years used a Sustained Attention to Response Task, or SART, to measure both reaction-time variability and self-reported off-task thoughts. The researchers found that participants whose minds wandered more frequently during the task progressed faster toward cognitive impairment and showed stronger associations with Alzheimer’s disease biomarkers in cerebrospinal fluid and brain imaging. The design is notable because the SART captures something memory tests typically miss: how well someone can maintain focus over a monotonous stretch of time, rather than how well they recall a list of words.
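The kind of scoring the SART supports can be sketched in a few lines. This is an illustrative example, not the study's actual pipeline: it assumes the common SART design in which participants respond to every digit in a stream except a rare no-go target, and it summarizes performance as go-trial reaction-time variability plus the rate of failed withholds on no-go trials.

```python
# Illustrative scoring for a SART-style task. Assumes the common
# design where participants press a key for every digit except a
# rare no-go target; the study's exact scoring is not shown here.
import statistics

NO_GO_DIGIT = 3  # a typical no-go target in SART variants

def score_sart(trials):
    """trials: list of (digit, responded, rt_ms or None).
    Returns the coefficient of variation of go-trial reaction
    times and the commission-error rate on no-go trials."""
    go_rts = [rt for d, resp, rt in trials
              if d != NO_GO_DIGIT and resp and rt is not None]
    nogo = [(d, resp) for d, resp, _ in trials if d == NO_GO_DIGIT]
    rt_cv = statistics.stdev(go_rts) / statistics.mean(go_rts)
    commissions = sum(1 for _, resp in nogo if resp) / len(nogo)
    return rt_cv, commissions

trials = [(1, True, 410), (7, True, 395), (3, True, None),  # failed withhold
          (4, True, 520), (3, False, None),                 # correct withhold
          (9, True, 430), (2, True, 380)]
rt_cv, commissions = score_sart(trials)
print(round(rt_cv, 3), commissions)
```

Higher values on either measure would indicate less stable attention; the study paired metrics like these with participants' own reports of off-task thought.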
That distinction matters. Memory screening has dominated dementia detection for decades, but it tends to flag problems only after significant neural damage has already occurred. Attention-based measures, by contrast, may pick up on disruptions to brain networks that regulate alertness and executive control, systems that some researchers believe deteriorate earlier in the disease process. In the SART study, even people who were still performing normally on standard cognitive exams showed subtle but measurable attention instability that lined up with biological evidence of Alzheimer’s risk.
Reaction-Time Variability as a Risk Signal
Independent evidence from Australia reinforces the case. The Sydney Memory and Ageing Study tracked a cohort of 861 adults aged 70 to 90 over four years, during which 48 participants developed dementia. The researchers focused on intra-individual variability in reaction time, often abbreviated IIVRT, measured during a complex decision task. Each standard-deviation increase in this variability was associated with roughly a 40 percent higher risk of incident dementia, even after adjusting for age, education, and baseline cognitive scores.
Average speed still mattered—slower responders were more likely to decline—but the inconsistency of responses, the erratic pattern of fast and slow reactions within a single session, carried its own predictive weight. That pattern suggests that the brain’s ability to maintain a stable level of processing from moment to moment may erode before more obvious deficits emerge. In practical terms, someone whose performance fluctuates wildly across a few minutes of testing could be signaling vulnerability that a simple memory checklist would miss.
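The two quantities discussed above can be made concrete with a short sketch. The code below assumes IIVRT is simply the standard deviation of one person's reaction times within a session (the cohort's exact preprocessing is more involved and is not reproduced here), and it translates the reported ~40 percent per-standard-deviation hazard increase into an implied relative risk under a simplified log-linear model.

```python
# Minimal sketch, assuming IIVRT = within-person SD of reaction
# times; the Sydney study's actual preprocessing is not shown.
import statistics

def iivrt(reaction_times_ms):
    """Within-person standard deviation of reaction times (ms)."""
    return statistics.stdev(reaction_times_ms)

def relative_risk(z_score, hr_per_sd=1.4):
    """Relative hazard implied by a ~1.4 hazard ratio per SD of
    IIVRT, as reported in the cohort (simplified illustration)."""
    return hr_per_sd ** z_score

consistent = [420, 430, 415, 425, 435, 420]   # steady responder
erratic = [300, 700, 350, 650, 280, 720]      # same rough average speed

print(round(iivrt(consistent), 1))   # small SD
print(round(iivrt(erratic), 1))      # large SD despite similar mean
print(round(relative_risk(2.0), 2))  # two SDs above average variability
```

Note that the two example sessions have comparable average speeds; only the spread differs, which is exactly the signal the study found carried independent predictive weight.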
Clinical work has reached similar conclusions using even briefer tools. In one study, researchers defined attention lapses as any response slower than 500 milliseconds on a three-minute reaction-time task. People with dementia showed a significantly higher proportion of these lapses compared to healthy controls, and those with mild cognitive impairment fell in between. Because the procedure requires only a basic computer or tablet and minimal training, the authors argued it could be integrated into routine primary care visits as a low-cost screening aid.
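The lapse metric described above is simple enough to sketch directly: count the fraction of responses slower than the 500-millisecond threshold. The threshold comes from the study's definition; the surrounding scoring code is an illustrative assumption.

```python
# Illustrative sketch of the lapse metric described above: the
# proportion of responses slower than 500 ms on a brief
# reaction-time task. The threshold is from the study; the
# scoring code itself is an assumption.

LAPSE_THRESHOLD_MS = 500.0

def lapse_proportion(reaction_times_ms):
    """Fraction of valid responses slower than the lapse threshold."""
    valid = [rt for rt in reaction_times_ms if rt > 0]  # drop omissions
    if not valid:
        return 0.0
    lapses = sum(1 for rt in valid if rt > LAPSE_THRESHOLD_MS)
    return lapses / len(valid)

# Example: two slow responses out of eight
rts = [320, 410, 290, 650, 380, 540, 300, 350]
print(round(lapse_proportion(rts), 3))  # → 0.25
```

Because the metric needs only timestamps from a few minutes of tapping, it is the kind of score a tablet app could compute automatically during a routine visit.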
Large-Scale Confirmation From the UK Biobank
Population-level data from the UK Biobank provide some of the strongest support for attention-based prediction. In this massive resource, brief cognitive tasks were administered to up to 488,130 participants at baseline, with hospital and death records tracked for three to eight years. Over that period, researchers identified 1,051 dementia cases, including 352 diagnosed as Alzheimer’s disease. When they compared different baseline measures, simple reaction-time performance predicted incident dementia at least as well as, and in some models better than, the visual memory tasks collected in the same session.
The Biobank data are especially valuable because of their heterogeneity. Participants differed in age, health status, and follow-up length, allowing analysts to examine both near-term and medium-term risk. Reaction-time measures taken years before diagnosis still carried a detectable signal, suggesting that attention instability is not just a reflection of temporary fatigue, mood, or medication effects. Instead, the pattern appears robust enough across hundreds of thousands of people to point toward underlying brain changes.
Other longitudinal cohorts have reached complementary conclusions. In a study of older adults followed for up to 10 years, measures of processing speed and variability predicted later dementia even after accounting for baseline memory performance. Another analysis using Biobank-derived methods reported that brief computerized tests of attention and reaction time could stratify Alzheimer’s risk in community samples, strengthening the argument that these measures are not just research curiosities but potential clinical tools.
Limitations: These studies are observational and identify associations, not proof that attention lapses cause dementia. Reaction time can also be influenced by sleep, medications, mood, pain, and other health factors, so results should be interpreted in clinical context.
Implicit Cognition Shifts Before Symptoms Appear
Research from Caltech adds a mechanistic layer to these epidemiological findings. A study published in GeroScience examined cognitively healthy older adults stratified by biological risk for Alzheimer’s disease using cerebrospinal fluid markers, MRI scans, and genome sequencing. Participants completed a visual attention task in which distracting information competed with task-relevant targets. Among those at higher biological risk, repeated practice paradoxically increased the influence of the distractors instead of reducing it.
This counterintuitive pattern—summarized in a Caltech news release—suggests that implicit learning processes that normally help people tune out irrelevant information may become distorted years before classic Alzheimer’s symptoms like memory loss appear. In effect, practice made imperfect: with repetition, at-risk individuals grew more vulnerable to distraction, not less. If replicated in larger longitudinal cohorts, this kind of task could help identify people on a neurodegenerative trajectory long before their daily functioning is obviously impaired.
Attention Disorders and Dementia Risk
The link between attention and dementia extends beyond aging-related decline. A study published in October 2023 found that adult ADHD is associated with an increased risk of later-life dementia, underscoring the need for reliable ADHD assessment in adulthood. The authors reported that adults with clinically diagnosed attention-deficit/hyperactivity disorder were more likely to develop dementia than peers without ADHD, even after adjusting for psychiatric comorbidities and cardiovascular risk factors.
This raises an uncomfortable question for the millions of adults living with diagnosed or undiagnosed ADHD: does a lifelong pattern of inattention compound neurodegenerative risk, or does ADHD simply share overlapping vulnerabilities with dementia, such as disrupted frontal networks or chronic sleep problems? The study could not fully disentangle causality, and many people with ADHD will never develop dementia. Still, the findings underscore that attention problems across the lifespan may deserve closer monitoring, not only for their impact on education and work but also for their potential long-term neurological implications.
Clinicians are beginning to consider how to distinguish between longstanding attentional traits and new-onset lapses that might herald neurodegeneration. For example, a person with stable ADHD symptoms since childhood who shows no change in daily functioning may be at a different level of concern than someone who develops erratic focus and slowed responses for the first time in their 60s. Combining careful history-taking with brief computerized tasks could help tease apart these trajectories.
Toward Earlier, More Practical Screening
One appeal of attention-based markers is their practicality. Reaction-time tasks can be delivered on a tablet or smartphone in minutes, scored automatically, and repeated over time to track subtle changes. In contrast, traditional neuropsychological batteries are time-consuming and require trained specialists. Neurologists and geriatric psychiatrists have argued that scalable digital tools could complement, not replace, in-depth evaluations by flagging those who most need specialist referral. A recent review in the neurology literature highlighted how computerized cognitive testing can detect early impairment, particularly in domains like attention and processing speed that are hard to capture with paper-and-pencil screening alone.
Researchers caution, however, that attention measures are not yet ready to serve as stand-alone diagnostic tests. Many factors can influence reaction time, including sleep, medications, pain, and mood disorders such as depression. Large-scale studies like the UK Biobank help average out these influences, but in individual patients, context still matters. Moreover, predicting risk is not the same as preventing disease. Even if attention lapses reliably forecast dementia, interventions that can modify that trajectory remain limited.
Still, the shift in focus from memory to attention reflects a deeper rethinking of how Alzheimer’s and related dementias unfold. Instead of viewing forgetfulness as the first meaningful sign, scientists are mapping earlier disruptions in the brain’s ability to sustain stable, goal-directed activity. That change of perspective could eventually move diagnosis years earlier, opening a wider window for lifestyle interventions, clinical trials, and, where appropriate, emerging disease-modifying drugs.
For now, the practical takeaway is modest but important. Occasional distraction is part of normal life, and no single slow reaction on a phone game should trigger alarm. But patterns of increasing lapses—especially when they are new, noticeable to others, or accompanied by other changes in thinking or behavior—may warrant discussion with a clinician. As research continues, the everyday experience of a wandering mind may turn out to be one of the most informative clues about the brain’s future health.
*This article was researched with the help of AI, with human editors creating the final content.