Scans used, for the first time, to examine the relationship between neocortical iron and cognitive decline.
Iron accumulation in the outer layer of the brain – identified with MRI scans – is linked to cognitive decline in people who have Alzheimer’s disease, according to new research.
Results from the study, published in Radiology, show that MRI can be used to determine how much iron has collected in the neocortex, the heavily grooved outer layer of the brain that is integral to language, conscious thought, and other functions. Until now, assessing iron in this layer has been difficult because its anatomy can create distortions, signal decay, and artifacts in MRI scans.
The association between high iron levels in the brain and Alzheimer’s is already known, but most research has focused on how the iron collects in the deep gray matter structures of the brain. This is the first study to look at the relationship between iron changes in the neocortex and cognitive decline.
But, given the complexities of the neocortex itself and the limits on scanning time in the clinical environment, the researchers needed another way to adequately assess iron levels, the team indicated.
"The best solution to minimize these artifacts would be using ultra-high-resolution scans,” said study co-author Reinhold Schmidt, M.D., professor of neurology and chair of the neurology department at the Medical University of Graz in Austria. “However, in the clinical setting, scan time is a limiting factor, and a compromise has to be found.”
Conducting the Study
To achieve this compromise, Schmidt’s team used a 3T MRI scanner with a 12-channel phased-array head coil that allowed for a balance between resolution and scan time. It also enabled any impacts from distortions to be corrected during post-processing. Initially, they enrolled 100 patients with Alzheimer’s and 100 healthy control participants. Among the Alzheimer’s group, 56 were able to fully complete the study, with follow-up neuropsychological testing and brain MRI scans conducted after an average of 17 months.
Using this technique, they created a brain iron map that revealed the iron levels in several parts of the brain.
“We found indications of higher iron deposition in the deep gray matter and total neocortex, and regionally in temporal and occipital lobes, in Alzheimer’s disease patients compared with age-matched healthy individuals,” Schmidt said.
Specifically, a comparison of R2* values between patients with Alzheimer’s and the healthy control participants revealed that these values were higher in the whole cortex, including the neocortex and deep gray matter, of patients with Alzheimer’s. In addition, these patients had higher median R2* levels in their temporal and occipital lobes, as well as the caudate nuclei, structures that play a vital role in how the brain learns.
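The R2* metric reported here measures how quickly the gradient-echo MRI signal decays in a voxel; iron accelerates that decay, so higher R2* indicates more iron. In a multi-echo acquisition the magnitude signal falls off roughly as S(TE) = S0 · exp(−R2* · TE), so R2* can be estimated per voxel with a log-linear fit. A minimal sketch of that fit, using illustrative echo times and signal values rather than anything from the study:

```python
import numpy as np

def fit_r2star(echo_times_ms, signals):
    """Estimate R2* (in 1/s) from multi-echo gradient-echo magnitudes
    using a log-linear least-squares fit: ln S = ln S0 - R2* * TE."""
    te_s = np.asarray(echo_times_ms) / 1000.0          # echo times, ms -> s
    log_s = np.log(np.asarray(signals, dtype=float))   # linearize the decay
    slope, intercept = np.polyfit(te_s, log_s, 1)      # first-order fit
    return -slope                                      # decay rate R2*

# Illustrative noiseless data: a signal decaying with R2* = 50 s^-1
te = [5.0, 10.0, 20.0, 30.0]                           # echo times in ms
s = [1000.0 * np.exp(-50.0 * t / 1000.0) for t in te]
print(fit_r2star(te, s))  # recovers R2* of about 50 s^-1
```

In practice this fit is run voxel by voxel to produce the kind of brain iron map the researchers describe, and real data would need noise handling that this sketch omits.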
Overall, the team said, higher levels of iron in the brain were associated with cognitive deterioration independent of the loss of brain volume; in particular, changes in iron levels in the temporal lobes correlated with cognitive decline in patients who had Alzheimer’s.
Ultimately, they said, these findings bolster the existing belief that high iron levels in the brain directly facilitate amyloid beta deposition and neurotoxicity in Alzheimer’s. The results also point to the possibility of using iron-reducing drugs, known as chelators, as a potential therapy for Alzheimer’s.
“Our study provides support for the hypothesis of impaired iron homeostasis in Alzheimer’s disease and indicates that the use of iron chelators in clinical trials might be a promising treatment target,” Schmidt said. “MRI-based iron mapping could be used as a biomarker for Alzheimer’s disease prediction and as a tool to monitor treatment response in therapeutic studies.”