
Music training protects against aging-related hearing loss

February, 2012

More evidence that music training protects older adults from age-related impairment in understanding speech, adding to the potential benefits of music training in preventing dementia.

I’ve spoken before about the association between hearing loss in old age and dementia risk. Although we don’t currently understand that association, it may be that preventing hearing loss also helps prevent cognitive decline and dementia. I have previously reported on how music training in childhood can help older adults’ ability to hear speech in a noisy environment. A new study adds to this evidence.

The study looked at a specific aspect of understanding speech: auditory brainstem timing. Aging disrupts this timing, degrading the ability to precisely encode sound.

In this study, automatic brain responses to speech sounds were measured in 87 younger and older normal-hearing adults as they watched a captioned video. Older adults who had begun musical training before age 9 and engaged consistently in musical activities throughout their lives (“musicians”) not only significantly outperformed older adults who had no more than three years of musical training (“non-musicians”), but encoded the sounds as quickly and accurately as the younger non-musicians.

The researchers qualify this finding by saying that it shows only that musical experience selectively affects the timing of sound elements that are important in distinguishing one consonant from another, not necessarily all sound elements. However, it seems probable that it extends more widely, and in any case the ability to understand speech is crucial to social interaction, which may well underlie at least part of the association between hearing loss and dementia.

The burning question for many will be whether the benefits of music training can be accrued later in life. We will have to wait for more research to answer that, but, as music training and enjoyment fit the definition of ‘mentally stimulating activities’, this certainly adds another reason to pursue such a course.


'Exergames' may provide greater cognitive benefit for older adults

February, 2012

An intriguing pilot study finds that regular exercise on a stationary bike enhanced with a computer game-type environment improves executive function in older adults more than ordinary exercise on a stationary bike.

We know that physical exercise greatly helps prevent cognitive decline with aging, and that mental stimulation does too. So it was only a matter of time before someone came up with a way of combining the two. A new study found that older adults improved executive function more by participating in virtual reality-enhanced exercise ("exergames"), which combines physical exercise with computer-simulated environments and interactive videogame features, than by doing the same exercise without the enhancements.

The Cybercycle Study involved 79 older adults (aged 58-99) from independent living facilities with indoor access to a stationary exercise bike. Of the 79, 63 participants completed the three-month study, completion being defined as achieving at least 25 rides during the three months.

Unfortunately, randomization was not as good as it should have been — although the researchers planned to randomize on an individual basis, various technical problems led them to randomize on a site basis (there were eight sites), with the result that the cybercycle group and the control bike group were significantly different in age and education. Although the researchers took this into account in the analysis, that is not the same as having groups that match in these all-important variables. However, at least the variables went in opposite directions: while the cybercycle group was significantly younger (average 75.7 vs 81.6 years), it was significantly less educated (average 12.6 vs 14.8 years).

Perhaps also partly off-setting the age advantage, the cybercycle group was in poorer shape than the control group (higher BMI and glucose levels, lower physical activity level, among other measures), although these differences weren’t statistically significant. IQ was also lower for the cybercycle group, if not significantly so (but note the high averages for both groups: 117.6 vs 120.6). One of the three tests of executive function, Color Trails, also showed a marked group difference, but the large variability in scores meant that this difference was not statistically significant.

Although participants were screened for disorders such as Alzheimer’s and Parkinson’s, and functional disability, many of both groups were assessed as having MCI — 16 of the 38 in the cybercycle group and 14 of the 41 in the control bike group.

Participants were given cognitive tests at enrolment, one month later (before the intervention began), and after the intervention ended. The stationary bikes were identical for both groups, except the experimental bike was equipped with a virtual reality display. Cybercycle participants experienced 3D tours and raced against a "ghost rider," an avatar based on their last best ride.

The hypothesis was that cybercycling would particularly benefit executive function, and this was borne out. Executive function (measured by the Color Trails, Stroop test, and Digits Backward) improved significantly more in the cybercycle condition, and indeed was the only cognitive task to do so (other cognitive tests included verbal fluency, verbal memory, visuospatial skill, motor function). Indeed, the control group, despite getting the same amount of exercise, got worse at the Digits Backward test, and failed to show any improvement on the Stroop test.

Moreover, significantly fewer cybercyclists progressed to MCI compared to the control group (three vs nine).

There were no differences in exercise quantity or quality between the two groups — which does argue against the idea that cyber-enhanced physical activity would be more motivating. However, the cybercycling group did tend to comment on their enjoyment of the exercise. While the enjoyment may not have translated into increased activity in this situation, it may well do so in a longer, less directed intervention — i.e. real life.

It should also be remembered that the intervention was relatively short, and that other cognitive functions might take longer to show improvement than the more sensitive executive function. This is supported by the finding that levels of the brain growth factor BDNF, assessed in 30 participants, increased significantly more in cybercyclists.

I should also emphasize that the level of physical exercise really wasn't that great, but nevertheless the size of the cybercycle's effect on executive function was greater than usually produced by aerobic exercise (a medium effect rather than a small one).

The idea that activities combining physical and mental exercise are of greater cognitive benefit than the sum of the benefits from each type of exercise on its own is not inconsistent with previous research, and is in keeping with evidence from animal studies that physical exercise and mental stimulation help the brain via different mechanisms. Moreover, I have an idea that enjoyment (in itself, not as a proxy for motivation) may be a factor in the cognitive benefits derived from activities, whether physical or mental. This is mere speculation, derived from two quite separate areas of research: the idea of “flow” / “being in the zone”, and the idea that humor has physiological benefits.

Of course, as discussed, this study has a number of methodological issues that limit its findings, but hopefully it will be the beginning of an interesting line of research.  

Reference: 

Anderson-Hanley, C., Arciero, P.J., Brickman, A.M., Nimon, J.P., Okuma, N., Westen, S.C., et al. (2012). Exergaming and Older Adult Cognition. American Journal of Preventive Medicine, 42(2), 109-119.


Cognitive decline begins in middle age

February, 2012

A large ten-year study of middle-aged to older adults (45-70) has found that cognitive decline begins in the 45-55 decade, with reasoning ability the most affected by age.

The age at which cognitive decline begins has been the subject of much debate. The Seattle longitudinal study has provided most of the evidence that it doesn’t begin until age 60. A more recent, much larger study that allows both longitudinal and cross-sectional analysis suggests that, depressingly, mid-to-late forties might be closer to the mark.

A long-term British study known as Whitehall II began in 1985, when all civil servants aged 35-55 in 20 London-based departments were invited to participate. In 1997-9, 5198 male and 2192 female civil servants, aged 45-70 at this point, were given the first of three rounds of cognitive testing. The second round took place in 2002-4, and the third in 2007-9.

Over these ten years, all cognitive scores except vocabulary declined in all five age categories (45-49, 50-54, 55-59, 60-64, and 65-70 at baseline). Unsurprisingly, the decline was greater with increasing age, and greatest for reasoning. Men aged 45-49 at baseline showed a 3.6% decline in reasoning, compared to a 9.6% decline for those aged 65-70. Women were less affected by age: while showing the same degree of decline when younger, the oldest showed a 7.4% decline.

None of the other cognitive tasks showed the same age-related deterioration as reasoning, which displayed a consistently linear decline with advancing age. The amount of decline over ten years was roughly similar for each age group for short-term memory and phonemic and semantic fluency (although the women displayed more variability in memory, in a somewhat erratic pattern which may perhaps reflect hormonal changes — I’m speculating here). Moreover, the amount of decline in each decade for these functions was only about the same as reasoning’s decline in the younger decades — about 4% in each decade.

Men and women differed significantly in education (33% of men attended university compared to 21% of women; 57% of women never finished secondary school compared to 39% of men). It is therefore unsurprising that men performed significantly better on all cognitive tests except memory (noting that the actual differences in score were mostly quite small: 16.9/35 vs 16.5 for phonemic fluency; 16.7/35 vs 15.8 for semantic fluency; 25.7/33 vs 23.1 for vocabulary; 48.7/65 vs 41.6 for reasoning).

The cognitive tests included a series of 65 verbal and mathematical reasoning items of increasing difficulty (testing inductive reasoning), a 20-word free recall test (short-term verbal memory), recalling as many words as possible beginning with “S” (phonemic fluency) and recalling members of the animal category (semantic fluency), and a multi-choice vocabulary test.

The design of the study allowed both longitudinal and cross-sectional analyses to be carried out. Cross-sectional data, although more easily acquired, has been criticized as conflating age effects with cohort differences. Generations differ on several relevant factors, of which education is the most obvious. The present study partly confirmed this, finding that cross-sectional data considerably over-estimated cognitive decline in women but not men — reflecting the fact that education changed far more for women than men in the relevant time periods. For example, in the youngest group of men, 30% had less than a secondary school education and 42% had a university degree, and the women showed a similar pattern, with 34% and 40%. However, for those aged 55-59 at baseline, the corresponding figures were 38% and 29% for men compared to 58% and 17% for women.

The principal finding is of course that measurable cognitive decline was evident in the youngest group, meaning that at some point during that 45-55 decade, cognitive faculties begin to decline. Of course, it should be emphasized that this is a group effect — individuals will vary in the extent and timing of any cognitive decline.

(A side-note: During the ten year period, 305 participants died. The probability of dying was higher in those with poorer cognitive scores at baseline.)


Diet linked to brain atrophy in old age

January, 2012
  • A more rigorous measurement of diet finds that dietary factors account for nearly as much brain shrinkage as age, education, APOE genotype, depression and high blood pressure combined.

The study involved 104 healthy older adults (average age 87) participating in the Oregon Brain Aging Study. Analysis of the nutrient biomarkers in their blood revealed that those with diets high in omega 3 fatty acids and in vitamins C, D, E and the B vitamins had higher scores on cognitive tests than people with diets low in those nutrients, while those with diets high in trans fats were more likely to score more poorly on cognitive tests.

These effects were dose-dependent, with each standard deviation increase in the vitamin BCDE score associated with a 0.28 SD increase in global cognitive score, and each SD increase in the trans fat score associated with a 0.30 SD decrease in global cognitive score.

Trans fats are primarily found in packaged, fast, fried and frozen food, baked goods and margarine spreads.

Brain scans of 42 of the participants found that those with diets high in vitamins BCDE and omega 3 fatty acids were also less likely to have the brain shrinkage associated with Alzheimer's, while those with high trans fats were more likely to show such brain atrophy.

Those with higher omega-3 scores also had fewer white matter hyperintensities. However, this association became weaker once depression and hypertension were taken into account.

Overall, the participants had good nutritional status, but 7% were deficient in vitamin B12 (I’m surprised it’s so low, but bear in mind that these are already a select group, being healthy at such an advanced age) and 25% were deficient in vitamin D.

The nutrient biomarkers accounted for 17% of the variation in cognitive performance, while age, education, APOE genotype (presence or absence of the ‘Alzheimer’s gene’), depression and high blood pressure together accounted for 46%. Diet was more important for brain atrophy: here, the nutrient biomarkers accounted for 37% of the variation, while the other factors accounted for 40% (meaning that diet was nearly as important as all these other factors combined!).

The findings add to the growing evidence that diet has a significant role in determining whether or not, and when, you develop Alzheimer’s disease.


Brain atrophy may predict risk for early Alzheimer's disease

January, 2012
  • Shrinking of certain brain regions predicts age-related cognitive decline and dementia, with greater brain tissue loss markedly increasing risk.

A study involving 159 older adults (average age 76) has confirmed that the amount of brain tissue in specific regions is a predictor of Alzheimer’s disease development. Of the 159 people, 19 were classified as at high risk on the basis of the smaller size of nine small regions previously shown to be vulnerable to Alzheimer's, and 24 as low risk; the remainder were classified as average risk. The regions, in order of importance, are the medial temporal cortex, inferior temporal cortex, temporal pole, angular gyrus, superior parietal cortex, superior frontal cortex, inferior frontal cortex, supramarginal gyrus, and precuneus.

There was no difference between the three risk groups at the beginning of the study on global cognitive measures (MMSE; Alzheimer’s Disease Assessment Scale—cognitive subscale; Clinical Dementia Rating—sum of boxes), or in episodic memory. The high-risk group did perform significantly more slowly on the Trail-making test part B, with similar trends on the Digit Symbol and Verbal Fluency tests.

After three years, 125 participants were re-tested. Nine met the criteria for cognitive decline: three were from the small high-risk group (21%; 3/14) and six from the much larger average-risk group (7%; 6/90). None were from the low-risk group.

The results were even more marked when less stringent criteria were used. On the basis of an increase on the Clinical Dementia Rating, 28.5% of the high-risk group and 9.7% of the average-risk group showed decline. On the basis of a decline of at least one standard deviation on any one of the three neuropsychological tests, half the high-risk group, 35% of the average-risk group, and 14% (3/21) of the low-risk group showed decline. (The composite criterion required both of these.)

Analysis estimated that every standard deviation of cortical thinning (reduced brain tissue) was associated with a nearly tripled risk of cognitive decline.

Amyloid-beta levels in the cerebrospinal fluid, available for 84 individuals, also revealed that 60% of the high-risk group had levels consistent with the presence of Alzheimer's pathology, compared to 36% of those at average risk and 19% of those at low risk.

The findings extend and confirm the evidence that brain atrophy in specific regions is a biomarker for developing Alzheimer’s.

Reference: 

Dickerson, B.C., & Wolk, D.A. (2012). MRI cortical thickness biomarker predicts AD-like CSF and cognitive decline in normal adults. Neurology, 78(2), 84-90.

Dickerson, B.C., Bakkour, A., Salat, D.H., et al. (2009). The cortical signature of Alzheimer’s disease: regionally specific cortical thinning relates to symptom severity in very mild to mild AD dementia and is detectable in asymptomatic amyloid-positive individuals. Cerebral Cortex, 19, 497-510.


Why a select group of seniors retain their cognitive abilities

December, 2011
  • Comparison of the brains of octogenarians whose memories match those of middle-aged people reveals important differences between their brains and those of cognitively-normal seniors.

A certain level of mental decline in the senior years is regarded as normal, but some fortunate few don’t suffer from any decline at all. The Northwestern University Super Aging Project has found seniors aged 80+ who match or better the average episodic memory performance of people in their fifties. Comparison of the brains of 12 super-agers, 10 cognitively-normal seniors of similar age, and 14 middle-aged adults (average age 58) now reveals that the brains of super-agers also look like those of the middle-aged. In contrast, brain scans of cognitively average octogenarians show significant thinning of the cortex.

The difference between the brains of super-agers and the others was particularly marked in the anterior cingulate cortex. Indeed, the super-agers appeared to have a much thicker left anterior cingulate cortex than the middle-aged group as well. Moreover, the brain of a super-ager who died revealed that, although there were some plaques and tangles (characteristic, in much greater quantities, of Alzheimer’s) in the mediotemporal lobe, there were almost none in the anterior cingulate. (But note an earlier report from the researchers.)

Why this region should be of special importance is somewhat mysterious, but the anterior cingulate is part of the attention network, and perhaps it is this role that underlies the superior abilities of these seniors. The anterior cingulate also plays a role in error detection and motivation; it will be interesting to see if these attributes are also important.

While the precise reason the anterior cingulate is critical to retaining cognitive abilities might be mysterious, the lack of cortical atrophy, together with the suggestion that super-agers’ brains have much reduced levels of the sort of pathological damage seen in most older brains, adds weight to the growing evidence that cognitive aging reflects clinical problems, which unfortunately are all too common.

Sadly, there are no obvious lifestyle factors involved here. The super agers don’t have a lifestyle any different from their ‘cognitively average’ counterparts. However, while genetics might be behind these people’s good fortune, that doesn’t mean that lifestyle choices don’t make a big difference to those of us not so genetically fortunate. It seems increasingly clear that for most of us, without ‘super-protective genes’, health problems largely resulting from lifestyle choices are behind much of the damage done to our brains.

It should be emphasized that these unpublished results are preliminary only. This conference presentation reported on data from only 12 of 48 subjects studied.

Reference: 

Harrison, T., Geula, C., Shi, J., Samimi, M., Weintraub, S., Mesulam, M., & Rogalski, E. (2011). Neuroanatomic and pathologic features of cognitive SuperAging. Poster presented at the 2011 Society for Neuroscience conference.


Memory genes vary in protecting against age-related cognitive decline

November, 2011

New findings show the T variant of the KIBRA gene improves episodic memory through its effect on hippocampal activity. Another study finds the met variant of the BDNF gene is linked to greater age-related cognitive decline.

Previous research has shown that carriers of the so-called KIBRA T allele have better episodic memory than those who don’t carry that gene variant (this is a group difference; it doesn’t mean that any carrier will remember events better than any non-carrier). A large new study confirms and extends this finding.

The study involved 2,230 Swedish adults aged 35-95. Of these, 1040 did not have a T allele, 932 had one, and 258 had two.  Those who had at least one T allele performed significantly better on tests of immediate free recall of words (after hearing a list of 12 words, participants had to recall as many of them as they could, in any order; in some tests, there was a concurrent sorting task during presentation or testing).

There was no difference between those with one T allele and those with two. The effect increased with increasing age. There was no effect of gender. There was no significant effect on performance of delayed category cued recall tests or a visuospatial task, although a trend in the appropriate direction was evident.

It should also be noted that the effect on immediate recall, although statistically significant, was not large.

Brain activity was studied in a subset of this group: 83 adults aged 55-60, plus another 64 matched on sex, age, and performance on the scanner task. A further 113 adults aged 65-75 were included for comparison purposes. While in the scanner, participants carried out a face-name association task. Having been presented with face-name pairs, participants were tested on their memory by being shown the faces with three letters, of which one was the initial letter of the name.

Performance on the scanner task was significantly higher for T carriers — but only for the 55-60 age group, not for the 65-75 age group. Activity in the hippocampus was significantly higher for younger T carriers during retrieval, but not encoding. No such difference was seen in the older group.

This finding is in contrast with an earlier, and much smaller, study involving 15 carriers and 15 non-carriers, which found higher activation of the hippocampus in non-T carriers. This was taken at the time to indicate some sort of compensatory activity. The present finding challenges that idea.

Although higher hippocampal activation during retrieval is generally associated with faster retrieval, the higher activity seen in T carriers was not fully accounted for by performance. It may be that such activity also reflects deeper processing.

KIBRA-T carriers were neither more nor less likely to carry other ‘memory genes’ — APOEe4; COMTval158met; BDNFval66met.

The findings, then, fail to support the idea that non-carriers engage compensatory mechanisms, but do indicate that the KIBRA T allele helps episodic memory by improving hippocampal function.

BDNF gene variation predicts rate of age-related decline in skilled performance

In another study, this time into the effects of the BDNF gene, performance on an airplane simulation task on three annual occasions was compared. The study involved 144 pilots, all healthy Caucasian males aged 40-69, of whom 55 (38%) turned out to have at least one copy of a BDNF gene containing the ‘met’ variant. This variant is less common, occurring in about one in three Asians, one in four Europeans and Americans, and about one in 200 sub-Saharan Africans.

While performance dropped with age for both groups, the rate of decline was much steeper for those with the ‘met’ variant. Moreover, there was a significant inverse relationship between age and hippocampal size in the met carriers — and no significant correlation between age and hippocampal size in the non-met carriers.

Comparison over a longer time-period is now being undertaken.

The finding is more evidence for the value of physical exercise as you age — physical activity is known to increase BDNF levels in your brain. BDNF levels tend to decrease with age.

The met variant has been linked to higher likelihood of depression, stroke, anorexia nervosa, anxiety-related disorders, suicidal behavior and schizophrenia. It differs from the more common ‘val’ variant in having methionine rather than valine at position 66 on this gene. The BDNF gene has been remarkably conserved across evolutionary history (fish and mammalian BDNF have around 90% agreement), suggesting that mutations in this gene are not well tolerated.


The positive side of age-related cognitive change

The brain changes as we age, but it's not all bad! Experience changes our brains in a good way.

Older news items (pre-2010) brought over from the old website

Experienced air traffic controllers work smarter, not harder, making up for normal mental aging

A study involving 36 air traffic controllers and 36 age- and education-matched non-controllers, with 18 older (average age 57) and 18 younger adults (average age 24) per group, has found that although predictable age-related declines were observed on most of the standard tests of cognitive function, experience helped the older controllers compensate to a significant degree for those declines on the simulated air traffic control task, especially in the more complex simulations.

Nunes, A., & Kramer, A.F. (2009). Experience-based mitigation of age-related performance declines: evidence from air traffic control. Journal of Experimental Psychology: Applied, 15(1), 12-24.

http://www.eurekalert.org/pub_releases/2009-03/apa-eat031209.php
http://www.eurekalert.org/pub_releases/2009-03/uoia-oat030309.php

When emotions involved, older adults may perform memory tasks better than young adults

A study involving 72 young adults (20-30 years old) and 72 older adults (60-75) has found that regulating emotions – such as reducing negative emotions or inhibiting unwanted thoughts – is a resource-demanding process that disrupts the ability of young adults to simultaneously or subsequently perform tasks, but doesn’t affect older adults. In the study, most of the participants watched a two-minute video designed to induce disgust, while the rest watched a neutral two-minute clip. Participants then played a computer memory game. Before playing two further memory games, those who had watched the disgusting video were instructed either to change their negative reaction into positive feelings as quickly as possible or to maintain the intensity of their negative reaction, or were given no instructions. Those young adults who had been told to turn their disgust into positive feelings performed significantly worse on the subsequent memory tasks, but older adults were not affected. The feelings of disgust in themselves did not affect performance in either group. It’s speculated that older adults’ greater experience allows them to regulate their emotions without cognitive effort.

Scheibe, S., & Blanchard-Fields, F. (2009). Effects of regulating emotions on cognitive performance: what is costly for young adults is not so costly for older adults. Psychology and Aging, 24(1), 217-223.

http://www.eurekalert.org/pub_releases/2009-03/giot-oac030409.php

Aging brains allow negative memories to fade

Another study has found that older adults (average age 70) remember fewer negative images than younger adults (average age 24), and that this has to do with differences in brain activity. When shown negative images, the older participants had reduced interactions between the amygdala and the hippocampus, and increased interactions between the amygdala and the dorsolateral frontal cortex. It seems that the older participants were using thinking rather than feeling processes to store these emotional memories, sacrificing information for emotional stability. The findings are consistent with earlier research showing that healthy seniors are able to regulate emotion better than younger people.

St Jacques, P.L., Dolcos, F., & Cabeza, R. (2009). Effects of aging on functional connectivity of the amygdala for subsequent memory of negative pictures: a network analysis of functional magnetic resonance imaging data. Psychological Science, 20(1), 74-84.

http://www.eurekalert.org/pub_releases/2008-12/uoaf-aba121608.php
http://www.eurekalert.org/pub_releases/2008-12/dumc-oay121508.php

'Super-aged' brains reveal secrets of sharp memory in old age

While we take for granted that we’ll lose some cognitive ability as we get older, it’s also true that some very old people have brains just as quick as they always were. Now a post-mortem study of the brains of five of these "super-aged" has revealed that these brains do indeed differ from normal elderly brains: specifically, they have far fewer tau tangles. Tau tangles are characteristic of Alzheimer's patients, but they are not restricted to them; until now, it’s been assumed that aging brings about the accumulation of these tangles. However, amyloid plaques, also characteristic of Alzheimer’s and found in smaller quantities in aging brains, were found in “normal” quantities, pointing to the tangles as the critical factor.

The findings were presented November 16 at the Society for Neuroscience annual meeting in Washington, D.C.

http://www.eurekalert.org/pub_releases/2008-11/nu-ab111308.php

Confidence in memory performance helps older adults remember

A study involving 335 adults aged 21 to 83 found that control beliefs were related to memory performance on a word list recall task for middle-aged and older adults, but not for younger adults. This was partly because middle-aged and older adults who perceived greater control over cognitive functioning were more likely to use strategies to help their memory. In other words, the more you believe there are things you can do to remember information, the more likely you are to make an effort to remember.

Lachman, M.E., & Andreoletti, C. (2006). Strategy use mediates the relationship between control beliefs and memory performance for middle-aged and older adults. Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 61, P88-P94.

http://www.eurekalert.org/pub_releases/2006-03/bu-cim030706.php

'Sharp' older brains are not the same as younger brains

We know that many older adults still retain the mental sharpness of younger people, but studies comparing brain activity in older and younger adults suggest they perform differently. A rat study has now found the first solid evidence that still-"sharp" older brains do indeed store and encode memories differently than younger brains. Comparison of those older rats that had retained their cognitive abilities with those that had not also revealed that those with impaired cognition had lost the ability to modify the strength of the communications between synapses (synaptic communication is the means by which memories are encoded and stored). But the competent seniors also differed from the younger rats in the mechanism most used to bring about synaptic change.

Lee, H-K., Min, S.S., Gallagher, M. & Kirkwood, A. 2005. NMDA receptor-independent long-term depression correlates with successful aging in rats. Nature Neuroscience, 8(12), 1657-1659.

http://www.eurekalert.org/pub_releases/2005-11/jhu-ob110905.php

An advantage of age

A study comparing the ability of young and older adults to indicate which direction a set of bars moved across a computer screen has found that although younger participants were faster when the bars were small or low in contrast, the older people were faster when the bars were large and high in contrast. The results suggest that the ability of one neuron to inhibit another is reduced as we age (inhibition helps us find objects within clutter, but makes it hard to see the clutter itself). This age-related loss of inhibition has previously been seen in cognition and speech studies, and is reflected in older adults' reduced ability to tune out distraction. Now we see the same process in vision.

Betts, L.R., Taylor, C.P., Sekuler, A.B. & Bennett, P.J. 2005. Aging Reduces Center-Surround Antagonism in Visual Motion Processing. Neuron, 45(3), 361-366.

http://psychology.plebius.org/article.htm?article=739
http://www.eurekalert.org/pub_releases/2005-02/mu-opg020305.php

Effect of expectations on older adults’ memory performance

A study involving 193 participants, in two experiments each with a younger (17–35 years) and an older (57–82 years) group of adults, has investigated how stereotypes about aging influence older adults' memory. Participants were exposed to stereotype-related words in the context of another task (scrambled sentences, word judgments) in order to prime positive and negative stereotypes of aging. Memory performance in older adults was lower when they were primed with negative stereotypes than when they were primed with positive stereotypes. Age differences in memory between young and older adults were significantly reduced following a positive stereotype prime, with young and older adults performing at almost identical levels in some situations.

Hess, T.M., Hinson, J.T. & Statham, J.A. 2004. Explicit and implicit stereotype activation effects on memory: do age and awareness moderate the impact of priming? Psychology and Aging, 19(3), 495-505.

http://www.eurekalert.org/pub_releases/2004-09/apa-se090704.php

Cognitive abilities are fairly stable and may be correlated with longevity

The Scottish Mental Survey assessed 87,498 eleven-year-olds in 1932, and another 70,805 in 1947. In a fascinating follow-up to this study, over 1,000 of these students have been contacted and re-assessed on the exact same tests. It was found, first of all, that the seniors did rather better than they had at age 11, and that differences in mental ability remained fairly stable with age. Mental ability at 11 was also significantly correlated with survival: those who scored highly were more likely to have survived, with the exception that men of high ability were more likely to have died on active service in World War II. People of lower ability were more likely to die of lung and stomach cancer. More results from this landmark study are expected.

These preliminary findings were presented by Professor Ian Deary from the Department of Psychology, University of Edinburgh at a symposium on aging at the Australian National University.

http://dsc.discovery.com/news/afp/20030929/aging.html

Compensating strategies for aging memories

PET scans of the prefrontal cortex reveal that older adults who perform better on a simple memory task display more activity on both sides of the brain, compared with both older adults who do less well and younger adults. Although this seems counter-intuitive (the older adults who perform less well show activity patterns more similar to those of younger adults), it supports recent theory that the brain may change tactics as it ages, and that older people whose brains are more flexible can compensate for some aspects of memory decline. Whether this flexibility is neurological, or something that can be taught, is still unknown.

Cabeza, R., Anderson, N.D., Locantore, J.K. & McIntosh, A.R. 2002. Aging gracefully: compensatory brain activity in high-performing older adults. NeuroImage, 17(3), 1394-1402.

http://www.nytimes.com/2002/11/19/health/aging/19AGIN.html?8vd

Training can improve age-related memory decline in elderly

Older adults show two kinds of cognitive-processing deficits: under-recruitment, where appropriate areas of the brain are less likely to be utilized without specific instruction, and non-selective recruitment, where non-relevant regions of the brain are more likely to be used. A recent imaging study confirmed that older adults were less likely than younger ones to use the critical frontal regions when performing a memory task, and more likely to use cortical regions that are not as useful. However, when subjects were given specific strategy instructions, the older adults showed increased activity in the frontal regions, and their remembering improved. Even with this support, however, older adults still showed a greater tendency to use brain regions that were not useful.

Logan, J.M., Sanders, A.L., Snyder, A.Z., Morris, J.C. & Buckner, R.L. 2002. Under-recruitment and nonselective recruitment: dissociable neural mechanisms associated with aging. Neuron, 33(5), 827-840.

http://www.eurekalert.org/pub_releases/2002-02/hhmi-tci021302.php
http://www.eurekalert.org/pub_releases/2002-02/wuis-bis021402.php

How aging brains compensate for cognitive decline

Evidence from a series of studies using functional positron emission tomography (PET) images suggests that one way older adults may compensate for age-related cognitive decline is by using additional regions of the brain to perform memory and information-processing tasks. For example, simple short-term memory tasks involve the same brain regions in both older and younger adults, but older adults also activate a frontal cortex region that young adults use only when performing complex short-term memory tasks. This may explain why the performance of older adults on complex memory tasks is usually significantly poorer than that of younger adults: the frontal cortex region that young adults recruit to help with complex short-term memory tasks is already occupied in older adults even during simple tasks, and so is less available when the task becomes more complex.

The research was conducted by University of Michigan researchers under the leadership of cognitive neuroscientist Patricia Reuter-Lorenz, and presented at the annual meeting of the American Psychological Association in San Francisco.

http://www.ns.umich.edu/Releases/2001/Aug01/r081501a.html


Aging - specific failures

Older news items (pre-2010) brought over from the old website

Failing recall not an inevitable consequence of aging

New research suggests age-related memory decline may not be inevitable. Tests of 36 adults with an average age of 75 found that about one in four had managed to avoid memory decline. Those adults who still had high frontal lobe function had memory skills “every bit as sharp as a group of college students in their early 20s." (Note, though, that most of the older participants were highly educated; some were retired academics.) The study also found that the frontal lobe decline so common in older adults is associated with an increased susceptibility to false memories, hence the difficulty often experienced by older people in recalling whether they took a scheduled dose of medication.

The research was presented on August 8 at the American Psychological Association meeting in Toronto.

http://www.eurekalert.org/pub_releases/2003-08/wuis-fmf080703.php

Older adults better at forgetting negative images

The general tendency to remember the good and let the bad fade seems to get stronger as we age. Following recent research suggesting that older people regulate their emotions more effectively than younger people, by maintaining positive feelings and dampening negative ones, researchers examined age differences in recall of positive, negative and neutral images of people, animals, nature scenes and inanimate objects. The first study tested 144 participants aged 18-29, 41-53 and 65-80. Older adults recalled fewer negative images relative to positive and neutral images, and their recognition memory for negative pictures also decreased; as a result, the younger adults remembered the negative pictures better. Preliminary brain research suggests that in older adults the amygdala is activated equally by positive and negative images, whereas in younger adults it is activated more by negative images. This suggests that older adults encode less information about negative images, which in turn would diminish recall.

Charles, S.T., Mather, M. & Carstensen, L.L. 2003. Aging and Emotional Memory: The Forgettable Nature of Negative Images for Older Adults. Journal of Experimental Psychology: General, 132(2), 310-24.


Aging - rate of cognitive decline

White matter appears to decrease faster than grey matter, but doesn't begin to decline until the forties. Presumably this relates to the decline in processing speed that is the most evident characteristic of age-related decline.

Grey matter, on the other hand, declines at a fairly constant rate from adolescence, mirroring a decline in processing ability that seems to start as early as the twenties.

Cognitive decline seems to be faster in women than in men, presumably reflecting apparent gender differences in brain structure. For example, while women seem to have a greater density of brain cells in the prefrontal cortex, they also show a steeper rate of decline, so that in old age the density is similar between the genders.

There is some evidence that individual differences in processing speed and memory are more important than age, and that personality attributes affect the rate of cognitive decline and brain atrophy.

Some gene variants, including the so-called Alzheimer’s gene, are associated with a faster rate of decline, or an earlier start. These may be triggered by activity in early adulthood. Head size in adulthood also seems to affect rate of decline. Head size in adulthood reflects not only head size at birth, but growth in the early years — pointing to the importance of providing both proper nourishment and intellectual stimulation in these early years.

Older news items (pre-2010) brought over from the old website

Marital status and gender affects rate of age-related cognitive decline; education doesn’t

Analysis of data from 6,476 adults born before 1924 (taken from the AHEAD study), who were given five rounds of cognitive testing between 1993 and 2002, has found that marital status is a significant factor in rate of cognitive decline, with widows, widowers and those who never married declining faster than married individuals. This is consistent with findings on the benefits of social stimulation and support for aging cognition. Confirming earlier indications, it was also found that women declined faster than men. Level of education did not affect rate of decline. There was an effect of socioeconomic status, in that those in the bottom quintile declined more slowly than those in the highest quintile, and non-Hispanic blacks declined more slowly than non-Hispanic whites; however, socioeconomic status and race were far more strongly associated with the level of cognitive performance at the start of the study than with the rate of decline with age.

Karlamangla, A.S., Miller-Martinez, D., Aneshensel, C.S., Seeman, T.E., Wight, R.G. & Chodosh, J. 2009. Trajectories of Cognitive Function in Late Life in the United States: Demographic and Socioeconomic Predictors. American Journal of Epidemiology, 170(3), 331-342.

http://www.eurekalert.org/pub_releases/2009-08/uoc--sfn080709.php

Evidence cognitive decline begins in late 20s

A seven-year study involving 2,000 healthy participants between the ages of 18 and 60 has revealed that in 9 of 12 tests the average age at which the top scores were achieved was 22. A notable decline in certain measures of abstract reasoning, processing speed and spatial visualization became apparent at 27. Average memory declines could be detected by about age 37. However, accumulated knowledge skills, such as vocabulary and general knowledge, actually improve at least until the age of 60. It must be remembered, however, that there is considerable variance from person to person.

Salthouse, T.A. 2009. When does age-related cognitive decline begin? Neurobiology of Aging, 30(4), 507-514.

http://www.eurekalert.org/pub_releases/2009-03/uov-cdb031909.php
http://news.bbc.co.uk/2/hi/health/7945569.stm

Education may not affect how fast you will lose your memory

A study in which some 6,500 older Chicago residents were interviewed every three years for up to 14 years (6.5 years on average) has found that while, at the beginning of the study, those with more education had better memory and thinking skills than those with less education, education was not related to how rapidly these skills declined during the course of the study. The result suggests that the benefit of more education in reducing dementia risk stems simply from the difference in starting level of cognitive function.

Wilson, R.S., Hebert, L.E., Scherr, P.A., Barnes, L.L., Mendes de Leon, C.F. & Evans, D.A. 2009. Educational attainment and cognitive decline in old age. Neurology, 72(5), 460-465.

http://www.eurekalert.org/pub_releases/2009-02/aaon-emn012709.php

Brain slows at 40, starts body decline

We get slower as we age, we all know that. This slowness reflects damage to the myelin sheathing (“white matter”) that coats nerve fibers and is vital for speedy conduction of electrical impulses. A study involving 72 healthy men aged 23 to 80 has found that the speed with which they could tap an index finger, and the health of the myelin in the region that orders the finger to tap, both peaked at age 39, then gradually declined with increasing age. This explains why you don’t get many world-class athletes after 40. Luckily, it probably takes a little longer before the myelin in cognitive areas starts to fray (a decade or so, it’s thought). The finding is consistent with a recent report that the system that’s supposed to repair myelin becomes less efficient with age. More research is looking at what you can do to help your myelin, but in the meantime, it’s suggested that mental and physical activity may help stimulate myelin repair, and stress may damage it.

Villablanca, P., Bartzokis, G., Lu, P.H., Tingus, K., Mendez, M.F., Richard, A. et al. 2008. Lifespan trajectory of myelin integrity and maximum motor speed. Neurobiology of Aging.

http://www.physorg.com/news144948216.html
http://www.eurekalert.org/pub_releases/2008-10/uoc--pdc101708.php

Memory loss becoming less common in older Americans

A new nationally representative study involving 11,000 people shows a downward trend in the rate of cognitive impairment among people aged 70 and older, from 12.2% to 8.7% between 1993 and 2002. It’s speculated that factors behind this decline may be that today’s older people are much likelier to have had more formal education, higher economic status, and better care for risk factors such as high blood pressure, high cholesterol and smoking that can jeopardize their brains. In fact the data suggest that about 40% of the decrease in cognitive impairment over the decade was likely due to the increase in education levels and personal wealth between the two groups of seniors studied at the two time points. The trend is consistent with a dramatic decline in chronic disability among older Americans over the past two decades.

Langa, K.M., Larson, E.B., Karlawish, J.H., Cutler, D.M., Kabeto, M.U., Kim, S.Y. et al. 2008. Trends in the Prevalence and Mortality of Cognitive Impairment in the United States: Is There Evidence of a Compression of Cognitive Morbidity? Alzheimer's & Dementia, 4(2), 134-144.

http://www.eurekalert.org/pub_releases/2008-02/uomh-mla021808.php

People at genetic risk for Alzheimer's age mentally just like noncarriers

A long-running study involving 6,560 people has found that the so-called ‘Alzheimer’s gene’ (the APOE4 allele) does not contribute to cognitive change during most of adulthood: there was no difference in cognitive performance between carriers and non-carriers prior to the development of dementia symptoms.

Jorm, A.F., Mather, K.A., Butterworth, P., Anstey, K.J., Christensen, H. & Easteal, S. 2007. APOE genotype and cognitive functioning in a large age-stratified population sample. Neuropsychology, 21(1), 1-8.

http://www.eurekalert.org/pub_releases/2007-01/apa-pag010307.php

Longevity gene also helps retain cognitive function

The Longevity Genes Project has studied 158 people of Ashkenazi, or Eastern European Jewish, descent who were 95 years of age or older. Those who passed a common test of mental function were two to three times more likely to have a common variant of a gene associated with longevity (the CETP gene) than those who did not. When the researchers studied another 124 Ashkenazi Jews between 75 and 85 years of age, those subjects who passed the test of mental function were five times more likely to have this gene variant than their counterparts. The gene variant makes cholesterol particles in the blood larger than normal.

Barzilai, N., Atzmon, G., Derby, C.A., Bauman, J.M. & Lipton, R.B. 2006. A genotype of exceptional longevity is associated with preservation of cognitive function. Neurology, 67(12), 2170-2175.

http://tinyurl.com/yrf5s4
http://www.eurekalert.org/pub_releases/2006-12/aaon-lga121906.php

Risk of mild cognitive impairment increases with less education

A study of 3,957 people from the general population of Olmsted County, Minnesota is underway to determine how many of those who do not have dementia might have mild cognitive impairment. A report on the findings so far suggests 9% of those aged 70 to 79 and nearly 18% of those aged 80 to 89 have MCI. Prevalence varied not only with age but also with years of education: 25% in those with up to eight years of education, 14% in those with nine to 12 years, 9% in those with 13 to 16 years, and 8.5% in those with more than 16 years.

Findings from this study were presented April 4 at the American Academy of Neurology meeting in San Diego.

http://www.eurekalert.org/pub_releases/2006-04/mc-mci033006.htm

Human cerebellum and cortex age in very different ways

Analysis of gene expression in five different regions of the brain's cortex has found that brain changes with aging were pronounced and consistent across the cortex, but changes in gene expression in the cerebellum were smaller and less coordinated. Researchers were surprised both by the homogeneity of aging within the cortex and by the dramatic differences between cortex and cerebellum. They also found that chimpanzees' brains age very differently from human brains; the findings cast doubt on the effectiveness of using rodents to model various types of neurodegenerative disease.

Fraser, H.B., Khaitovich, P., Plotkin, J.B., Pääbo, S. & Eisen, M.B. 2005. Aging and Gene Expression in the Primate Brain. PLoS Biology, 3(9), e274.

http://www.eurekalert.org/pub_releases/2005-08/hu-hca072805.php

Childhood environment important in staving off cognitive decline

Confirming earlier studies, a British study of 215 men and women aged between 66 and 75, has found that the larger a person's head, the less likely their cognitive abilities are to decline in later years. Those with the smallest heads had a fivefold increased risk of suffering cognitive decline compared with those with the largest heads. Encouragingly, however, this doesn’t mean you’re doomed at birth — the researchers found that it wasn’t head circumference at birth that was important, but head size in adulthood. During the first year of life, babies' brains double in size, and by the time they are six, their brain weight has tripled. These, it appears, are the crucial years for laying down brain cells and neural connections — pointing to the importance of providing both proper nourishment and intellectual stimulation in these early years.

Gale, C.R., Walton, S. & Martyn, C.N. 2003. Foetal and postnatal head growth and risk of cognitive decline in old age. Brain, 126(10), 2273-2278.

http://observer.guardian.co.uk/uk_news/story/0,6903,1051264,00.html

How aging brains compensate for cognitive decline

Many of the cognitive deficits associated with advancing age are related to functions of the prefrontal cortex, such as working memory, decision-making, planning and judgment. Postmortem examination of 20 brains ranging in age from 25 to 83 years confirms that prefrontal regions may be particularly sensitive to the effects of aging. It also appears that white matter decreases at a faster rate than grey matter with age.

Kigar, D.L., Walter, A.L., Stoner-Beresh, H.J. & Witelson, S.F. 2001. Age and volume of the human prefrontal cortex: a postmortem study. Paper presented to the annual Society for Neuroscience meeting in San Diego, US.

Memory starts to decline in our mid-twenties

Studies of more than 350 men and women between the ages of 20 and 90 have found that cognitive decline starts as early as the twenties, and this decline in cognitive processing power appears to be constant; that is, the rate of decline is the same in your twenties as in your sixties. However, young adults don't notice this decline because the loss hasn't yet become great enough to affect everyday activities.

Denise Park, who directs the Center for Aging and Cognition at the University of Michigan Institute for Social Research (ISR) presented a paper on these studies on Aug. 24 in San Francisco at the annual meeting of the American Psychological Association.

http://www.umich.edu/~newsinfo/Releases/2001/Aug01/r081301a.html

Gray matter may decline from adolescence, but white matter keeps growing until our late forties

Brain scans of 70 men aged 19 to 76 confirm that the brain's gray matter, the cell bodies of nerve cells, declines steadily from adolescence. But surprisingly, the white matter in the frontal parts of the brain (the fatty material that insulates the long, extending branches of the nerve cells and makes nerve signals move faster) appears to grow at least until the late 40s, before beginning to decline. The growth of white matter may improve the brain's ability to process information.

Bartzokis, G., Beckson, M., Lu, P.H., Nuechterlein, K.H., Edwards, N. & Mintz, J. 2001. Age-Related Changes in Frontal and Temporal Lobe Volumes in Men: A Magnetic Resonance Imaging Study. Archives of General Psychiatry, 58(5), 461-465.

http://www.nytimes.com/2001/05/22/health/22VITA-3.html

Mental faculties unchanged until the mid-40s

A large-scale study of mental abilities in adults found that mental faculties were unchanged until the mid-40s, when a marked decline began and continued at a constant rate. The ability to remember words after a delay was especially affected. Accuracy did not seem to be affected, only speed.

The paper was presented to a British Psychological Society conference in London.

http://www.guardian.co.uk/Archive/Article/0,4273,4108165,00.html

Gender differences in frontal lobe neuron density

A recent study has found that women have up to 15% greater brain-cell density in the frontal lobe, which controls higher mental processes such as judgment, personality, planning and working memory. However, as they get older, women appear to shed cells from this area more rapidly than men. By old age, the density is similar for both sexes. It is not yet clear what impact, if any, this difference has on performance.

Witelson, S.F., Kigar, D.L. & Stoner-Beresh, H.J. 2001. Sex difference in the numerical density of neurons in the pyramidal layers of human prefrontal cortex: a stereologic study. Paper presented to the annual Society for Neuroscience meeting in San Diego, US.

http://news.bbc.co.uk/hi/english/health/newsid_1653000/1653687.stm
