Research on stereotypes and the brain suggests that older adults (age 60 to 88) are more susceptible to stereotypical thinking than younger adults (age 18 to 25).
A decade ago, a research team led by William von Hippel of the University of Queensland challenged that assumption. The psychologists proposed that older people may exhibit greater prejudice because they have difficulty inhibiting (emphasis mine) the stereotypes that regularly get activated in all of our brains. They suggested that an aging brain is not as effective at suppressing unwanted information — including stereotypes.
In two recently published papers, von Hippel and Gabriel Radvansky of the University of Notre Dame provide compelling support for this concept. In the Journal of Experimental Social Psychology, they describe a series of experiments designed to assess whether older adults were relatively more likely to draw and remember stereotypic inferences.
Forty-eight older adults (age 60 to 88) and 71 younger adults (age 18 to 25) read four stories, each of which “allowed for stereotypic inferences.” Two of the tales featured African Americans, one dealt with people from Appalachia, and one involved Jews. After finishing the stories, the participants were shown a series of statements relevant to each tale and asked to rate them as true or false. Some of these statements were strictly factual, while others reflected stereotypic inferences.
The results revealed “significantly greater memory strength among older adults for stereotype-consistent situation models,” the researchers write. “This finding supports our suggestion that older adults are more likely to make stereotypic inferences during comprehension, and that this stereotyping carries over into their later memory for that information.”
This process “appears to be a more general phenomenon of aging,” they note, adding that some older adults “may be relying on stereotypes despite their best intentions to the contrary.”
The second paper, published earlier this year in the journal Aging, Neuropsychology, and Cognition, suggests a way around this problem. It describes a study in which older and younger adults read a story whose central character was employed in a sex-stereotyped profession. In half the stories, the character’s gender was consistent with the stereotype (a male plumber), while in the other half it was inconsistent (a female plumber).
“Results revealed that with explicit labeling, older adults were able to discount their stereotypes and avoid processing difficulties when subsequent stereotype-inconsistent information was encountered,” the researchers write. “These data suggest that when counter-stereotypical information is explicitly provided at encoding (that is, the first stage of the memory process, in which stimuli are initially registered), older adults are no more likely than younger adults to rely on stereotypes, and are similarly capable of altering their interpretation of a situation when information suggests that information is incorrect.”
In real life, of course, no one is pointing out biased statements as they emerge from the mouths of friends, family members, or talk-show hosts. So for older adults, the best advice might be to avoid acquaintances who speak in stereotypes. This research suggests prejudice can be contagious, and that we become more susceptible as our brains age.
I have not read the protocols or the materials, but several thoughts occur:
1.) Since the age gap is so wide, can one measure the common thread of stereotypical thinking from only one set of examples? Can one fashion a story that is common to both generations and isolate its effects from those of socialization?
2.) Part of the theory involves brain-based cognition: the brain enters a more fluid phase of forming cell connections from the teens to the early twenties. Older adults, by contrast, may have fewer connections overall but more complex ones. What that difference means is not clear.
Sort of like blonde jokes, you know. Substitute brunette and the joke isn’t funny anymore.