Spark

How 'compassionate ageism' made its way into the design of new technology

If older people are using technology, why aren't their needs factored into its design? That's a question researchers at the intersection of digital innovation and aging are looking into.

Ideas of later life being a time of dependency and frailty are encoded in tech, says researcher

Ageism manifests itself in the design of AI and digital technologies. (leungchopan/Shutterstock)

Originally published in March 2023.

Older adults' use of digital devices is growing — but the development of new technologies, from smartphones to AI, still often leaves out the needs of these users.

Nearly 90 per cent of Canadians 65 and older use the internet every day — and the pandemic spurred many to experiment with digital tech, according to a 2020 poll by AGE-WELL. 

"Older adults can sometimes be seen as not being technologically literate, technologically savvy," Charlene Chu, an assistant professor in nursing at the University of Toronto, told Spark host Nora Young.

"Compassionate ageism" is often used to describe the paternalistic belief that aging people are in need of special policies to assist them. Ideas of later life being a time of dependency, frailty and general decline are encoded and amplified through the design and marketing of various technologies made for the general public, and specifically for older adults.

"The assumptions that we have and the stereotypes we have about older adults as being technophobic, resistant to integrating new technologies into their everyday life stem from broader social and cultural age-based stereotypes and ageist perceptions that we have about older adults in general," Nicole Dalmer, an assistant professor of health, aging and society at McMaster University, told Young.

Nicole Dalmer is an Assistant Professor in the Department of Health, Aging and Society at McMaster University. (Submitted by Nicole Dalmer)

In focus groups Dalmer ran a few years ago with older adults, she saw how some had internalized those assumptions about their digital literacy, even though they showed extensive knowledge about various technologies.

"There was this thread of participants characterizing themselves in technophobic terms, calling themselves dinosaurs," she said.

"They're not grouped in the tech savvy or the digital native group, in the news or in policy documents, for example, even though I would argue so many of these older adults have actually grown up with so many variations of technologies, thinking about programming or working the DOS system, for example."

Digital ageism is sustained by what Chu calls "cycles of injustice."

"The technological, individual and social biases all interact, and they end up producing and basically mutually reinforcing each other in order to perpetuate ageism," she said.

More than health management apps

Dalmer says much of the technology currently on the market for older adults is centered on supporting them to age in place — that is, continuing to live in their homes for as long as possible.

While less costly than relocating to a long-term care facility, aging in place still comes at a cost, she says: the users' privacy. The technology relies on monitoring and surveillance in the home, with sensors in places like the bed and the fridge.

Dalmer also notes that these technologies can disrupt older adults' routines and alter their relationships with loved ones, as the intimate, sensitive data they collect is often shared with family members without the older adult's full knowledge.

"These technologies are really changing how we feel about the home, [which is] this really intimate space," she said. "There's so much power in this kind of data."

And as these devices focus on measuring heart rate, blood sugar and urine output, Dalmer says they miss other equally important indicators of health, like leisure and joy.

"Older adults are more than their bodily outputs and numbers — they're whole humans," she said. "Sometimes I think technologies are just looking at the bits and pieces instead of the whole self."

Dalmer says design, product development and marketing are dominated by younger people, so the needs, values and ideas of older adults may not be captured.

"That might be why those pendant alarms and a variety of those devices that we see for older adults are not so aesthetically pleasing to the eye," she said.

"As we age, I would love to continue to insert or invite my style into whatever devices, be they medical or mobility or other support aids. I hope that they can stay abreast with what I like, but I don't see that to be the case right now."

Chu says much of the data generated by these health management apps reinforces the idea that older people suffer from chronic illnesses, because the technologies target only a small portion of older adults in the first place.

"When we think about what it means to actively age in a healthy way that is devoid of chronic illness, it's very difficult to find a dataset that would reflect that in older adults," she said.

Ageism in AI

In 2022, Chu led a study about digital ageism, specifically in the context of artificial intelligence (AI).

Her team looked at seven facial image datasets commonly used to build algorithms for facial recognition and age estimation. In the most commonly used one, which hosts over 400,000 images, only 0.001 per cent of the images represented older adults.

She notes that even the definition of an older adult was inconsistent across the datasets: some set the threshold at 50-plus, while others used 60-plus or 70-plus.

"When you have this data disparity, it's very difficult to generate algorithms that are then accurate and can work as well for older adults," she said.

Charlene Chu is an Assistant Professor at Lawrence S. Bloomberg Faculty of Nursing at the University of Toronto. (UofT)

The AI Ethics Guidelines Global Inventory, a working database of principles and frameworks on the ethical use of algorithms and AI maintained by Algorithm Watch, contains 146 documents focused on how AI systems could be built and used ethically. But only about 20 per cent of those documents mention age as a bias, while almost all of them reference gender- and race-related bias, said Chu.

While the area remains largely unexplored, Chu says there's recently been a growing focus on the representation of aging people in the data used to train AI systems.

Better, more inclusive design

Still, Chu sees the promise of tech in helping people age in place and decentralizing healthcare, away from hospitals — if done right.

"The question then is: how do we design these technologies so that they are easy to use, so that older adults feel that they are empowered when using these technologies, that they feel like that they still have ownership of the data that is being collected, and that it is accessible to people?" she said. 

The answer, Dalmer says, lies in co-design and participatory design: approaches that involve older adults throughout the design process — from ideation to prototyping to testing — to ensure that devices are usable, functional and reflect the needs of the aging population.

She adds that age studies sometimes fall short in including a diverse group of older adults from various socioeconomic, racial and ethnic backgrounds.

"Because those all come to bear on how technology is or isn't made accessible, and that carries through the entire life course and can impact different skills that older adults have or don't have in later life," said Dalmer.