While there are potential benefits of the technology in terms of safety for older people and a reprieve for caregivers, some also worry about its potential harms. Illustration: Erre Gálvez/The Guardian

The future of elder care is here – and it’s artificial intelligence

This article is more than 2 years old

Computers are increasingly guiding decisions about elder care – and tracking everything from toilet visits to whether someone has bathed

Kellye Franklin recalls her devastation when her now 81-year-old father, a loyal air force veteran, tried to make his own breakfast one morning: seven open cereal boxes on the living room floor, milk poured directly into every one of them. He would later be diagnosed with moderate to severe dementia.

Yet Franklin, 39, who is her dad’s only child and his primary caregiver, no longer worries about a repeat of that morning.

In late 2019, she had motion sensors connected to an artificial intelligence (AI) system installed in the two-floor townhome she and her dad share in Inglewood, in Los Angeles county. Sensors at the top of doors and in some rooms monitor movements and learn the pair’s daily activity patterns, sending warning alerts to Franklin’s phone if her dad deviates from his normal behavior – for instance if he goes outside and doesn’t return quickly.

“I would have gotten an alert as soon as he went to the kitchen that morning,” she says, because it would have been out of the ordinary for her dad to be in the kitchen at all, especially that early. Franklin says the system helps her “sanity”, taking a little weight off an around-the-clock job.

Donald Franklin, 81, and his daughter Kellye Franklin, 39. Kellye, Donald’s primary caregiver, has an AI surveillance system to help monitor her dad. Photograph: Jessica Pons/The Guardian

Welcome to caregiving in the 2020s: in rich societies, computers are guiding decisions about elder care, driven by a shortage of caregivers, an ageing population and families wanting their seniors to stay in their own homes longer. A plethora of so-called “age tech” companies has sprung up over the last few years, many promising to keep tabs on older adults, particularly those with cognitive decline. Their solutions are now beginning to permeate home care, assisted living and nursing facilities.

The technology can free up human caregivers so they can be “as efficient as potentially possible”, sums up Majd Alwan, the executive director of the Center for Aging Services Technologies at LeadingAge, an organization representing non-profit ageing services providers.

But while there are potential benefits of the technology in terms of safety for older people and a reprieve for caregivers, some also worry about its potential harms. They raise questions around the accuracy of the systems, as well as about privacy, consent and the kind of world we want for our elders. “We’re introducing these products based on this enthusiasm that they’re better than what we have – and I think that’s an assumption,” says Alisa Grigorovich, a gerontologist who has also been studying the technology at the KITE-Toronto Rehabilitation Institute, University Health Network, Canada.

Q&A

What is AI?


Artificial intelligence (AI) refers to computer systems that do things that normally require human intelligence. While the holy grail of AI is a computer system that is indistinguishable from a human mind, there are several forms of specialized, but limited, AI that are already a part of our everyday lives. AI may be used with cameras to identify someone based on their face, to power virtual companions, and to determine whether a patient is at a high risk for disease.

AI shouldn’t be confused with other kinds of algorithms. The simplest definition of an algorithm is that it’s a series of instructions needed to complete a task. For example, a thermostat in your home is equipped with sensors to detect temperature and instructions to turn on or off as needed. This is not the same as artificial intelligence.
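The thermostat distinction can be made concrete. A minimal sketch, purely for illustration (the function name and values are invented, not any product’s code): every step of a fixed-rule algorithm is written out in advance, and nothing is learned from data.

```python
# A fixed-rule thermostat: every instruction is spelled out in advance.
# This is an algorithm, but it is not artificial intelligence.
def thermostat(reading_celsius, target=20.0, band=0.5):
    """Return 'heat_on', 'heat_off' or 'hold' from a simple fixed rule."""
    if reading_celsius < target - band:
        return "heat_on"
    if reading_celsius > target + band:
        return "heat_off"
    return "hold"

print(thermostat(18.0))  # heat_on
print(thermostat(22.0))  # heat_off
```

An AI system, by contrast, would adjust its own behavior as it encountered new data, rather than following only rules a programmer wrote down.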

The rollout of AI today has been made possible by decades of research on topics including computer vision, which enables computers to perceive and interpret the visual world; natural language processing, allowing them to interpret language; and machine learning, a way for computers to improve as they encounter new data.

AI allows us to automate tasks, gather insights from huge datasets, and complement human expertise. But a rich body of scholarship has also begun to document its pitfalls. For example, automated systems are often trained on huge troves of historical digital data. As many widely publicized cases show, these datasets often reflect past racial disparities, which AI systems learn from and replicate.

Moreover, some of these systems are difficult for outsiders to interpret due to an intentional lack of transparency or the use of genuinely complex methods.


Technology to help keep seniors safe has been in use for a long time – think life-alert pendants and so-called “nanny cams” set up by families fearful their loved ones could be mistreated. But incorporating systems that use data to make decisions – what we now call AI – is new. Increasingly cheap sensors collect many terabytes of data, which are then analyzed by computer programs known as algorithms to infer actions or patterns in activities of daily living and detect if things might be off.
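The core idea – learn a baseline pattern, then flag deviations – can be sketched in a few lines. This is a toy illustration only: real products use far richer models, and the function names and threshold here are assumptions, not any vendor’s code.

```python
import statistics

def build_baseline(daily_counts):
    """Learn a mean and spread from past daily sensor-event counts."""
    return statistics.mean(daily_counts), statistics.stdev(daily_counts)

def is_anomalous(today, baseline, threshold=3.0):
    """Flag today's count if it sits far outside the learned pattern."""
    mean, stdev = baseline
    return abs(today - mean) > threshold * stdev

# Two weeks of nightly bathroom-visit counts from a motion sensor
history = [2, 3, 2, 2, 3, 2, 3, 2, 2, 3, 2, 3, 2, 2]
baseline = build_baseline(history)
print(is_anomalous(9, baseline))  # a sharp jump would trigger an alert
print(is_anomalous(3, baseline))  # an ordinary night would not
```

The hard part in practice is choosing the threshold: too tight and carers drown in false alarms, too loose and a real change in health goes unnoticed.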

A fall, “wandering behavior”, or a change in the number or duration of bathroom visits that might signal a health condition such as a urinary tract infection or dehydration are just some of the things that trigger alerts to carers. The systems use everything from motion sensors to cameras to even lidar, a type of laser scanning used by self-driving cars, to monitor spaces. Others monitor individuals using wearables.

CarePredict, a watch-like device worn on the dominant arm, can track the specific activity that a person is likely to be engaged in by considering the patterns in their gestures, among other data. If repetitive eating motions aren’t detected as expected, a carer is alerted. If the system identifies someone as being in the bathroom and it detects a sitting posture, it can be inferred that the person “is using the toilet”, notes one of its patents.
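The gesture-based inference CarePredict’s patent describes could work roughly like the sketch below. CarePredict’s actual models are proprietary; this hypothetical function only illustrates the general idea of inferring an activity from repeated motions inside a time window.

```python
# Toy sketch: infer "eating" if enough hand-to-mouth gestures
# occur within a single time window.
def looks_like_eating(gesture_events, window_minutes=30, min_repetitions=10):
    """gesture_events: list of (minute, gesture_label) tuples."""
    motions = [t for t, label in gesture_events if label == "hand_to_mouth"]
    for start in motions:
        in_window = [t for t in motions if start <= t < start + window_minutes]
        if len(in_window) >= min_repetitions:
            return True
    return False

lunch = [(i, "hand_to_mouth") for i in range(0, 24, 2)]  # 12 motions in 24 min
print(looks_like_eating(lunch))      # True - enough repetitions
print(looks_like_eating(lunch[:4]))  # False - too few to infer a meal
```

A real wearable would combine many more signals – time of day, location in the home, accelerometer patterns – but the logic of an alert is the same: if the expected activity is not inferred, notify a carer.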

The system in use in the Franklins’ home is called People Power Family. An addition to it, targeted at care agencies, includes daily reports tracking when someone fell asleep, whether they bathed, and bathroom visits. “You can manage more clients with fewer caregivers,” says the promotional video.

Kellye Franklin, 39, has an AI surveillance system installed in her house to help monitor her dad, Donald Franklin, who has dementia. The system is connected to her iPad and smartphone. Photograph: Jessica Pons/The Guardian

Large blue warning signs read “Video recording for fall detection and prevention” on the third-floor dementia care unit of the Trousdale, a private-pay senior living community in Silicon Valley where a studio starts from about $7,000 per month.

In late 2019, AI-based fall detection technology from a Bay Area startup, SafelyYou, was installed to monitor its 23 apartments (it is turned on in all but one apartment where the family didn’t consent). A single camera unobtrusively positioned high on each bedroom wall continuously monitors the scene.

If the system, which has been trained on SafelyYou’s ever expanding library of falls, detects a fall, staff are alerted. The footage, which is kept only if an event triggers the system, can then be viewed in the Trousdale’s control room by paramedics to help decide whether someone needs to go to hospital – did they hit their head? – and by designated staff to analyze what changes could prevent the person falling again.

“We’ve probably reduced our hospital trips by 80%,” says Sylvia Chu, the facility’s executive director. The system has captured every fall she knows of, though she adds that sometimes it turns out the person is on the ground intentionally, for example to find something that has fallen on the floor. “I don’t want to say it is a false alarm … but it isn’t a fall per se,” she says. And she stresses it is not a problem – often the resident still needs help to get back up and staff are happy to oblige.


“We’re still just scratching the surface” when it comes to accuracy, says George Netscher, SafelyYou’s founder and CEO. Non-falls – which the company refers to as “on-the-ground events” – are in fact triggering the system about 40% of the time, he says, citing someone kneeling on the ground to pray as an example. Netscher says that while he wants to get the error rate down, it is better to be safe than sorry.

Companies must also think about bias. AI models are often trained on databases of previous subjects’ behavior, which might not represent all people or situations. Problems with gender and racial biases have been well documented in other AI-based technology such as facial recognition, and they could also exist in these types of systems, says Vicente Ordóñez-Roman, a computer vision expert at the University of Virginia.

That includes cultural biases. CarePredict, the wearable which detects eating motions, hasn’t been fine-tuned for people who eat with chopsticks instead of forks – despite recently launching in Japan. It is on the to-do list, says Satish Movva, the company’s founder and CEO.

For Clara Berridge, who studies the implications of digital technologies used in elder care at the University of Washington, privacy intrusion on older adults is one of the most worrying risks. She also fears it could reduce human interaction and hands-on care – already lacking in many places – further still, worsening social isolation for older people.

In 2014, Berridge interviewed 20 non-cognitively-impaired elder residents in a low-income independent living apartment building that used an AI-based monitoring system called QuietCare, based on motion detection. It triggered an operator call to residents – escalating to family members if necessary – in cases such as a possible bathroom fall, not leaving the bedroom, a significant drop in overall activity or a significant change in nighttime bathroom use.

Kellye Franklin’s AI surveillance system. Photograph: Jessica Pons/The Guardian

What she found was damning. The expectation of routine built into the system disrupted the elders’ activities and caused them to change their behavior to try to avoid unnecessary alerts that might bother family members. One woman stopped sleeping in her recliner because she was afraid it would show inactivity and trigger an alert. Others rushed their bathroom visits for fear of the consequences if they stayed too long.

Some residents begged for the sensors to be removed – though others were so lonely they tried to game the system so they could chat with the operator.

A spokesperson for PRA Health Sciences, which now makes QuietCare, noted that the configuration studied in the paper was a historical version, and that the current version of QuietCare is installed only in assisted living facilities, where facility staff, rather than relatives, are notified of changes or deviations in patients’ patterns.

Berridge’s interviews also revealed something else worrying: evidence of benevolent coercion by social workers and family members to get the elders to adopt the technology. There is a “potential for conflict”, says Berridge. Another of her studies has found big differences in enthusiasm for in-home monitoring systems between older people and their adult kids. The latter were gung ho.

Though sometimes the seniors win the day. Startup Cherry Labs is pivoting partially because it ran into problems obtaining seniors’ consent. Its home monitoring system, Cherry Home, features up to six AI cameras with sound recorders to capture concerning behavior and issue alerts; facial recognition to distinguish others in the space such as carers from seniors; and the ability for family members or carers to look in on how the senior is doing in real time.


But Max Goncharov, its co-founder and CEO, notes that business has been tough not least because adult children couldn’t convince their parents to accept the system. “The seniors were against it,” he says. Cherry Labs now has a different application – targeting its technology at industrial workplaces that want to monitor employee safety.

Franklin, in Inglewood, says the fact her system uses motion sensors rather than cameras is a big deal. She and her dad, Donald, are African American and she just couldn’t imagine her dad being comfortable with a video-based system. “He was born in 1940 in the south and he has seen the evolution and backpedaling on racial issues. He definitely has some scars. There are various parts of our American culture he is distrustful of,” says Franklin.

She has done her best to explain the monitoring system, for which she now pays $40 a month, simply and without sugar-coating. For the most part, he’s all right with it as long as it helps her.

“I never want to be a burden,” he says. But he also wants her to know that he has a plan if they ever decide the technology is too invasive: they can move out of their townhome and rent it to someone else.

“You have to have a trick bag to protect yourself from their trick bag,” he tells her. “I am still your dad no matter how many sensors you got.”
