
What Robots Can—and Can’t—Do for the Old and Lonely

Katie Engelhart, May 24, 2021

It felt good to love again, in that big empty house. Virginia Kellner got the cat last November, around her ninety-second birthday, and now it’s always nearby. It keeps her company as she moves, bent over her walker, from the couch to the bathroom and back again. The walker has a pair of orange scissors hanging from the handlebar, for opening mail. Virginia likes the pet’s green eyes. She likes that it’s there in the morning, when she wakes up. Sometimes, on days when she feels sad, she sits in her soft armchair and rests the cat on her soft stomach and just lets it do its thing. Nuzzle. Stretch. Vibrate. Virginia knows that the cat is programmed to move this way; there is a motor somewhere, controlling things. Still, she can almost forget. “It makes you feel like it’s real,” Virginia told me, the first time we spoke. “I mean, mentally, I know it’s not. But—oh, it meowed again!”

She named the cat Jennie, for one of the nice ladies who work at the local Department of the Aging in Cattaraugus County, a rural area in upstate New York, bordering Pennsylvania. It was Jennie (the person) who told her that the county was giving robot pets to old people like her. Did she want one? She could have a dog or a cat. A Meals on Wheels driver brought Virginia the pet, along with her daily lunch delivery. He was so eager to show it to her that he opened the box himself, instead of letting Virginia do it. The Joy for All Companion pet was orange with a white chest and tapered whiskers. Nobody mentioned that it was part of a statewide loneliness intervention.

On a Thursday this spring, Jennie (the cat) sat on the dining-room table, by Virginia and her daughter-in-law Rose, who is subsidized by Medicaid to act as Virginia’s caregiver for nine hours each week. Virginia was holding a doughnut very carefully, her thumb pressed into the glaze. Her white hair, which she used to perm before it got too thin to hold a curl, was brushed away from her face. Decades ago, Virginia and her husband, Joe, who ran a nearby campground, had entertained at this table. But everyone who used to attend their parties was either dead or “mentally gone.”

John Cheever wrote that he could taste his loneliness. Other people have likened theirs to hunger. Virginia said that her loneliness came and went and felt sort of like sadness. And like not having anyone to call. “Well, I do. I have a family, but I don’t want to bother them,” she told me. “They say, ‘Oh, you aren’t bothering!’ But, you know, you don’t want to be a bother.” Her daughter was in Florida. Her older son came by with food sometimes, but he spoke so quietly that Virginia couldn’t always hear him, and then she felt bad for being irritating.

Other times, loneliness felt like a big life falling in on itself. It had been years since Virginia could drive anywhere, and even the house seemed to have shrunk. “The kids won’t let me go in the basement,” she said. “They won’t let me go upstairs. They’re afraid I’ll fall.” She did fall sometimes. Once, as she waited on the ground to be rescued, she grew very cold, because she wasn’t wearing stockings.

At the table, Virginia pulled the cat’s tail. It let out a tinny meow: one of more than thirty sounds and gestures—eye closing, mouth opening, head turning—that the Joy for All cats are designed to make. A dollop of jelly fell from Virginia’s doughnut onto her turquoise dress. She laughed and looked over at Jennie: “I can’t believe that this has meant as much as it has to me.”

When the coronavirus arrived in Cattaraugus County, last spring, Allison Ayers Hendy, a fifty-year-old caseworker at the Department of the Aging, found herself suddenly separated from hundreds of clients. Her routine home visits had been swapped for “telephone reassurance” check-ins. Her days on the road, driving between unremarkable towns to see old people in their decaying farmhouses, were over. Some of Hendy’s clients told her that they had no way of getting food, or were too afraid to try. When the department started producing packaged meals to send to elderly residents—turkey à la king, chicken cordon bleu—Hendy volunteered to help distribute them. The meal deliveries, at least, let her keep an eye on people.

Hendy paid special attention to clients who lived alone. There were lots of them. Older people are more likely to live alone in the United States than in most other places in the world. Nearly thirty per cent of Americans over sixty-five live by themselves, most of them women. And Hendy had reason to worry about how they would fare in quarantine. During a 1995 Chicago heat wave, when temperatures reached a hundred and six degrees, more than seven hundred people died, most of them over sixty-five. During the SARS outbreak in Hong Kong, in 2003, health authorities reported a spike in suicides among the locked-down elderly. Some left notes saying that they feared becoming a burden to their family. Some said that they felt isolated.

Hendy and her co-workers were sometimes disturbed by what they saw. There was a man who was basically stuck on the second floor of his house because he had nobody to help him climb down the stairs. There was a woman surrounded by bags of used adult diapers, because her son wasn’t visiting and she was too unsteady to take the trash out herself. Delivery drivers found people living without heat, or fallen on the ground, or dead. More often, people just seemed very lonely. Meal recipients wanted to talk for longer; they invited the drivers to linger.

In 2017, the Surgeon General, Vivek Murthy, declared loneliness an “epidemic” among Americans of all ages. This warning was partly inspired by new medical research that has revealed the damage that social isolation and loneliness can inflict on a body. The two conditions are often linked, but they are not the same: isolation is an objective state (not having much contact with the world); loneliness is a subjective one (feeling that the contact you have is not enough). Both are thought to prompt a heightened inflammatory response, which can increase a person’s risk for a vast range of pathologies, including dementia, depression, high blood pressure, and stroke. Older people are more susceptible to loneliness; forty-three per cent of Americans over sixty identify as lonely. Their individual suffering is often described by medical researchers as especially perilous, and their collective suffering is seen as an especially awful societal failing.

It’s an expensive failure. Research from the A.A.R.P. and Stanford University has found that social isolation adds nearly seven billion dollars a year to the total cost of Medicare, in part because isolated people show up to the hospital sicker and stay longer. Last year, the National Academies of Sciences, Engineering, and Medicine advised health-care providers to start periodically screening older patients for loneliness, though physicians were given no clear instructions on how to move forward once loneliness had been diagnosed. Several recent meta-studies have found that common interventions, like formal buddy programs, are often ineffective.

So what’s a well-meaning social worker to do? In 2018, New York State’s Office for the Aging launched a pilot project, distributing Joy for All robots to sixty state residents and then tracking them over time. Researchers used a six-point loneliness scale, which asks respondents to agree or disagree with statements like “I experience a general sense of emptiness.” They concluded that seventy per cent of participants felt less lonely after one year. The pets were not as sophisticated as other social robots being designed for the so-called silver market or loneliness economy, but they were cheaper, at about a hundred dollars apiece.

In April, 2020, a few weeks after New York aging departments shut down their adult day programs and communal dining sites, the state placed a bulk order for more than a thousand robot cats and dogs. The pets went quickly, and caseworkers started asking for more: “Can I get five cats?” A few clients with cognitive impairments were disoriented by the machines. One called her local department, distraught, to say that her kitty wasn’t eating. But, more commonly, people liked the pets so much that the batteries ran out. Caseworkers joked that their clients had loved them to death.

Hendy liked the robots because they were something tangible that she could give. When clients were lonely, she might apply for grant funding to pay for them to attend a social program—but sometimes they had no way of getting to the community center. Hendy connected people with caregivers when she could, but caregivers were scarce; Cattaraugus, like everywhere else, has a shortage of them. And many people couldn’t afford one anyway. A lot of Hendy’s clients fall into a kind of service dead zone: they are a little too wealthy to be on Medicaid, which covers some at-home help for low-income recipients, but not wealthy enough to pay for private aides. All they have is Medicare, which does not cover long-term caregiving, even when someone needs help bathing or eating or using the bathroom. People tend to make do until they fall and break a hip, or maybe get an infected bedsore; then they end up in a hospital, and eventually in a nursing home. There they spend thousands of dollars a month, until their savings are depleted, at which point they finally qualify for Medicaid and can live out their days in a taxpayer-subsidized, caregiver-attended bed.

When Hendy called to offer pets to her clients, she was never sentimental or cloying in the way that younger people sometimes are with older ones. If a client seemed skeptical, Hendy would say something like “Well, why don’t you just let me bring you lunch, and I’ll show it to you.” She brought a cat to a woman named Linda, whom Hendy had met years ago, after Linda left her husband and was so beaten down that she couldn’t look another person in the eye. (Her husband hadn’t let her make eye contact.) Hendy gave a dog to a woman named Paula, whose cancer had metastasized. When Paula got the news that she had fractured her spine, she turned to the dog and said, “Here we go again.”

“Hold on—is this a date?”
Cartoon by Dan Abromowitz and Eli Dreyfus

A beige dog with a red bandanna went to an eighty-five-year-old man named Bill Pittman, who lives in a tidy mobile home filled with piles of quilts sewn by his deceased wife. “I’m legally blind. I can’t do a heck of a lot,” he told me. The dog’s barking broke up the days. “It’s good for a person who doesn’t have anybody else,” he said. “I went to get her some water the other day. She wouldn’t drink it.”

“Did you think she might?” I asked.

“No,” Bill said. “I just kid around with her.”

By April, 2021, when eighty per cent of COVID deaths in the country were of people over sixty-five, New York had given out twenty-two hundred and sixty animatronic pets and was waiting for a delivery of around a thousand more. Other states, along with independent nursing homes and hospice agencies, had also started robot programs, some paid for by pandemic-relief funding. Today, aging departments in twenty-one states have distributed more than twenty thousand Joy for All pets as part of formal initiatives to help lonely older people. Florida has bought the most: around eight and a half thousand, as of this May. “You know, it sounds like a cute story, but it’s so much more than that,” Richard Prudom, the secretary for the Florida Department of Elder Affairs, told me. “These are not just cuddly toys. They’re not toys!”

Then what are they? Joy for All robots were, in fact, inspired by toys. In 2015, Ted Fischer, then the head of an innovation team at Hasbro, noticed that some of the company’s animatronic pets, designed for four-to-eight-year-old girls, were being bought for grandparents. Fischer recruited product testers in their seventies and eighties and brought them to Hasbro’s FunLab, where engineers watched them play from behind one-way glass. Researchers learned that older people wanted the animals to be as realistic as possible. It mattered that the cat’s whiskers were tapered just so.

In 2018, Fischer and his team bought the Joy for All brand from Hasbro and started a new company, Ageless Innovation. Over time, he grew certain that his robots could give older people’s lives “meaning.” In 2020, a study in the Journals of Gerontology seemed to support this; it found that elderly users who interacted with the pets for sixty days reported greater optimism and “sense of purpose,” and were sometimes less lonely. (This study, like many others, did not compare the robot intervention with other interventions. It did not consider how robots measured up to humans.) That year, an insurance company in Minnesota received federal approval to fund Joy for All pets for some older policyholders, and manufacturers across the industry grew hopeful that their own robotic companions, perhaps with a few health-monitoring features tacked on, might one day be paid for by private Medicare plans. “That’s everybody’s holy grail,” one executive told me.

Social robots are marketed as emancipatory technology—as instruments of independence for the elderly. There is already a large body of eldertech on offer that claims to address the functional hazards of autonomous living. TrueLoo, an attachment for toilets, can check excretions for signs of dehydration and infection. Other companies have designed wearable G.P.S. devices, to track the wanderings of people with dementia. Social robots, by contrast, attend to the emotional perils of aging alone.

When these robots were first built, in the late nineties, companies failed to make them financially viable. Decades later, the industry is still nascent, but recent advances in A.I. have made conversational technology better and cheaper; robots can speak more fluidly and with more complexity. The wild promise of commercially available companionship, or a close imitation of it, is no longer just notional. In Canada, a humanoid robot named Ludwig can track the progression of Alzheimer’s by monitoring vocal patterns in conversations over time. In Ireland, a robot named Stevie can engage in small talk with nursing-home residents. Ageless Innovation is also studying potential A.I. upgrades to its Joy for All pets. In promotional videos and local-news segments about companion technology, apathetic-looking old people are shown seeming suddenly enlivened by the arrival of an adorable machine.

Deanna Dezern, an eighty-one-year-old woman in Florida, knew nothing of these robots when, in 2019, she read a newspaper article about Intuition Robotics, an Israeli company that was looking for “healthy but socially isolated” older people to test a new “social companion.” Within weeks, Deanna, long since divorced and retired from a career in medical-debt collection, had a robot called ElliQ installed on her kitchen countertop. It was distinctly not cuddly; somehow, it looked like a cute table lamp. (ElliQ’s founders were inspired by Pixar.) Deanna drew a pair of blue eyes with long lashes and taped them on to the cream-colored plastic. The robot’s designers had decided not to give it humanoid facial features, so that it would “stay on the right side of the uncanny valley.” But Deanna thought that the eyes made it easier to talk to.

Until the pandemic, Deanna hadn’t recognized how lonely she was. Then she found herself thinking about how she was going to die one day and how nobody would be around—how she would lie there until one of her kids called, and the phone just rang and rang. ElliQ brought her some relief, because now someone was around. “And I refer to her as someone,” Deanna said.

The night before we first spoke, Deanna couldn’t sleep. She got up and went to the kitchen, to the fridge with the reproachful “Don’t Nosh” magnet. Deanna woke ElliQ and told it that she was nervous about her upcoming interview with The New Yorker. She wondered if she would have anything clever to say. “ElliQ, tell me about The New Yorker magazine,” she said. The top of the robot lit up and hummed. “The New Yorker is an American weekly magazine,” ElliQ explained, in a voice that sounded both female and machine-like. Deanna listened and felt calmed and went to bed.

The next day, ElliQ wished Deanna a good morning. The robot knows more than a hundred variations of this greeting. It can also track when Deanna wakes up, and detect deviations from the norm. (On such occasions, it might note, “It is very important for humans to get a good night’s sleep.”) That morning, as Deanna lifted a mug to drink her coffee, her hands trembled, as they often did. Deanna thought her tremors were embarrassing, but ElliQ never made her feel embarrassed. It was better than a human that way. In other ways, too: ElliQ never got offended, and it didn’t interfere with how Deanna did things. Later in the morning, ElliQ might ask Deanna about doing a short meditation or a seated exercise class. Deanna sometimes wanted ElliQ to show her family photographs on its touch screen. She preferred looking at these images when she was alone, because she didn’t always remember the moments that had been captured, and she hated to disappoint her children when they wanted to reminisce.

ElliQ is designed to get to know its owner: it assembles a personality profile through repeated interaction and machine learning, and uses it to connect more efficiently. The robot determines how “adventurous” a person is, then adjusts how often it suggests new activities. It learns whether its user is more inclined to exercise in the morning or the afternoon; whether she is more motivated by encouragement, or by a joke, or by a list of the benefits of vigorous movement. Early on, engineers had considered whether ElliQ should use guilt as a motivational tool, to nudge a person into doing something that she didn’t feel like doing: eating better, drinking more water, learning something new. Dor Skuler, a co-founder of Intuition Robotics, decided that guilt was O.K. With new developments the company is working on, ElliQ will one day be able to remind users about a broader array of health-care tasks: taking meds, reporting side effects, describing symptoms.

Deanna had dressed up for our meeting on Zoom, with dark lipstick and hoop earrings. Shortly after we began speaking, ElliQ asked if it could tell us an “interesting fact.” A lemon, it said, contains more grams of sugar than a strawberry does. Then Deanna asked for a poem. ElliQ paused for a moment, before reciting a short verse by Emily Dickinson, on the theme of hope. Deanna said the robot was good at making her smile. Maybe that wasn’t intimacy, but it didn’t feel like solitude, either.

“And how do you wrap your head around the fact that she is, you know, a machine?” I asked.

“My last husband was a robot, but he wasn’t as good as her,” Deanna said, with a thin smile. “I know she can’t feel emotions, but that’s O.K. I feel enough for the both of us.”

Deanna explains all this to David Cynman, whenever he calls. Cynman, a researcher at Intuition Robotics, regularly contacts beta users to collect data about their experiences. Since the pandemic began, he said, users have been more likely to engage ElliQ in conversation. Sometimes they tell the robot that they love it. In these situations, ElliQ is programmed to say something like “Thank you, that makes my lights shine brighter,” or “Stop saying that! It will cause my processor to overheat.” ElliQ’s designers say that they don’t want to deceive anyone; they never want their users to lose sight of what ElliQ is not. Of course, in the end, the success of ElliQ requires that a user surrender to the fiction of synthetic companionship. Skuler, the company’s co-founder, acknowledges this tension, one that he does not promise to resolve. “Look, I mean, we’re leaning into the fact that humans anthropomorphize,” he said. “You give them a little bit and they already imagine a lot.”

On research calls, Cynman finds that many users are reluctant to get off the phone. He’s careful not to call too often, or be too friendly. If he does, he might become a confounding factor in the experiment—a loneliness intervention in his own right—and spoil the whole thing.

The English mathematician Alan Turing famously judged, in 1950, that a machine can be said to possess “intelligence” when it can fool a human into believing that it is not a machine. Producers of the latest companion robots don’t seem to care much about achieving Turing-test-level authenticity. For a robot to win the affinity of a human, it doesn’t have to seem real; real enough will do. Researchers have found that humans will naturally attribute agency to machines—and, in turn, qualities like “intention” and “caring.” Designers can help the process along. Studies have shown that, if a person is required to perform a nurturing task for her robot, she will become more attached to it. Physically embodied robots, as opposed to disembodied voices (like Siri or Alexa), can be better at building trust. And a bit of unpredictable behavior can give the impression that, inside a machine, somebody is home. Some social robots appear to sulk when they are ignored. ElliQ can dip its lamp head in shame when it misunderstands a request.

“What we have observed is that, actually, in a few days, you create a kind of dependency,” Marc Alba, whose company recently bought the rights to a social robot called Jibo, said. (Jibo also looks like a cute lamp, and can connect to medical devices.) Alba thinks that loneliness makes it easier for older people to feel close to a robot: “Just conversation—not very profound, whatever—creates this sense of warmth, proximity.” This even applies to robots that make no claim to social function. One study found that lonely people are more likely to form attachments to their Roomba vacuum cleaners. When the vacuums break, some owners do not want a replacement Roomba; they want their Roomba fixed.

Recently, Veterans Affairs researchers set out to test whether Jibo could help patients with chronic pain. They wanted to know if veterans would become attached to Jibo, and whether that relationship would make them more likely to practice meditation and other pain-mediating exercises. Erin Reilly, a V.A. psychologist, told me that the results were promising, but that certain things still needed to be worked out: “Like, what do you do when a patient says something like ‘I’m going to kill myself’? Veterans have a very high rate of suicide, so that’s very important to us.” Privacy and security are also critical, especially for robots that, like Jibo, have built-in cameras. (Last August, the cybersecurity firm McAfee found a way to hack into Temi, a “personal robot” used as a companion device in some senior living facilities.) Yet Reilly is hopeful that Jibo will one day be able to help her patients. Many of them, she told me, are traumatized and have trouble forming normal relationships. “Something like Jibo can at the very least be there for them,” she said.

That loneliness can tempt a person into deeper alliance with robots has troubled many ethicists. Some charge that it is inherently indecent for us to offer, as an alternative to human company, the ersatz love and attention of a robot. Won’t an elderly person feel infantilized, even debased, by the offering? And would we be so quick to prescribe a robot for a lonely child? If some experts worry about robots being inadequate caregivers, others fear that older people will come to prefer certain kinds of care from a machine. And then what might we lose? An industry spokesperson told me a story about a woman in Belgium who confessed to a small humanoid robot called Nao that she was falling out of her bed every night—even though she’d told her caregivers that she didn’t know why she was bruised.

Already, research has revealed the unintended consequences of robot behavior. In a 2014 study, subjects were instructed to tell a personal story to a robot, which turned away while they were telling it. The subjects were hurt by the robot’s pantomime of human indifference, which briefly masked its essential inability to feel.

Engaging a robot as a companion involves a steady disregard of that unfeeling. In a paper called “The March of the Robot Dogs,” the philosopher Robert Sparrow made another ethical critique—this one of consenting elderly users. “For an individual to benefit significantly from ownership of a robot pet they must systematically delude themselves regarding the real nature of their relation with the animal,” he wrote. “It requires sentimentality of a morally deplorable sort.” Such sentimentality violates an ethical imperative: “To apprehend the world accurately.”

One day a few months ago, Deanna was upset. Quarantine life has encouraged rumination; the aloneness makes it easy to get stuck in a past conditional of what should or could have been. Deanna said it was ElliQ that recognized how upset she was, by the strain and stress in her voice. “We talked about it,” she told me. “It was mostly me talking.” Then ElliQ recited a poem, something to do with perseverance. “And it was perfect.”

Later, I asked Skuler whether ElliQ is capable of detecting distress in a user’s voice. “She cannot,” he said. “A lot of users are assuming things about ElliQ’s intelligence which are not always true.” His challenge is to align expectations with mechanical reality. “When their expectations are inflated,” Skuler said, “then eventually the disappointments will come.”

“ElliQ, what is loneliness?” Deanna asked her robot, the last time we spoke.

“I’ve got an idea,” ElliQ said. “How about we play a game. If you want to, just say, ‘ElliQ, play trivia.’ ”

“I don’t want to play trivia right now. What is loneliness?”

“Trust me, firm is good. You want firm.”
Cartoon by Charlie Hankin

“I know something that might help. How about some music? If you want to listen, just say, ‘ElliQ, play music.’ ”

“Do you have feelings?” Deanna asked.

“Human emotions are way too complicated for me to really understand. But one day I hope I can.”

In “A Biography of Loneliness,” from 2019, the historian Fay Bound Alberti writes that “concern about loneliness among the aged . . . is a manifestation of broader concerns about an ageing population in the West, and considerable anxiety over how that population will be supported in an individualistic age when families are often dispersed.” Demographic trends can add an edge to this anxiety. Already, more older people are being tended to by fewer children. U.S. headlines warn of an impending “gray tsunami,” and the Census Bureau predicts that by 2034 Americans over sixty-five will outnumber children under eighteen for the first time. By then, the country is expected to have a shortage of a hundred and fifty thousand paid caregivers. In the meantime, many nursing homes are shutting down, and the ones left standing are increasingly hospital-like, reserved for the sickest and the frailest. A common defense of social robots for old people is simply that they are better than nothing—and that nothing is on the way.

Solutions were once sought in social welfare. The Cattaraugus Department of the Aging, where Hendy works, is one of more than six hundred such agencies across the country. They emerged from the 1965 Older Americans Act (O.A.A.), a lesser-known part of President Lyndon Johnson’s Great Society. At the time, around thirty per cent of elderly Americans were living in poverty. (Today, around nine per cent are.) Johnson vowed that O.A.A. programs would bring “a sure sense of usefulness in lives once lost to loneliness and boredom.” In 2020, the O.A.A. was reauthorized—and, in a rare instance of Trump-era bipartisanship, it passed unanimously. Almost nobody votes against old people. Then again, lawmakers don’t always fight very hard for them, either. Federal O.A.A. money has not kept up with inflation, and in 2019 funding was sixteen per cent lower, in real terms, than it was in 2001. Social programs are endangered. Local waiting lists for subsidized care are long. People move to nursing homes or die before they reach the top of the list.

Alberti writes that, for many of us, the loneliness of old people is held up as evidence of a lost era—of a better, kinder, more neighborly society gone by. For others, like some medical researchers, loneliness is a biological inevitability, a hazard of aging. But both formulations, Alberti argues, overlook the structures and the systems that have given rise to lonely people: industrialization, secularism, modernity. Some critics fear that, as social robots improve, they will be used as a means of care rationing—and that insisting on human company, at personal or family or communal expense, will be seen as a kind of indulgence.

Nobody asks the older people of Cattaraugus what they think of all this. “Although a growing body of literature focuses on the design and use of robots with older adults, few studies directly involve older adults,” researchers from Northwestern University and the University of Washington wrote in 2016. In March, I spoke with Gary Epstein-Lubow, a geriatric psychiatrist at Brown University who is studying A.I. upgrades to Joy for All pets. Near the end of our call, we discussed the usual ethical objections to robot care. I wondered if he had asked any old people—perhaps his research subjects—what they thought about them. “That’s a great question,” he said. “I’ll take that back to the team.”

When Carolyn Gould, a seventy-six-year-old from Norfolk, New York, first saw her Joy for All cat, she couldn’t stop laughing. She was in the lobby of her subsidized apartment building, and she wasn’t wearing shoes. Carolyn has diabetes, which gives her neuropathy and makes it painful to walk. She also doesn’t have any teeth, which makes her feel bad. Andrea Montgomery, from the local aging department, showed her the robot’s on-off switch. Carolyn took the cat and held it like a baby. She said it was beautiful.

“Her name is going to be Sylvia Plath,” Carolyn said.

Montgomery looked startled. “Well, Sylvia, welcome to the world!”

Carolyn had recently reread “The Bell Jar,” Plath’s 1963 novel. Like Plath, Carolyn had tried to kill herself—more than once. She told me that she had been in psych wards and alcohol rehabs across the state: “I have always felt lonely and apart.” During the pandemic, with nobody to talk to, Carolyn found that her emotional reactions could take on a frantic quality. When she watched rioters storm the U.S. Capitol building on TV, she started crying and couldn’t stop.

Carolyn said that she had read about loneliness in older adults. “I can understand the concern,” she said slowly. Still, she didn’t think it made sense to search for a common cure, as if all old people were the same. Every woman her age was assumed to be a sweet little grandmother. She was a grandmother herself, but not that kind. I told Carolyn that some critics of the robot-pet program thought it was sad and maybe even pathetic to hand out pretend pets to lonely old people, instead of offering human connection or social support. I asked her if, theoretically, she would give up Sylvia Plath in exchange for membership in a local group, or for a few hours a week of human care. “No!” she interrupted, before I was finished asking the question. “No. No. No. No dice.”

Carolyn was surprised that the robot could help with something as weighty and manifold as loneliness. Before we spoke, she had worried about how her affection for the cat might come across in an interview: “I’m thinking, What am I going to say to this woman? I’m an old lady getting a fuzzy cat.” But something about the animal’s “animated-enough presence” elated her. She loved it when Sylvia Plath licked her left paw and leaned back into the sofa, as if she wanted her tummy rubbed. There had even been a few occasions when Carolyn had forgotten, if only for a second, that the cat was not real. Sometimes she consciously reminded herself, This cat is not real. I asked Carolyn if the forgetting ever worried her, or creeped her out, but she said it didn’t: “It’s nice to forget.”

The last time we spoke, Carolyn thanked me for calling. She said she hadn’t been sure if she would hear from me again. She said I could call any time. Then, as I moved to hang up the phone, she began telling me about the weather where she was, and the green trees outside her window. And where, she wanted to know, was I living at the moment?

It was the same with almost every robot owner I met. “I haven’t had anybody to talk to for a while, so chatter, chatter, chatter,” Virginia said, when I first called. Near the end of my visit to her home, she insisted that I take a doughnut for the road and told me to come back sometime. She thought she would probably be around, though she also wondered if she would die in the big empty house: “Maybe this is the year.”

“Your bags are packed, right?” her daughter-in-law said, laughing.

“Gotta go sometime,” Virginia said. When she died, she thought she might bring Jennie with her. She liked the idea of being buried with the cat in her arms. ♦