In 2020, a young Chinese software engineer in Hangzhou chanced upon an essay about lip-syncing technology. Its premise was relatively simple: using a computer program to match lip movements with speech recordings.
But his grandfather, who died nearly a decade earlier, came to mind.
“Can I see Grandpa again using this technology?” Yu Jialin asked himself.
His journey to recreate his grandfather, documented in April by investigative journalist Tang Yucheng for the state-owned magazine Sixth Tone, is one of several accounts now surfacing in China of people using artificial intelligence to resurrect the dead.
Mixing an assortment of emerging AI technologies, people in the country have been building chat programs — known as griefbots — with the personalities and memories of the deceased, hoping for a chance to speak to their loved ones again.
For Yu, they presented a chance to speak his final words to the man who helped raise him.
The software engineer, now 29, told Tang that he was 17 when his grandfather died.
He still regrets two instances when he was harsh to his grandfather. Yu yelled at the older man for interrupting a gaming session once, and on another occasion told his grandfather to stop picking him up from school, Tang reported.
His family stopped mentioning his grandfather after he died, he told Tang. “Everyone in the family was trying their best to forget Grandpa rather than remember him,” Yu said.
The griefbot rides the ChatGPT craze
The griefbot concept has been trialed for years — largely as AI-powered programs that learn how to mimic human beings through their memorabilia, photos, and recordings. But generative AI’s rapid advancement in the last year has pushed the power and accessibility of griefbots to a whole new level.
Older models required vast sets of data. Now, laymen or lone engineers like Yu can feed language models with tidbits of a person’s past, and recreate almost exactly how they look, speak, and think.
“In today’s technology, you don’t need too many samples for an AI to learn the style of a person,” Haibing Lu, an information and analytics professor at Santa Clara University, told Insider.
Systems like ChatGPT, the popular text-based program that closely imitates human speech, have already learned how most people naturally speak or write, said Lu, whose research focuses on AI.
“You only need to tweak the systems a little bit in order to loosely get a 99% similarity to your person. The stark differences will be minimal,” Lu said.
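The light-touch adaptation Lu describes is often done in practice not by retraining a model but by few-shot prompting: pasting a handful of a person's real messages into the prompt so a general-purpose model continues in their voice. A minimal sketch of assembling such a prompt, with all names and snippets hypothetical, not drawn from Yu's actual data:

```python
# Sketch: build a few-shot "persona" prompt from a small sample of a
# person's saved messages. Fed to a general-purpose language model,
# a prompt like this nudges it to reply in the same voice; no
# retraining is needed. All text below is hypothetical.

def build_persona_prompt(name, samples, question):
    """Assemble a few-shot prompt from saved messages."""
    lines = [f"The following are messages written by {name}.", ""]
    for sample in samples:
        lines.append(f"{name}: {sample}")
    lines.append("")
    lines.append(f"Reply to this message in {name}'s voice: {question}")
    return "\n".join(lines)

# Hypothetical snippets standing in for old letters or text messages.
samples = [
    "Eat well at school. Don't skip lunch to play games.",
    "The teahouse show was good tonight. You should watch it with me.",
]
prompt = build_persona_prompt("Grandpa", samples, "Guess who I am?")
print(prompt)
```

In a real griefbot, the assembled prompt would be sent to a hosted model's chat endpoint; with only a few samples the mimicry is shallow, which is consistent with the generic replies Yu's early bot produced.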
For Yu to teach his AI model what his grandfather was like, he retrieved a trove of old letters from his grandmother. She’d exchanged them with Yu’s grandfather when they were young, and they revealed a side to the man that even Yu hadn’t glimpsed as a child, he told Tang.
The software engineer dug up photos and videos shot more than a decade ago, and found text messages his grandfather sent him, Tang reported.
Yet even given weeks of testing and training, the tech has a long way to go if humans expect something akin to Black Mirror’s robot replicas. Yu’s bot was clearly limited, and took 10 minutes to respond to each prompt, Tang reported.
“Hey, Grandpa. Guess who I am?” Yu asked the program at one point.
Grandpa delivered a generic response.
“Who you are is not important at all. Life is a beautiful miracle,” the bot wrote back, according to Tang.
But as Yu fed the AI with more information about his grandfather, it started to show a more accurate representation of the man’s habits and preferences. For example, it remembered his grandfather’s favorite show, he told Tang.
“Happy Teahouse went off the air,” Yu told the chatbot.
“That’s a shame. The show I want to watch the most is no longer available. I would have liked to watch a few more episodes,” the grandfather bot replied.
That was the moment when Yu felt he’d gotten somewhere, he told Tang. The program was eventually sophisticated enough that Yu felt confident he could show his work to his grandmother. She watched silently as her late husband responded to her questions, then thanked her grandson, stood up, and left the room.
Yu told Tang his grandmother needed the chatbot to process her emotions and mourn. “Otherwise, why would she thank me?” he said.
As for himself, he declined to share his intimate conversations with his grandfather bot.
“But I think my grandfather forgave me in the end,” he told Tang.
Mourning with the times
It’s natural for humans to change the way they mourn as technology evolves, Sue Morris, director of bereavement services at the Dana-Farber Cancer Institute in Boston, told Insider.
In the 1980s, people would write down stories about their loved ones to remember them, said Morris, who teaches psychology at Harvard Medical School. Now, it’s far more common in the digital age to keep photos and videos of the departed, she said.
Psychologists often help grieving clients by asking them to speak to an empty chair as though their loved one were sitting in it, and to imagine the person’s response.
“It feels as though these griefbots are the technological step up from that,” said Morris.
But griefbots take substantial control away from the user, she added. Many people deal with grief by controlling how and when they process their emotions.
“You choose when you’re going to look at your photos and videos, how long you’re going to look at them,” she said.
An unexpected trigger, like, say, an insensitively timed message from a chatbot, can often overwhelm someone with grief, Morris said. “Maybe 98% of the time, the program is going to say the appropriate thing, but what if it doesn’t for a small percentage? Could that then send somebody into more of a downward spiral?” she said.
Still, if griefbots sound offensive to some, history shows that social norms constantly change when it comes to the dead, Mary Frances O’Connor, director of the Grief, Loss and Social Stress Lab at the University of Arizona, told Insider.
When photography became accessible to the public in the 19th century, people would take pictures with their loved ones’ corpses and hang the photos in their living rooms, said O’Connor.
“Today, we might think this living room display was morbid, but it was common at the time,” O’Connor said.
GPT-powered griefbots gain ground in China
As generative AI gains traction in China, so have stories of new griefbots. Another Chinese man who used AI to “resurrect” a loved one — a 24-year-old Shanghai blogger going by the name Wu Wuliu — went viral on social media in March when he said he’d trained a chatbot to mimic his dead grandmother.
Like Yu’s grandfather bot, Wu’s bot produced limited responses. “But I feel good being able to look at grandma and talk more with her,” he said.
Wu said he used ChatGPT, though access to the platform has been limited in China since February 24.
“I wish I’d seen this video sooner,” a top comment on Wu’s page read. “My grandmother passed away last winter. I was caught off guard. I don’t have any audio recordings or high definition photos of her.”
And during this year’s annual Tomb Sweeping Festival, a Chinese cemetery used GPT software and voice cloning AI to recreate people who were being buried at its facilities, YiCai reported. The cemetery said thousands of people have used its platform, and that it costs around $7,300 to recreate a dead person, per YiCai.
Seeking human connection from a virtual bot has become common in China. Xiaoice, a 2018 Chinese chatbot assistant that takes the appearance of a teenage girl, has more than 660 million users. She can act as a confidant or friend, and can receive gifts from fans, said Microsoft, which runs the flagship bot.
Earlier versions of griefbots have established footholds elsewhere in the world. Several companies and research projects in the US have offered griefbots, such as Replika, which now markets itself as a social AI app.
In Canada, a man named Joshua Barbeau digitally remade his girlfriend in 2021 using Project December — an older program built with the predecessor to current GPT software. Barbeau’s girlfriend died from a rare liver disease eight years earlier, and he told The San Francisco Chronicle that speaking to the chatbot helped him heal from his loss.
Then there’s the South Korean documentary “Meeting You,” which featured a young mother tearfully reuniting with her deceased 7-year-old daughter in virtual reality. Viewers worried the show was emotionally manipulative, though the mother in the episode was thankful for the experience, and said it was like she had a “nice dream.”
Griefbots are bound to be controversial
Yet the griefbot and its byproducts can pose serious ethical dilemmas, said Lu, the information and analytics professor.
A dead person’s identity can be easy pickings for a fraudster, he said. They can feed that person’s data to an AI, then pretend they’re a medium who communicates with the person’s spirit, Lu said.
“And there’s no scientific proof that a psychic’s powers are valid, right? No one can invalidate that,” he said.
Then there’s the challenge of getting consent from the dead, Lu said.
“In a future where everyone knows about this technology, maybe you can sign a document that says your descendants can use your knowledge, or to forbid it,” Lu said.
HereAfter.AI, a US-based company, offers an opt-in experience for people to upload their own personalities online. An AI learns about each person through submitted photos, audio logs, and questionnaires, and makes a digital avatar that can talk to their friends and families after they die.
Its founder, James Vlahos, spent months recording his terminally ill father recounting memories and reminiscing about life, feeding them to a “Dadbot” that could live on when the man could not.
But Lu said there’s little chance that the typical person who dies nowadays would have given that kind of go-ahead. And if they haven’t, it would be problematic even for their children or grandchildren to use their personal information, he added.
“It doesn’t mean that if a person has passed away, that other people have the right to disclose their personal privacy, even if it’s to immediate family members,” Lu said.
As for Yu, the software engineer, his grandfather bot is no more. He deleted it, telling Sixth Tone that he was afraid of becoming overly reliant on the AI for emotional support.
“These emotions might have overwhelmed me too much to work and live my life,” he told Tang.