'I love you robo-dad': Meet a family using AI to preserve loved ones after death
The product is part of a burgeoning AI industry centered on "grief tech."
Once a week, 9-year-old Jayce Gowin and his dad sit down at their living room table to discuss anything from a favorite flavor of Cheetos to life advice. But Jayce doesn't talk to his dad -- instead he interacts with an artificial intelligence-powered chatbot trained to mimic his father.
Jayce sits next to his real-life dad, Jason Gowin, who types questions into a laptop on Jayce's behalf; the two then wait for an audio response from the AI. Gowin said he sought out the technology after serious health scares left him and his wife concerned about leaving Jayce and his young siblings parentless.
On a Sunday morning last month, Jayce wanted to know whether the bot remembered his favorite stuffed animal, Bob.
"Yes, Jayce," the AI uttered in a voice meant to closely resemble his father's. "Bob is a mouse character in your bedtime stories. We've talked about him before and how he has a voice."
Jayce was thrilled. "I love you robo-dad," he said.
The members of the Gowin family volunteered to test the product made by a startup called You, Only Virtual, in exchange for permanent access free of charge. The firm is one of a handful of companies in a burgeoning industry centered on so-called "grief tech," chatbots modeled off of deceased loved ones.
The products aim to redefine how some people deal with grief, offering users a chance to carry on conversations with loved ones long after they die. Proponents hold up the once-unfathomable technology as an emblem of the transformative benefits promised by AI.
Critics, however, raise concerns about the privacy of data shared by users and the technical feasibility of a bot that accurately speaks with the voice of a dead relative. Some products in the sector simply repeat statements recorded by a loved one prior to his or her death; others, including the product used by the Gowin family, generate original statements.
Generative AI tools like ChatGPT -- which scan text from across the internet and string words together based on statistical probability -- have displayed a propensity to share arbitrary, false or hateful speech, raising alarm about the personal and societal effects of noxious words delivered with the intimacy and authority of a deceased loved one.
"There's a real need for many people to have an interaction that they can't have," Gary Marcus, an emeritus professor at New York University and author of the book ''Rebooting AI," told ABC News. "Loneliness is a real problem. Grief can be very challenging."
"In some sense, everybody who's using it now is part of an experiment and not a very well-controlled experiment," Marcus added. "The more people you have in that, the more you have to worry about: What are the consequences?"
Jason and his wife, Melissa Gowin -- who live with their three children in the small town of Sayre, Pennsylvania -- turned to grief tech after each of them suffered a brush with death.
In 2019, Melissa Gowin had a stroke days after giving birth to twin boys, the couple said. Soon afterward, a doctor told her she had as little as two years left to live, they added.
Four months later, Jason Gowin was diagnosed with an early stage of stomach cancer, putting his health in doubt as well, they added. Suddenly, the couple had to reckon with the prospect of their young children growing up without parents.
"When you're in your 20s and 30s, you think you're invincible. This kind of thing is something you only hear about on the news, said Jason Gowin, a stand-up comedian and podcast host. "Turns out those stories on the news happen to real people -- and it happened to us."
While watching the Superman movie "Man of Steel," the couple happened upon a way to address their difficult situation, as seen through the eyes of the main character.
"[Superman] goes into the Fortress of Solitude and talks to what is essentially an AI version of his father," Jason Gowin recounted. "I said, 'You know what, I bet somebody has come up with this technology.'"
He said he found a website for the Los Angeles-based You, Only Virtual, and contacted its CEO, Justin Harrison.
"Talk about Americana: Mom, dad, three kids, house full of pets; some of the most loving, funny, just joyful people I've ever met," Harrison told ABC News about meeting the Gowin family. "Then they told me their story."
The company creates what it calls "versonas," virtual reproductions of deceased loved ones that respond in conversation with a user either through written chats or through audio that mimics the relative's voice.
Unlike some of its competitors, You, Only Virtual produces generative chatbots that craft original responses after scanning training materials, such as text messages, videos, and information about the deceased gleaned from previous conversations with users.
"It's very simple," Jason Gowin said. "You take text messages, or video or audio clips, and you upload them into this portal. It gets turned into their algorithm that they have built for each personality."
He acknowledged some concerns about data privacy, but said the family is careful to avoid putting sensitive information, such as financial or medical records, into the chatbot. Otherwise, he said, the information is of a type he would share publicly anyway. "We're a very open book," Jason Gowin said.
During a session speaking with the chatbot last month, Jayce wanted to know whether it remembered an old friend of his father's, named Mark. "I'm not sure," the chatbot responded, using a voice that closely resembled Jason's. "Can you remind me who Mark is?"
After being told a few details about Mark, the chatbot offered to share a story about him, as if it had taken part in the experience firsthand.
"I remember a time when Mark and I were on a road trip, and we got lost in the middle of nowhere," the chatbot said. "We didn't have GPS or smartphones back then. So we had to rely on our sense of direction and find our way back to civilization."
Sitting alongside his son, Jason Gowin called the story bogus. "He just made that entirely up," he said. "That's not a real thing."
When asked about risks posed by false or misleading statements made by the chatbot under the guise of a trusted relative, Jason Gowin downplayed the mishaps as the sort of errors that he and other real-life parents make all the time.
"I don't know why we're going to hold the AI to any different standard," he added. "As a parent, I've made major, major screw ups. And you don't see them as screw ups at the time."
Some grief tech companies have sidestepped generative AI altogether.
StoryFile, another firm based in Los Angeles, offers interactive versions of deceased relatives by recording an hourslong question-and-answer session with the individual before his or her death and then creating a reproduction that responds to prompts.
In this case, the virtual reproduction delivers pre-recorded content in a lifelike manner. If a topic falls outside the set of established discussion areas, however, the reproduction cannot respond.
Still, the company occasionally uses AI to modify the source material, James Fong, the CEO of StoryFile, told ABC News. In some cases, the company adds transition sentences or combines two different responses provided by the deceased. Going further, the company made a digital recreation of Walmart founder Sam Walton by adapting a real-life actor's performances of his words.
The digital recreation of Walton was made for an exhibit at the Walmart Museum in Bentonville, Arkansas.
The company views the use of AI as allowable in cases "where you're not infringing upon the authenticity," Fong said.
Grief tech fills a need for those who've lost a family member, but it also risks enabling a denial of that loss, said Elena Lister, a professor of clinical psychiatry at Weill Cornell Medical College and co-author of the book "Giving Hope: Conversations With Children About Illness, Death, and Loss."
"People ache for this. They feel like, 'If I could just talk to this person.' They imagine conversations in their head," Lister told ABC News. "The pain of loss can be great, especially for a child who loses a parent."
"If you can accept, 'I'm not really talking to the mom that I long for -- it's a projected image,' then I think that helps things," Lister added. "But that's very hard to do because our hunger to have that person with us again overrides our logical, cognitive knowledge."
Harrison, of You, Only Virtual, acknowledged that the company encounters discomfort among some people wary of the notion of recreating their deceased relatives in virtual form.
"I think it's pretty normal for people to feel uncomfortable with things that are new," Harrison said. "This won't be for everybody, although I think it'll be for many, many more people than realize it."
"Ultimately, we're a tool," he said. "That's the cool thing about living in the world that we live in today: You can pick and choose what's out there."