The Atlantic

    No One Is Ready for Digital Immortality

    By Kate Lindsay


    Every few years, Hany Farid and his wife have the grim but necessary conversation about their end-of-life plans. They hope to have many more decades together—Farid is 58, and his wife is 38—but they want to make sure they have their affairs in order when the time comes. In addition to discussing burial requests and financial decisions, Farid has recently broached an eerier topic: If he dies first, would his wife want to digitally resurrect him as an AI clone?

    Farid, an AI expert at UC Berkeley, knows better than most that physical death and digital death are two different things. “My wife has my voice, my likeness, and a lot of my writings,” he told me. “She could very easily train a large language model to be an interactive version of me.” Other people have already done precisely that. Instead of grieving a loved one by listening to their voicemails on repeat, you can now upload those recordings to an AI audio program and create a convincing voice clone that wishes you happy birthday. Train a chatbot on a dead person’s emails or texts, and you can forever message a digital approximation of them. There is enough demand for these “deathbots” that many companies, including HereAfter AI and StoryFile, specialize in them.

    When it comes to end-of-life planning, recent technology has already dumped new considerations on our plates. It’s not just What happens to my house? but also What happens to my Instagram account? As I have previously written, dead people can linger as digital ghosts through their devices and accounts. But those artifacts help maintain their memory. A deathbot, by contrast, creates a synthetic version of you and lets others interact with it after you’re gone. These tools present a new kind of dilemma: How can you plan for something like digital immortality?

    [Read: My mom will email me after she dies]

    Farid, the AI expert, hasn’t figured out an answer in his discussions with his wife. “We have very conflicting feelings about it,” he said. “I imagine that in the coming five to 10 years, it is a conversation we’re going to have the same way we have other conversations about end of life.” Grieving the death of a loved one is hard, and it’s easy to see why someone would prefer to remember the deceased in a way that feels, well, real. “The experience made up for what I missed out with my dad,” a woman in China told Rest of World after creating a replica of her dead father.

    It is also easy to see the pitfalls. A voice clone can be made to say whatever its creator wants it to say: Earlier this year, the team of one Indian parliamentary candidate created a realistic video in which his late father—a famous politician—endorses him as his “rightful heir.” Compared with voice clones, chatbots pose even thornier problems. “To have something that is basically improvising on what you might’ve said in life—that can go wrong in so many different ways,” Mark Sample, a digital-studies professor at Davidson College, told me. Any chatbot trained on a broad swath of text from a person’s life will produce messages that reflect not simply who that person was at the time of their death but also how they acted throughout their life—including, potentially, ideas they’d abandoned or biases they’d overcome. The chatbot could also, of course, preserve any less admirable personality traits they had even at the end of life.

    Grief, too, gets complicated. Deathbots can be an unhealthy coping mechanism for the bereaved—a way to never have to fully acknowledge the death of a loved one or adapt to life without them. “It’s a tool, and a tool can be useful or it can be overused,” Dennis Cooley, a philosophy-and-ethics professor at North Dakota State University, told me. “It warps the person’s ability to interact and engage in the world.”

    What makes all of this especially fraught is that the dead person may not have given consent. StoryFile and HereAfter AI are both designed for you to submit your data before your death, which allows for some agency in the process. But these policies are not standard across the digital-afterlife industry, AI ethicists from the University of Cambridge’s Leverhulme Centre for the Future of Intelligence noted in May. The researchers declared the industry “high risk,” with lots of potential for harm. Just like other apps that pester you with push notifications, a deathbot could keep sending reminders to message the AI replica of your mom. Or a company could threaten to discontinue access to a deathbot unless you fork over more money.

    In other words, as people get their affairs in order, there are lots of reasons they should take into account the possibility of deathbots. Some wills already include instructions for social-media profiles, emails, and password-protected cellphones; language about AI could be next. Perhaps you might set specific guidelines for how your digital remains can be repurposed for a deathbot. Or you might forgo digital immortality entirely and issue what’s essentially a digital “do not resuscitate.” “You could put an instruction in your estate plan like ‘I don't want anybody to do this,’” Stephen Wu, a lawyer at Silicon Valley Law Group, told me, regarding deathbots. “But that’s not necessarily enforceable.”

    Telling your loved ones that you don’t want to be turned into an AI clone may not stop someone from going rogue and doing it anyway. If they did, the only legal recourse would be in instances where the AI clone was used in a way that violates a law. For instance, a voice clone could be employed to access a deceased person’s private accounts. Or an AI replica could be used for commercial purposes, in an ad, say, or on a product label, which would violate the person’s right of publicity. But of course, that’s little help against the many other harmful ways someone could interact with a deathbot.

    Like so much else in the world of AI, many of the concerns about these replicas are still hypothetical. But if deathbots continue to gain traction, “we’re going to see a slew of new AI laws,” Thomas Dunlap, a lawyer at the firm Dunlap, Bennett, and Ludwig, told me. Perhaps even weirder than a world in which deathbots exist is a world in which they are normal. By the time today’s children reach the end of their lives, these kinds of digital ghosts could conceivably be as much a part of the grieving process as physical funerals. “Technology tends to go through these cycles,” Farid said. “There’s this freak-out, and then we figure it out; we normalize it; we put reasonable guardrails on it. I suspect we’ll see something like that here.”

    However, the road ahead is bumpy. Part of you can still live on, based on texts, emails, and whatever else makes up your digital footprint. It’s something that future generations may have to keep in mind before they fire off an angry social-media post at an airline. Beyond just “What does this say about me now?” they may have to ask themselves, “What will this say about me when I’m gone?”

    Older people who are getting their affairs in order today are caught in the tricky position of having to make decisions based on deathbot technology as it exists in the present, even though the ramifications might play out in a very different world. Voice cloning has already crossed the uncanny valley, Farid said, “but in a couple years, all the intonations and the laughter and the expressions, we will have solved that problem.” For now, older adults confronting deathbots are left scrambling. Even if they manage to account for all of their possessions and plan out every end-of-life decision—a monumental task in its own right—their digital remains still might linger forever.
