Is Alexa’s voice of the dead a healthy way to grieve a loved one?

By Riya Anne Polcastro

Amazon’s Alexa is getting an update that may soothe some grieving souls while making others’ skin crawl. The AI enhancement will enable the device to replicate a deceased loved one’s voice from less than a minute of recorded audio, giving users a far richer way to connect with memories than simply replaying old voicemails or recordings.

Still, there are reasonable concerns regarding how this technology could impact unprocessed emotions or even be used for unscrupulous purposes.

The ‘why’ behind the new AI

Rohit Prasad, senior vice president and head scientist for Alexa, told attendees at this year’s Amazon re:MARS conference that while AI cannot take away the grief that comes from losing a loved one, it can help keep memories alive by providing a connection with a loved one’s voice. A video played at the conference featured a child asking Alexa to have his grandmother – who had died – read a book. The device obliged and read from “The Wonderful Wizard of Oz” in the grandmother’s voice. It was able to do so by analyzing a short clip of her voice and creating an AI version of it.

At the conference, Prasad mentioned “the companionship relationship” people have with their Alexa devices:

“Human attributes like empathy and affect are key to building trust,” he said. “These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love.” By giving the synthesized voice those same attributes, he hopes it can connect with people in a way that helps keep memories alive long after a loved one is gone.

What does the research say?

While it has yet to be proven that an AI facsimile of a loved one’s voice can assist in the grieving process, there is hope the application could offer a real benefit. Research into how hearing a mother’s voice can ease stress among schoolchildren suggests the potential is there.

Leslie Seltzer, a biological anthropologist at the University of Wisconsin–Madison, determined that talking to Mom on the phone can have the same calming effect as in-person comfort, hugs included. A follow-up study showed the same effects do not hold for students conversing with their mothers through instant messages; Seltzer explained that speaking with someone trustworthy can reduce cortisol and increase oxytocin.

There is, however, a fundamental difference between talking to a living relative on the phone and interacting with an AI imitation of someone who is gone. Anecdotal evidence from friends and family listening to old recordings of their loved ones suggests that what is healing for some may be devastating for others. While some people report that listening to old voicemails, for example, helps them reconnect and process their grief, others have said it made the pain worse.

What about the experts?

Dianne Gray, a certified grief specialist, also pointed out it could go either way. She explained the Alexa feature could “be immensely helpful or, conversely, act as a trigger that brings grief back up to the surface.”

She suggested that, regardless of the situation, the mourner should be in a safe space that allows them enough time and support to work through any unexpected emotions that come up.

Likewise, Holly Zell, a licensed clinical professional counselor intern specializing in death and grief, agreed:

“Every person’s grief experience is unique, and each grief experience a person has across their life is unique,” she said. “What might be helpful in one situation might feel distressing or harmful in another.”

Zell is concerned the AI could interfere with the grieving process, particularly with the example given at the conference of a child listening to their grandmother read a story.

“One of the most challenging and also important aspects of grief is acceptance, which involves acknowledging that the death has happened and that certain things change in relationships after death,” she said. “It can be healthy to have a sense of a ‘continued’ relationship after death, but this is not meant to be in conflict with acceptance.”

Zell instead encourages having loved ones record messages before they pass. Those messages can also provide the connection that can be so crucial, Gray explained.

“This connection via sound can continue long after the loved one has died,” she said. “A common fear of the bereaved is that they will forget what a loved one’s voice sounded like.”

She is hopeful that, by letting people hear the voice of the deceased without their physical presence, the feature can help them navigate acceptance.

“Research will be interesting on this topic.”

Additionally, Gray sees a potential benefit for seniors with low vision, who may find the entirely voice-activated device easier to use than pulling up recordings on their phones.

That doesn’t mean the AI is risk-free, she explained.

“What if there are things left unsaid, disharmony or abuse between the voice on the Alexa device and the beloved? What if the message on the Alexa device is not as kind, gentle or loving as it should or could be?”

Gray pointed to the unfortunate reality that people often die with close relationships still in tatters—and that their voice could have a negative impact on survivors.

Zell said she also remains unconvinced at this point.

“I’m sure there are people who will find this comforting or helpful. I personally and professionally feel skeptical of this as a useful tool, and would strongly encourage people to find their own meaningful ways to include their lost loved ones into their lives through photos, stories, videos/recordings and other experiences.”

What Should Happen to Our Data When We Die?

Anthony Bourdain’s A.I.-generated voice is just the latest example of a celebrity being digitally reincarnated. These days, though, it could happen to any of us.

By Adrienne Matei

The new Anthony Bourdain documentary, “Roadrunner,” is one of many projects dedicated to the larger-than-life chef, writer and television personality. But the film has drawn outsize attention, in part because of its subtle reliance on artificial intelligence technology.

Using several hours of Mr. Bourdain’s voice recordings, a software company created 45 seconds of new audio for the documentary. The A.I. voice sounds just like Mr. Bourdain speaking from the great beyond; at one point in the movie, it reads an email he sent before his death by suicide in 2018.

“If you watch the film, other than that line you mentioned, you probably don’t know what the other lines are that were spoken by the A.I., and you’re not going to know,” Morgan Neville, the director, said in an interview with The New Yorker. “We can have a documentary-ethics panel about it later.”

The time for that panel may be now. The dead are being digitally resurrected with growing frequency: as 2-D projections, 3-D holograms, C.G.I. renderings and A.I. chat bots.

A hologram of the rapper Tupac Shakur took the stage at Coachella in 2012, 15 years after his death; a likeness of a 19-year-old Audrey Hepburn starred in a 2014 Galaxy chocolate ad; and Carrie Fisher and Peter Cushing posthumously reprised their roles in some of the newer “Star Wars” films.

Few examples drew as much attention as the singing, dancing hologram that Kanye West gave Kim Kardashian West for her birthday last October, cast in the image of her late father, Robert Kardashian. Much like Mr. Bourdain’s vocal doppelgänger, the hologram’s voice was trained on real audio recordings but spoke in sentences never uttered by Mr. Kardashian; as if communicating from the afterlife, the hologram expressed pride in Ms. Kardashian West’s pursuit of a law degree and described Mr. West as “the most, most, most, most, most genius man in the whole world.”

Daniel Reynolds, whose company, Kaleida, produced the hologram of Mr. Kardashian, said that costs for projects of its nature start at $30,000 and can run higher than $100,000 when transportation and display are factored in.

But there are other, much more affordable forms of digital reincarnation; as of this year, on the genealogy site MyHeritage, visitors can animate family photos of relatives long dead, essentially creating innocuous but uncanny deepfakes, for free.

Though most digital reproductions have revolved around people in the public eye, there are implications for even the least famous of us. Just about everyone these days has an online identity, one that will live on long after death. Determining what to do with those digital selves may be one of the great ethical and technological imperatives of our time.

Ever since the internet subsumed communication, work and leisure, the amount of data humans create daily has risen steeply. Every minute, people enter more than 3.8 million Google search queries, send more than 188 million emails and swipe through Tinder more than 1.4 million times, all while being tracked by various forms of digital surveillance. We produce so much data that some philosophers now believe personhood is no longer an equation of body and mind; it must also take into account the digital being.

When we die, we leave behind informational corpses, composed of emails, text messages, social media profiles, search queries and online shopping behavior. Carl Ohman, a digital ethicist, said this represents a huge sociological shift; for centuries, only the rich and famous were thoroughly documented.

In one study, Dr. Ohman calculated that — assuming its continued existence — Facebook could have 4.9 billion deceased users by the century’s end. That figure presents challenges at both the personal and the societal level, Dr. Ohman said: “It’s not just about, ‘What do I do with my deceased father’s Facebook profile?’ It’s rather a matter of ‘What do we do with the Facebook profiles of the past generation?’”

The aggregate data of the dead on social media represents an archive of significant humanitarian value — a primary historical resource the likes of which no other generation has left behind. Dr. Ohman believes it must be treated as such.

He has argued in favor of designating digital remains with a status similar to that of archaeological remains — or “some kind of digital World Heritage label,” he said — so that scholars and archivists can protect them from exploitation and digital decay.

Then, in the future, people can use them to learn about the big, cultural moments that played out online, like the Arab Spring and the #MeToo movement, and “zoom in to do qualitative readings of the individuals that took part in these movements,” Dr. Ohman said.

Public social media profiles are one thing. Private exchanges, such as the email read in the Bourdain documentary, raise more complicated ethical questions.

“We don’t know that Bourdain would have consented to reading these emails on camera,” said Katie Shilton, a researcher focused on information technology ethics at the University of Maryland. “We don’t know that he would have consented to having his voice manipulated.” She described the decision to have the text read aloud as “a violation of autonomy.”

From an ethical standpoint, Dr. Shilton said, creating new audio of Mr. Bourdain’s words would require the permission of those close to him. In an interview with GQ, Mr. Neville said he “checked” with Mr. Bourdain’s “widow and his literary executor,” who approved of his use of A.I.

For her part, Ottavia Busia, Mr. Bourdain’s ex-wife, said she did not sign off on the decision. “I certainly was NOT the one who said Tony would have been cool with that,” she wrote on Twitter July 16, the day the film was released in theaters.

Celebrity Holograms and Posthumous Privacy

As Jean-Paul Sartre once put it: “To be dead is to be a prey for the living.” It’s a sentiment that philosophers are still mulling over today, and one that Patrick Stokes, the author of “Digital Souls,” sees as directly related to digital remains.

As he sees it, creating a digital version of a deceased person requires taking qualities from the dead that are meaningful to the living — such as their conversations and entertainment value — and leaving the rest behind.

“We’ve crossed into replacing the dead,” said Mr. Stokes, a senior lecturer in philosophy at Deakin University. “We’ve crossed into not simply finding a particularly vivid way to remember them, but instead, we found a way to plug the gap in existence they’ve left by dying.”

In the case of public figures, there is an obvious financial incentive to create their digital likenesses, which is why their images are protected by posthumous publicity rights for a certain period of time. In California, it’s up to 70 years after death; in New York, as of December 2020, it’s 40 years post-mortem.

If a company wants to use the image of a deceased person sooner, it requires consent from the deceased’s estate; resulting collaborations can be mutually profitable. As such, moral guardianship can be complicated by financial motives.

Some artists are explicitly expressing their desires. Robin Williams, for instance, who died in 2014, filed a deed preventing the use of his image, or any likeness of him, for 25 years after his death as an extra layer of protection on top of California’s law.

Consumers are also making their opinions known. The company Base Hologram, which has produced hologram shows of Roy Orbison, Buddy Holly and Maria Callas, reversed plans to put likenesses of both Whitney Houston and Amy Winehouse on tour, after they were criticized as exploitative. Just because producing such performances is legal doesn’t mean audiences will accept them as ethical.

Currently, United States federal law does not recognize the dead’s right to privacy, said Albert Gidari, a lawyer and former consulting director of privacy at the Stanford Center for Internet and Society.

“But,” he said, “as a practical matter, because so much of the information about you is in digital form today, residing with platform providers, social media and so on, the Stored Communications Act actually does protect that information against disclosure without prior consent.”

“And obviously, if you’re dead, you can’t consent,” Mr. Gidari added. A consequence is that families of dead individuals often cannot recover online data from their loved ones’ digital accounts.

As a way of asserting agency over their digital legacies, some people are choosing to create their own A.I. selves using a growing number of apps and services.

Some, like HereAfter, are focused on family history. For $125 to $625, the company interviews clients about critical moments in their lives. Those answers are used to create a Siri-like chat bot. If your great-grandchildren, for instance, wanted to learn how you met your spouse, they could ask the bot and it would answer in your voice.

Another chat bot app, Replika, creates avatars that mimic their users’ voices; over time, each of those avatars is meant to become the ultimate empathetic friend, ever-available by text (free) and voice calls (for a fee). The service gained traction during the pandemic, as isolated people sought out easy companionship.

Eugenia Kuyda, the app’s creator, got the idea after her friend Roman Mazurenko died in 2015. She used what is known as a neural network — a series of complex algorithms designed to recognize patterns — to train a chat bot on the textual data he left behind; the bot communicated convincingly enough to charm Mr. Mazurenko’s mother. That same technology underpins Replika’s chat bots.
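For readers curious about the mechanics, here is a minimal sketch of the idea behind such a bot: a small neural language model trained on someone’s message history, then sampled to produce new text in a similar style. This is not Replika’s actual code; the sample messages, model architecture and training settings below are illustrative assumptions only.

```python
# Minimal sketch (not Replika's code): train a tiny neural language model on
# a person's message archive, then sample new text in a similar style.
import torch
import torch.nn as nn

# Hypothetical stand-in for an archive of someone's text messages.
messages = [
    "running late, order me the usual coffee",
    "saw that photo you sent, made my whole day",
    "call me when you land, love you",
]
text = "\n".join(messages)

# Character-level vocabulary built from the archive.
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class TinyCharModel(nn.Module):
    """Embedding + GRU + linear head: predicts the next character."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

model = TinyCharModel(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train the model to predict each next character in the archive.
x = data[:-1].unsqueeze(0)   # (1, T) input sequence
y = data[1:].unsqueeze(0)    # (1, T) shifted targets
for step in range(300):
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sample a short continuation "in the person's voice".
idx = torch.tensor([[stoi[text[0]]]])
h = None
out = [text[0]]
for _ in range(80):
    logits, h = model(idx, h)
    probs = torch.softmax(logits[0, -1], dim=-1)
    idx = torch.multinomial(probs, 1).unsqueeze(0)
    out.append(itos[idx.item()])
print("".join(out))
```

A production chat bot would, of course, use a far larger model and corpus, but the underlying loop is the same: learn the patterns in the text a person left behind, then generate more of them.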

“Replika is primarily a friend for our users, but it will live on past their death bearing the knowledge about its creator,” Ms. Kuyda wrote in an email.

In December 2020, Microsoft filed a patent for “Creating a conversational chat bot of a specific person,” which could be used in tandem with a “2-D or 3-D model of a specific person.” (“We do not have anything to share about this particular patent,” a Microsoft representative wrote in an email.)

Other projects seem aimed at offering emotional closure after the death of a loved one. In February 2020, a South Korean documentary called “Meeting You” was released. It chronicled the virtual-reality “reunion” of a woman named Jang Ji-sun and her young daughter, who had died of cancer.

The daughter’s avatar was created by Vive Studios in close conjunction with the Jang family. The company has considered other applications for its V.R. technology — creating a “digital memorial park” where people can visit dead loved ones, for instance, or teaming up with health care providers guiding patients through grief.

This is all happening in the midst of a pandemic that has radically altered the rites around death. For many families, final goodbyes and funerals were virtual in 2020, if they happened at all. When digital-afterlife technologies begin to enter mainstream use, they may help ease the process of bereavement, as well as foster connections between generations past and present and encourage the living to discuss death more openly with each other.

But before then, Mr. Stokes, the philosopher, said, there are important questions to consider: “If I do start interacting with these things, what does that say about my relationship to that person I loved? Am I actually doing the things that love requires by interacting with this new reanimation of them? Am I protecting the dead? Or am I exploiting them?”

“We have a rare chance to actually be ethically ready for new technology before it gets here,” Mr. Stokes said. Or, at least, before it goes any further.
