Amazon is adding an experimental new feature to its lineup of Amazon Alexa devices, letting them mimic the voice of a dead relative.
The feature was shown off during Amazon’s annual re:MARS conference hosted by Adam Savage, editor-in-chief of Tested.com. During the presentation, a child asked Alexa to read a bedtime story in the voice of their late grandmother.
“In these times of the ongoing pandemic, so many of us have lost someone we love,” said Amazon’s head scientist for Alexa AI, Rohit Prasad. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”
As you might expect, the presentation has made quite a few people uncomfortable and raised some major questions about the ethics of the idea in general. Dozens of Twitter users have even compared it to a Black Mirror episode come to life.
At the moment, Amazon hasn’t indicated whether the feature will ever become commercially available or whether it’s simply being tested. Technology like this has been a hot topic of debate for years, especially as its use becomes more prevalent in entertainment.
AI technology has been used to recreate actors’ likenesses, and is now being used to recreate voices as well. The Mandalorian Season 2 used an AI neural network to recreate the voice of a young Luke Skywalker, while the likeness was created by blending images of Mark Hamill and a younger actor. The Anthony Bourdain documentary, Roadrunner, also drew a ton of criticism for using AI to recreate the late chef’s voice for three lines in the film.
Using this kind of technology is a difficult decision for anyone involved, and we’ll likely be debating its ethics for years to come. Subbarao Kambhampati, a professor of computer science at Arizona State University, spoke to NPR about Amazon Alexa’s new feature.
“For people in grieving, this might actually help in the same way we look back and watch videos of the departed,” said Kambhampati. “But it comes with serious ethical issues, like is it OK to do this without the deceased person’s consent?”