My, what a familiar voice you have, Alexa: Amazon tech feature can mimic dead relative

Amazon says that, very soon, the voice interacting with you on your smart speaker could be that of a dead friend or relative. Nothing creepy about that.

In what is becoming a more and more common refrain, geeks have excitedly unveiled a new technology, only to appear dumbfounded and even hurt that anyone would think that the idea was, at root, overwhelmingly insane.

In the tech world, if you can change a cargo van into an artichoke, then you should do it. Not because there is any reason to do it, or because the world needs more artichokes, but because you can. The ability to do it is reason enough, forget consumer demand, ethics or unintended consequences. How else to explain Hangtime, an app that records how high in the air you are able to throw your phone?

Personally, I would make that argument about smart speakers themselves, but I know that ship has sailed. I have had people gush that Alexa has changed their lives. But when I ask “How?” they stumble and redden a bit when they hear themselves saying out loud that a great new American tomorrow has been created by the ability to speak “Alexa, change the TV to channel 9” instead of punching the 9 button on the remote. Think of how much index-finger stress this is saving.

Technology comes in two parts. One is creating the technology; the other is convincing us that it is crucial to buy something that, honestly, we could easily live without.

So I expect there is no way to avoid a hard sell of dead-relative voice capability. People will spend their money on that, and then still have the audacity to complain about the cost of gasoline.

According to news reports, it will be possible to play a sound clip of a lost loved one and then, for example, have Aunt Gertie report tomorrow’s weather. Sounds useful. Or, considering that your children don’t face enough budding psychological catastrophes as it is, you can have the grandmother you’ve just buried read them a bedtime story.

Maybe one day, voices of loved ones who have gone on to their great reward will be just as ubiquitous and cherished as old photographs. And I understand that this could save parents valuable time they would otherwise waste reading to their own kids while they are still alive.

But I strongly suspect the unhealthy undercurrents will run deeper than that. If you feel guilty about not spending more time with your grandparents when they were living, no problem: it’s like they never left.

As the Austin Lounge Lizards wrote in their song “Grandpa’s Hologram” about loading a photo of the old man onto a laser disc and recreating him at the breakfast table:

Always user friendly,
Though he hasn’t much to say.
We see lots more of grandpa,
Than before he passed away.

One day, I assume voice-mimicking capabilities will be available beyond our own relatives and be offered in a wide range of vocal flavors. So if you have been waiting all these years to hear Stalin read “Little Women,” your prayers are about to be answered.

Some worrywarts are more concerned about this technology than I am, fearing that scammers could call up grandparents who are actually living and ask for money using the voice of a relative. Maybe. But if phishing emails are any indication, the Russians have a long way to go in their command of the English language before they will achieve believability:

“Hello grandmother whom I love exclamation point. I have car much broke down. Can you send account and routing number to page on the Facebook of good friend Yuri Molochenko.”

Although I have a savvy friend who just fell for a phishing scam that began, “Dear Hello Twitter User,” so who knows?

Tim Rowland is a Herald-Mail columnist.
