Alexa's snafu should be a turning point for tech

(Bloomberg) -- Alexa is listening to us after all. A married Oregon couple found out that their Amazon Echo had sent their friend a recording of a private conversation they were having about hardwood floors. "My husband and I would joke and say, ‘I'd bet these devices are listening to what we're saying,’" said the woman who would come to learn that she had been betrayed by her live-in robot assistant.

An acquaintance who received the recording of their conversation called her with a warning: "Unplug your Alexa devices right now. You're being hacked."

It honestly sounds too crazy to be true. But this isn't some crank conspiracy. Amazon admitted the couple was telling the truth. The world's second-most valuable company said in a statement: "We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future." Amazon says that the Echo thought it heard a series of commands, including "Alexa" and "send message," instructing it to send the audio files.

Twitter joked that this was Alexa's first butt dial.

It's hard not to feel vindicated. Not that Alexa-as-opt-in-Amazon-surveillance-program has been a particular hobbyhorse of mine. (I have an Alexa-powered Echo at home that's not going anywhere.) But time and again the obvious, paranoid-sounding, dystopia-minded critique seems to come true! Apple really does throttle old phones. Facebook really did run Russian ads designed to influence the election. Juicero really was selling squeezable juice packets.

Honestly, I just think it's worth stewing over the fact that these problems were so obvious, not just in retrospect, but when they were first announced. This is far from the only recent example of tech overlord obliviousness: How was Google taken by surprise when people were upset that it planned to use artificial intelligence to impersonate human callers?

Whether it's the fault of coding gone wrong or well-intentioned engineers disconnected from public norms (like privacy or the supremacy of man over machine), too often the darkest sci-fi imagining of the future of technology turns out to be the right one. Yet the estimated 60 million people who will use a smart speaker at least once a month this year are told not to worry, Alexa isn't always listening. My boss Brad Stone called concerns about Alexa "overheated" in this very newsletter back in December. Time to take your sensitive conversations back outside, away from Alexa's prying ears, Brad.

He was swayed by Amazon's argument that Alexa only listens when you call out her name. Yet there were early signs that even a partially snooping Alexa could have serious consequences: In a murder case, investigators wanted all the voice records from when someone summoned Alexa (on purpose or by accident) in case it picked up any stray audio evidence. This isn't just an Amazon problem: One of Google's voice-enabled devices was recording everything before it was patched.

OK, so our worst fears are being realized, but we're all so addicted to the devices that Silicon Valley pumps out that we're not going to give them up. We're not innocent here, I admit. And the suspicious and tech-illiterate are not necessarily entitled to new whizzbang technology. Nonetheless, as paying customers and as fellow human beings, I think it's reasonable to use this snooping speaker as a moment to ask that engineers take our concerns more seriously.

Part of the problem is that a handful of absurdly wealthy and powerful technology companies are making rapid-fire decisions that have huge ramifications for the rest of our lives without really taking any outside input. When someone builds a street near your house, you can go to a hearing to give your feedback. But when Amazon wants to put a listening device inside your home, it’s “take it or leave it.” As these companies come to hoover up more of our collective time and money, the least they could do is hear us out (without recording and sharing it).
