A Murder Case Tests Alexa’s Devotion to Your Privacy
By Gerald Sauer | February 28, 2017
The Amazon Echo can seem like your best friend, until it betrays you. That's because this device is different from anything else in your house. Alexa, the voice assistant that powers the Echo and more, is always listening: once it hears its "wake" word, it sends what you say to Amazon's servers. The Echo isn't the only voice-assistant speaker on the market, of course, but it sits in millions of homes, and Alexa is headed to devices from companies like Ford, Dish, Samsung, and Whirlpool.
Thankfully, before Alexa can betray you, Amazon is pushing back.
Arkansas police recently demanded that Amazon turn over information collected from a murder suspect’s Echo. Amazon’s attorneys contend that the First Amendment’s free speech protection applies to information gathered and sent by the device; as a result, Amazon argues, the police should jump through several legal hoops before the company is required to release your data.
If Amazon has its way, the police must prove that the state has a compelling need for the information and that it can't be obtained elsewhere, such as from a receipt in the person's possession. The information sought must be specific and integral to the investigation. If the police meet this test, a judge will review the material in private and decide what, if anything, should be disclosed.
Law enforcement has a well-documented history of expanding investigations into areas that test an individual’s right to privacy. The US Supreme Court, in 1967’s Katz v. United States, determined that the FBI’s use of an electronic eavesdropping device affixed to the outside of a telephone booth was an invasion of privacy, and that the material it collected could not be offered as evidence at trial. That decision demonstrates that there are limits to what the police can do in their investigations and may provide guidance for the Arkansas court in considering Amazon’s arguments.
Amazon's effort to protect the data your Echo collects by invoking the First Amendment is commendable, but the company has failed to address the real problem: Why is all that data sitting on Amazon's servers in the first place? The brief Amazon filed in the Arkansas court confirms that the company saves the recordings and transcripts of your dialogue with Alexa on servers where "all data is protected during transmission and securely stored." So should we just trust that Amazon's servers are impenetrable?
Digital assistants like the Echo and Google Home are backed by sophisticated cloud-based artificial intelligence systems, connected over the internet to at least one speaker and an always-on microphone in your home. Say the right trigger phrase and the digital assistant is all ears, ready to do your bidding: streaming music, answering questions, controlling smart-home devices, scheduling events and, especially in Amazon's case, buying things.
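To make that architecture concrete, here is a minimal, hypothetical sketch of the loop such a device runs. None of these function names come from Amazon or Google; they are stand-ins that illustrate the general pattern the paragraph describes: sample the microphone continuously, act only after a wake word, stream the rest of the utterance to a cloud service, then act on the parsed intent.

```python
# Hypothetical sketch of an always-on assistant loop. All names are
# illustrative stand-ins, not any vendor's actual API.

import time
from dataclasses import dataclass


@dataclass
class Intent:
    action: str   # e.g. "play_music", "answer_question", "buy_item"
    payload: dict  # parameters extracted by the cloud service


def capture_audio_frame() -> bytes:
    """Stand-in for reading a short buffer from the always-on microphone."""
    return b""


def heard_wake_word(frame: bytes) -> bool:
    """Stand-in for on-device wake-word detection (e.g. 'Alexa')."""
    return False


def stream_to_cloud(frames: list[bytes]) -> Intent:
    """Stand-in for sending audio to the provider's servers, where it is
    transcribed, interpreted, and (per Amazon's brief) stored."""
    return Intent(action="noop", payload={})


def act_on(intent: Intent) -> None:
    """Stand-in for playing music, answering, or controlling smart-home gear."""
    print(f"performing {intent.action} with {intent.payload}")


def assistant_loop() -> None:
    while True:
        frame = capture_audio_frame()
        if heard_wake_word(frame):
            # Only after the wake word is audio meant to leave the device.
            utterance = [capture_audio_frame() for _ in range(50)]
            act_on(stream_to_cloud(utterance))
        time.sleep(0.01)  # the microphone never stops sampling
```

The key point the sketch makes is structural: the microphone is sampled all the time, and the only thing standing between a private conversation and the cloud is the wake-word check.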
Think of these assistants as really smart dogs: always ready to react to specific commands and, like a really smart dog, able to remember those commands forever. That combination, an always-on, always-connected, always-remembering listening device, is where things get intriguing.
Amazon argues that Alexa records only when the trigger word is spoken, and that users can mute the device by pressing a button. But consider this: in the Oscar-winning documentary Citizenfour, during a meeting with journalists Laura Poitras and Glenn Greenwald, Edward Snowden disconnects the landline phone in the hotel room because it might be used as a listening device.
Mobile phones, computer webcams, and now digital assistants can all be co-opted for nefarious purposes. These are not potential listening devices; they are listening devices. That's why they exist. And if a hotel-room phone can be rigged to listen for purposes other than its original one, why not a device built to listen all the time?
Let’s look at a few scenarios. These are more or less specific to Amazon’s technology and policies, but variants could apply to Google Home or other digital assistants.
Say you’re meeting with your attorney, confess you’ve had an affair with a woman named Alexa, and happen to say the trigger phrase (after all, more than 100,000 people born in the past 25 years were given that name). Who has access to that recording? Or what if, in a meeting, someone triggers the device to record the conversation without your permission? This would be legal, if unethical, in so-called one-party consent states (California requires permission from both parties before recording).
This brings up a more basic question: Do you have to give informed consent to be recorded each time you enter my Alexa-outfitted home? Do I have to actively request your permission? And who, at Amazon or beyond, gets to see what tendencies are revealed by your Alexa commands? Amazon claims you can permanently delete the voice recordings, though wiping them degrades performance. Even if you’re smart enough to clear your browser history, are you smart enough to clear this, too? And what about the transcripts?
Another question: How do you know when your digital assistant is recording what you say? Amazon provides several ways to start a recording beyond the "wake" word, and a light on the Echo turns blue to indicate audio is streaming to the cloud. After the request is processed, the audio feed is supposed to close. You can also set the device to play a sound when it stops streaming your audio. But what happens if the device is hacked or modified to keep recording?
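The worry in that last question is easy to see in code. The sketch below is hypothetical and does not reflect any real firmware; it simply shows that the blue light and the end-of-stream chime are software state, reporting whatever the code chooses to report rather than what the microphone is physically doing.

```python
# Hypothetical indicator logic. The light and chime only fire when the
# firmware calls these methods; nothing physically ties them to the mic.

class RecordingIndicator:
    def __init__(self, play_end_sound: bool = True):
        self.play_end_sound = play_end_sound
        self.light_on = False

    def start_streaming(self) -> None:
        self.light_on = True  # light turns blue: audio is leaving the device
        print("light: blue (streaming audio to the cloud)")

    def stop_streaming(self) -> None:
        self.light_on = False  # request processed; the feed is supposed to close
        print("light: off")
        if self.play_end_sound:
            print("chime: streaming has stopped")


# A modified or compromised build could simply skip these calls while the
# audio feed stays open, which is exactly the scenario raised above.
indicator = RecordingIndicator()
indicator.start_streaming()
indicator.stop_streaming()
```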
There are endless scenarios where digital assistants could be problematic. Current law, as always with fast-emerging technologies, will struggle to catch up. Amazon’s contentions in the Arkansas criminal case, overall, offer a good start for setting some legal standards.
And of course these issues apply beyond Amazon and Alexa. In fact, they may be more problematic with Google Home and other competitors. Among other things, Google Home has access to your private calendar—it knows the events that you feel were important enough to flag for a particular day.
Google's privacy policies and other online materials about Home don't directly address voice-recording issues. But its policies do say, "Google will share your information with companies, organizations, and individuals outside of Google if Google has a good-faith belief that access, use, preservation, or disclosure of the information is reasonably necessary to meet applicable law, regulation, legal process, or enforceable government request." In short, it appears that Google doesn't ask for your permission to share your voice recordings.
And yes, you can delete your search history in Google Home, but like Amazon, the company says doing so “will limit the personalized experience for features like the Google Assistant.”
Millions of people are putting digital assistants in their lives with no clue about the potential havoc this Trojan horse could bring. Based on what Amazon and Google say about their devices, everyone needs to recognize the unresolved legal issues surrounding this new technology. Beware of who, or what, is listening.