All ears: The dangers of voice assistants

Attentive listeners such as Apple Siri and Amazon Echo have settled into our homes. What’s the harm?

Nowadays the proverb “the walls have ears” is not as metaphorical as it used to be.

“The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it … There was of course no way of knowing whether you were being watched at any given moment.” That is George Orwell’s description of Big Brother’s spying devices in the novel 1984.

What if Big Brother weren’t the only one with access to the telescreen, though? What if pretty much anybody with the necessary skill set could listen in? And what if that screen were used not only for political propaganda, but also for displaying personalized ads: say, you complain to your spouse about a headache and immediately see a commercial for a painkiller? This is no longer just the plot of a dystopian novel; it’s a bit futuristic, perhaps, but it has a good chance of becoming reality in the very near future.

We’ve already surrounded ourselves with budding telescreens, and their new features — such as voice assistants — are quite capable of becoming new threats.

Virtual assistants such as Apple Siri live in smartphones, tablets, and laptops, or in stationary devices like Amazon Echo or Google Home smart speakers. People use them to turn music on and off, check weather forecasts, adjust room temperature, order goods online, and do many other things.

Can these vigilant microphones do harm? Sure. The first possibility that comes to mind is the leaking of personal and corporate data. But here’s another one that is even easier for cybercriminals to turn into money: Do you dictate your credit card numbers and one-time passwords to fill in forms on websites?

Smart speakers can interpret voices even in noisy surroundings or with music playing. You don’t even need to speak very clearly to be understood: In my experience, Google’s voice assistant on an ordinary Android tablet sometimes understands 3-year-old children better than it understands their parents.

Here are a few stories that might seem funny and alarming at the same time. All of them involve voice assistants and smart gadgets. Sci-fi writers have long dreamed about machines we can talk to, but even they couldn’t have imagined situations like these happening in real life.

The speaker rebellion

In January 2017, TV channel CW6 in San Diego, California, aired an interesting news segment about the vulnerabilities of Amazon Echo speakers (equipped with the Alexa virtual assistant).

The system cannot distinguish people by their voices, the show’s hosts explained, which means Alexa will follow the orders of anybody nearby. As a result, little kids started making unplanned online purchases, not knowing the difference between asking their parents to give them a snack and asking Alexa to give them a toy.

Then one of the hosts said on air: “I love the little girl, saying ‘Alexa, order me a dollhouse.’” The complaints began rolling in. People all over San Diego reported spontaneous purchases of dollhouses made by their voice assistants. Alexa heard the line uttered on television as a command and swiftly fulfilled it.

Amazon assured the victims of the “AI rebellion” that they could cancel their orders and would not have to pay.

Gadgets under oath

Gadgets that can listen are valuable to law enforcement agencies because they can (typically) repeat anything they’ve heard. Here’s a detective story that happened in Arkansas in 2015.

Four men had a party. They watched football, drank, relaxed in the hot tub — nothing out of the ordinary. The next morning, the owner of the house found the lifeless body of one of the guests in the tub. He quickly became the number one suspect; the other guests said they had left the party before anything happened.

Detectives noticed a lot of smart devices in the home: lighting and security systems, a weather station — and an Amazon Echo. The police decided to question it. Detectives hoped to get voice recordings made on the night of the murder. They asked Amazon to share the data, but the company allegedly refused.

Amazon’s developers claim that Echo doesn’t record sound all the time, only when the user pronounces the wake word — by default, “Alexa.” The command is then stored on the company’s servers for a limited time. Amazon claims it stores commands only to improve customer service, and users can manually delete all recordings in their account settings.

In any case, detectives found another device from which to gather clues. They entered into evidence the testimony of a … smart water meter. In the early-morning hours after the victim’s death, an exorbitant amount of water was apparently used. The homeowner claimed that he was already asleep at that time. Investigators suspected the water had been used to clean up blood.

It’s noteworthy that the smart meter’s readings seem to have been inaccurate. In addition to the very high water usage in the middle of the night, it reported water consumption of no more than 40 liters per hour on the day of the party, but you don’t fill a hot tub at those rates, as the quick calculation below shows. The accused homeowner gave an interview to StopSmartMeters.org (yes, a website created by haters of smart meters) in which he suggested that the clock on the meter was set incorrectly.
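
For a sense of scale, here’s a rough back-of-the-envelope check. The capacity of the actual tub wasn’t reported, so the 1,500-liter figure below is an assumed typical value, not a number from the case:

    # Rough sanity check of the meter's daytime reading.
    # The tub capacity is an assumed typical value, not a figure from the case.
    tub_capacity_liters = 1500  # typical home hot tub (assumption)
    metered_rate_lph = 40       # liters per hour, per the smart meter

    hours_to_fill = tub_capacity_liters / metered_rate_lph
    print(f"Filling would take about {hours_to_fill:.0f} hours")  # ~38 hours

At that rate, filling the tub would have taken more than a day and a half, far too long to be plausible; that is exactly why the daytime readings look suspect.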

The case goes to court this year.

Virtual assistants in movies
(Spoiler alert!)
Modern mass culture also treats virtual assistants with suspicion. For example, in the movie Passengers, the android bartender Arthur reveals Jim Preston’s secret and ruins his reputation in the eyes of his companion, Aurora. In Why Him?, the voice assistant Justine eavesdrops on a telephone call made by the protagonist, Ned Fleming, and rats him out.

The car as a wiretap

Forbes also reports a few interesting cases of electronic devices being used against their owners.

In 2001, the FBI obtained permission from a Nevada court to request ATX Technologies’ help in intercepting communications in a private car. ATX Technologies develops and operates in-car assistance systems that enable car owners to call for help in the event of a traffic accident.

The company complied with the request. Unfortunately, the technical details were not published, aside from the FBI’s demand that the surveillance be carried out with “a minimum of interference” to the quality of service provided to the suspect. It seems quite possible that the eavesdropping was carried out over the emergency line, with the car’s microphone turned on remotely.

A similar story took place in Louisiana in 2007. A driver or passenger accidentally pressed the hotline button and called the OnStar emergency service. The operator answered the call and, receiving no response, notified the police. She then tried once more to reach the possible victims and instead heard dialogue that sounded like part of a drug deal. The operator let a police officer listen in and pinpointed the car’s location. As a result, police stopped the car and found marijuana inside.

In the latter case, the driver’s defense tried to have the evidence thrown out because the police had no warrant, but the court rejected that argument because the police had not initiated the wiretap. The suspect had bought the car from its previous owner a few months before the incident and probably didn’t know about the emergency feature. He was ultimately found guilty.

How to stay off the air

In January, at CES 2017 in Las Vegas, almost every smart thing presented — from cars to refrigerators — was equipped with a virtual assistant. This trend is sure to create new privacy, security, and even physical safety risks.

Every developer needs to make users’ security a top priority. As for ordinary consumers, we have a few tips to help them protect their privacy from all-hearing ears.

  1. Turn off the microphones on Amazon Echo and Google Home speakers when you don’t need them. There’s a button for that. It’s not a particularly convenient way to ensure privacy — you will always have to remember to neutralize the assistant — but at least it’s something.
  2. Use Echo’s account settings to prohibit or password-protect purchasing.
  3. Use antivirus protection on your PCs, tablets, and smartphones to decrease the risk of data leaks and to keep criminals out.
  4. Change Amazon Echo’s wake word if someone in your household has a name that sounds like “Alexa.” Otherwise any dialogue near the device has the potential to turn into a real nuisance.

That’s not a one-way street

Okay, you taped over the webcam on your laptop, hid your smartphone under a pillow, and threw your Echo away. You feel safe from digital eavesdroppers now … but you’re not. Researchers from Ben-Gurion University in Israel have shown that even common earphones can be turned into listening devices.

  1. Earphones and passive loudspeakers are basically inside-out microphones: the same hardware that turns an electrical signal into sound can turn sound back into an electrical signal. That means any pair of earphones connected to a PC can pick up sound.
  2. Some audio chipsets can change the function of an audio port at the software level, turning an output jack into an input. This is not a secret — it is stated in the motherboard specifications. (A minimal sketch of the recording side of such an attack follows below.)
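
To make that concrete, here is a minimal sketch in Python of the recording half only. The jack-retasking step itself is chipset- and driver-specific and isn’t shown; the sounddevice and soundfile libraries, the default-device choice, and the file name are illustrative assumptions on our part, not the researchers’ actual tooling.

    # Minimal sketch: ordinary audio capture in Python.
    # After jack retasking (chipset/driver-specific, not shown here),
    # earphones plugged into an output jack would appear to the OS as
    # just another input device, and recording from them looks like this.
    import sounddevice as sd
    import soundfile as sf

    DURATION_S = 10      # how long to record, in seconds
    SAMPLE_RATE = 44100  # standard CD-quality sampling rate

    # List the audio devices the OS exposes; a retasked jack shows up here.
    print(sd.query_devices())

    # Capture mono audio from the default input device.
    audio = sd.rec(int(DURATION_S * SAMPLE_RATE),
                   samplerate=SAMPLE_RATE, channels=1)
    sd.wait()  # block until the recording finishes

    sf.write("captured.wav", audio, SAMPLE_RATE)  # save the recording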

As a result, cybercriminals can turn your earphones into a wiretapping device that secretly records sound and sends it to their servers via the Internet. Field tests proved it: in this way, a conversation taking place several meters away can be recorded at acceptable quality. And consider that people often keep their earphones quite a bit closer, resting them on their necks or on a nearby table.

To protect yourself from this type of attack, use active loudspeakers instead of passive earphones and speakers. An active speaker has a built-in amplifier between the jack and the speaker itself, which stops the signal from traveling back to the input.
