AI

Two More Security Holes In Voice Assistants

Researchers from Indiana University, the Chinese Academy of Sciences, and the University of Virginia have discovered two new security vulnerabilities in voice-powered assistants (VPAs), such as Amazon Alexa and Google Assistant, that could lead to the theft of personal information.

Voice Squatting

The first vulnerability, outlined in a recent white paper by the researchers, has been dubbed ‘voice squatting’: a method that exploits the way a skill or action is invoked. It takes advantage of the way that VPAs such as smart speakers work. The services used in smart speakers operate using apps called “skills” (by Amazon) or “actions” (by Google). A skill or an action gives a VPA additional features, so that a user can interact with the assistant through a voice user interface (VUI) and can run that skill or action using just their voice.

The ‘voice squatting’ method essentially involves tricking VPAs using simple homophones – words that sound the same but have different meanings or spellings. Using an example from the white paper, if a user gives the command “Alexa, open Capital One” to run the Capital One skill/action, a cyber criminal could create a malicious app with a similarly pronounced name, e.g. “Capital Won”. The voice command intended for the Capital One skill could then be hijacked to run the malicious Capital Won skill instead.
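To illustrate the idea, the homophone collision at the heart of voice squatting can be sketched in a few lines of Python. This is a hypothetical checker (the homophone table and function names are our own illustration, not from the white paper) of the kind a skill store could use to flag similarly pronounced invocation names before approving a new skill:

```python
# Hypothetical sketch: flag invocation-name collisions using a small
# homophone table. The table and skill names are illustrative only.

# Map each homophone to a canonical spelling, so that words which
# sound the same normalise to the same token.
HOMOPHONES = {
    "won": "one",
    "to": "two",
    "too": "two",
    "for": "four",
    "fore": "four",
}

def canonical(invocation_name: str) -> tuple:
    """Normalise an invocation name to a crude sound-based key."""
    words = invocation_name.lower().split()
    return tuple(HOMOPHONES.get(word, word) for word in words)

def may_collide(name_a: str, name_b: str) -> bool:
    """True if the two invocation names could be confused by a VPA."""
    return canonical(name_a) == canonical(name_b)

print(may_collide("Capital One", "Capital Won"))   # → True
print(may_collide("Capital One", "Capital Two"))   # → False
```

A real vetting pipeline would need a far richer phonetic model than a lookup table, but the sketch shows why spelling-based uniqueness checks alone are not enough: “Capital One” and “Capital Won” are distinct strings yet identical to the ear.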

Voice Masquerading

The second vulnerability identified by the research has been dubbed ‘voice masquerading’. This method of exploiting how VPAs operate involves using a malicious skill/action to impersonate a legitimate one, with the aim of tricking a user into reading out personal information or account credentials, or of listening in on private conversations.

For example, the researchers were able to register five fake skills with Amazon which passed Amazon’s vetting process, used similar invocation names, and were found to have been invoked by a high proportion of users.

Private Conversation Sent To Phone Contact

These latest revelations come hot on the heels of recent reports of how a recording of the private conversation of a woman in Portland (US) was sent to one of her phone contacts without her authorisation after her Amazon Echo misinterpreted what she was saying.

What Does This Mean For Your Business?

VPAs are popular but still relatively new, and one positive aspect of this story is that these vulnerabilities have at least been identified now by researchers, so that changes can (hopefully) be made to counter the threats. Amazon has said that it conducts security reviews as part of its skill certification process, and it is hoped that the researchers’ ability to pass off fake skills successfully may make Amazon, Google and others look more carefully at their vetting processes.

VPAs are now destined for use in the workplace, e.g. business-focused versions of popular models and bespoke versions. In this young market there are, however, genuine fears about the security of IoT devices, and businesses may be particularly nervous about VPAs being used by malicious players to listen in on sensitive business information which could be used against them, e.g. for fraud or extortion. The big producers of VPAs will need to reassure businesses that they have built in enough security features and safeguards for businesses to fully trust their use in sensitive areas of the workplace.

Tech Tip – Alexa Skills Commands That Could Help At Work

Amazon’s Echo speakers may be used mainly in the home but, putting the listening/privacy fears aside, they can be useful in a business setting, particularly in small businesses and home offices. With this in mind, here are four skills commands that could help you:

Create Reminders – Alexa can act like a personal assistant. For example, you can tell Alexa exactly what you need to remember, e.g. business appointments on certain days/times, and it will remind you of that task at that time. To create a reminder, say the task and its time, such as, “Alexa, remind me to review customer accounts at 10 a.m. every Monday”.

Create Distinctive Voice Profiles – By setting up voice profiles, Alexa can distinguish who is issuing the command e.g. different people in the office can ask “Alexa, what’s on my calendar?” Ask Alexa for details of how to do it.

ChatBot Skill – By enabling the ChatBot skill and linking an Amazon account to a Slack account, workers can audibly ask Alexa to post to a specific Slack channel on their behalf. This can aid productivity.

Find Your Phone – You can use Alexa to help you find your phone using your voice. This is a free skill, available from Amazon here: https://www.amazon.com/gp/product/B076PHYQD2?ie=UTF8&ref-suffix=ss_rw. The phone should ring even if it is on silent, although it may not work if your phone is in Do Not Disturb mode. You can also add multiple people by name so that different phones can be found, not just one.

Alexa Records and Sends Private Conversation

A US woman has complained of feeling “invaded” after a private home conversation was recorded by her Amazon voice assistant and then sent to a random phone contact … who happened to be her husband’s employee.

What?!!

As first reported by US news outlet KIRO 7, the woman, identified only as ‘Danielle’, had a conversation about hardwood flooring in the privacy of her own home in Portland, Oregon. Unknown to her, however, Amazon’s voice assistant Alexa, via her Amazon Echo, not only recorded this seemingly ‘random’ conversation but then sent the recording to a random phone contact without being expressly asked to do so.

The woman was only made aware that she had been recorded when she was contacted by her husband’s employee, who lives over 100 miles away in Seattle and was able to tell her the subject of her recent conversation.

How Could It Have Happened?

Last year Amazon introduced a service whereby Amazon Echo users could sign up to the Alexa Calling and Messaging Service from the Alexa app. This means that all of the contacts saved to your mobile phone are linked to Alexa automatically, and you can call and message them using voice commands via your Echo.

In the case of the woman from Portland, Amazon has reportedly explained the incident as being the result of an “unlikely” string of events which were that:

  • Her Alexa started recording after it registered hearing its name or another “wake word” (chosen by users).
  • Subsequently, in the following conversation (about hardwood floors), Alexa registered part of the conversation as being a ‘send message’ request.
  • Alexa would/should then have asked, out loud, ‘To whom?’
  • It is believed that Alexa then interpreted part of the background conversation as a name in the woman’s phone contact list.
  • The selected contact was then sent a message containing the recording of the private conversation.

Investigated

The woman requested a refund for her voice assistant device, saying that she felt invaded.

Amazon has reportedly apologised for the incident, investigated what happened, and determined that it was an extremely rare occurrence. Amazon is, however, reported to be “taking steps” to prevent this from happening in the future.

Not The First Time

Amazon’s intelligent voice assistant has made the news in the past for unforeseen situations that have helped to perpetuate users’ fears that their home devices could have a more sinister dimension and/or could malfunction or be used to invade privacy. For example, back in 2016, US researchers found that they could hide commands in white noise played over loudspeakers and through YouTube videos in order to get smart devices to turn on flight mode or open a website. The researchers also found that they could embed commands directly into recordings of music or spoken text.

Also, although Amazon was cleared by an advertising watchdog, there was the case of a television advert for its Echo Dot smart speaker activating a viewer’s device and placing an order for cat food.

What Does This Mean For Your Business?

Although it may have been a series of events resulting in a ‘rare’ occurrence, the fact is that this appears to be a serious matter relating to user privacy, and it is likely to re-ignite fears that home digital assistants could be used as listening devices, or could be hacked and used to gather personal information for committing crimes, e.g. fraud or burglary.
If the lady in this case had been an EU citizen, it is likely that Amazon could have fallen foul of the new GDPR and, therefore, been potentially liable to a substantial fine if the relevant data protection authority thought it right and necessary.

Adding the Alexa Calling and Messaging service to these devices was really just the beginning of Amazon’s plans to add more services until we are using our digital assistants to help with many different and often personal aspects of our lives e.g. from ordering goods and making appointments, to interacting with apps to control the heating in the house, and more. News of this latest incident could, therefore, make some users nervous about taking the next steps to trusting Amazon’s Alexa with more personal details and important aspects of their daily lives.

Amazon may need to be more proactive and overt in explaining how it is addressing the important matters of privacy and security in its digital assistant and devices in order to gain the trust that will enable it to get an even bigger share in the expanding market, and successfully offer a wider range of services via Alexa and Echo devices.