Manufacturers of smart home devices are increasingly adding voice assistant features to a wide range of devices such as smart speakers, televisions, thermostats, security systems and doorbells. As a result, many of these devices are equipped with microphones, which raises significant privacy concerns. Users are not always informed when audio recordings are sent to the cloud and do not know who may gain access to the recordings.
Wrong wake words
The research team from the profile area cyber security (CYSEC) of TU Darmstadt and their partners have managed to prove that many devices with integrated voice assistants can unintentionally eavesdrop on conversations. Typically, voice assistants are activated by a wake word, as in the case of Amazon's Echo, which is usually woken up by sentences starting with “Alexa, …”. The team carried out an extensive series of experiments with voice assistants like Alexa and identified numerous English terms that Alexa wrongly interprets as wake words. These terms activate the voice assistant, which leads to an unexpected audio transmission. Examples include everyday words such as ‘letter’ or ‘mixer’. In their experiments the researchers used words spoken both by a synthesised voice and by human voices.
To tackle this problem, the researchers developed a proof-of-concept device that detects these unintended activations and built a fully functional prototype.
The device, called LeakyPick, can be placed in a user's smart home, where it periodically probes other voice assistants in its vicinity with audio commands. The subsequent network traffic is monitored for statistical patterns that indicate audio transmission. LeakyPick then identifies devices that are recording audio without the user's knowledge and informs them.
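The core idea can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: after playing an audio probe, a device's upload rate is compared against a baseline of its idle traffic, and a sustained burst of statistically unusual samples is taken as a sign that audio is being streamed to the cloud. The function name, thresholds, and data format are all assumptions for illustration.

```python
# Hypothetical sketch: flag a device as "transmitting audio" when its upload
# rate after an audio probe deviates strongly from an idle-traffic baseline.
from statistics import mean, stdev

def transmits_audio(idle_rates, probe_rates, z_threshold=3.0, min_burst=3):
    """idle_rates/probe_rates: bytes-per-second samples before/after probing."""
    mu, sigma = mean(idle_rates), stdev(idle_rates)
    # Count consecutive samples whose z-score exceeds the threshold; a
    # sustained burst suggests audio streaming rather than background chatter.
    burst = 0
    for rate in probe_rates:
        z = (rate - mu) / sigma if sigma > 0 else float("inf")
        burst = burst + 1 if z > z_threshold else 0
        if burst >= min_burst:
            return True
    return False

# Example: quiet baseline, then a clear upload burst after the probe word.
idle = [200, 180, 220, 210, 190, 205]
probe = [230, 5400, 5800, 6100, 5900, 240]
print(transmits_audio(idle, probe))  # prints True: sustained burst detected
```

In a real deployment the rate samples would come from passively captured per-device traffic; the z-score test here merely stands in for whatever statistical model the researchers use.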
The LeakyPick device could also help against a sophisticated attack on the voice assistant Alexa. In these attacks, the adversary sends wake words and commands in the ultrasonic range, which is inaudible to humans but understood by Alexa, and can for example place orders online with Amazon. The LeakyPick device detects such activity by comparing the resulting network traffic against statistical patterns of benign behaviour.
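One way to picture catching such an inaudible attack, again as a hypothetical sketch rather than the published method: an ultrasonic command triggers an upload burst even though a monitoring microphone heard nothing audible, so pairing each traffic sample with the room's audible sound level exposes silent activations. All names and thresholds below are illustrative assumptions.

```python
# Hypothetical sketch: flag an activation as "silent" when a device shows an
# upload burst while the room's audible sound level stays below a quiet floor.
def silent_activation(audible_db, upload_rates, db_floor=30.0, rate_threshold=1000.0):
    """audible_db[i]: audible sound level (dB); upload_rates[i]: bytes/s."""
    for db, rate in zip(audible_db, upload_rates):
        if rate > rate_threshold and db < db_floor:
            return True  # upload burst with no audible trigger
    return False

# Example: the room is quiet throughout, yet the device suddenly uploads.
print(silent_activation([25, 22, 24], [150, 5200, 180]))  # prints True
```

A normal voice command would coincide with an audible sound level above the floor, so it would not be flagged by this check.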
The device is currently in the prototype stage and is not yet available on the market.