Researchers have discovered a way to hack voice assistants such as Amazon’s Alexa, Apple’s Siri, and Google Assistant by shining a laser at the devices’ microphones. Experts at the University of Electro-Communications in Tokyo and the University of Michigan released a research paper detailing how so-called “Light Commands” allowed them to inject commands into voice assistants using laser light. They didn’t even have to be in the same building.
For seven months, the researchers tested the technique on 17 voice-controlled devices running Alexa, Siri, Facebook Portal, and Google Assistant. They devised three different setups using ordinary laser pointers, laser drivers, a telephoto lens, and an enhanced flashlight. They then exploited a vulnerability in the devices’ microphones that causes them to unintentionally respond to light as if it were sound.
By simply modulating the amplitude of the laser light, the researchers gained full control of the voice assistants at distances of up to 110 meters (361 feet). They took over one device on the fourth floor of an office building from the top of a tower on an adjacent building. They were also able to lock and unlock the doors and trunk of a Tesla Model S with Google Assistant’s EV car app installed, and to remotely open the doors and start the engine of a Ford car via the FordPass app.
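The amplitude-modulation idea can be sketched in a few lines of code. The snippet below is purely illustrative and is not the researchers’ actual tooling: it maps a toy audio waveform (standing in for a recorded voice command) onto a normalized laser intensity, keeping the beam always on via a DC bias so the light’s brightness rises and falls with the audio signal. All names and parameter values are assumptions for the sketch.

```python
import math

SAMPLE_RATE = 16_000   # samples per second, a common rate for voice audio
BIAS = 0.5             # DC offset keeps the laser on at all times (intensity >= 0)
DEPTH = 0.4            # modulation depth; BIAS + DEPTH <= 1.0 avoids clipping

def audio_to_laser_intensity(audio_samples):
    """Map audio samples in [-1, 1] to normalized laser intensity in [0, 1]."""
    return [BIAS + DEPTH * s for s in audio_samples]

# Toy "voice command": a 440 Hz tone lasting 10 ms.
audio = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
         for n in range(SAMPLE_RATE // 100)]

intensity = audio_to_laser_intensity(audio)
```

A MEMS microphone struck by this flickering beam produces an electrical signal tracking the intensity envelope, which the assistant then interprets as speech.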
While there are no known instances of an attacker using light commands in the wild, the researchers believe all devices that use MEMS microphones are susceptible to this type of attack. They say the only foolproof protection is to keep devices out of the line of sight of any window. The researchers have shared their findings with Amazon, Apple, Google, Tesla, and Ford, as well as with the FDA and the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT).