A group of researchers in the United States has conducted an interesting series of experiments on the security of digital voice assistants.
The scientists played commands to Siri and Alexa at frequencies inaccessible to the human ear. This makes it possible to issue a command to the assistant without the device's owner noticing. The results of the study were reported in The New York Times.
Researchers from the University of California, Berkeley, and Georgetown University in Washington have published a study in which they embedded audio commands in a piece of music or in ordinary speech.
When these recordings were played near an Amazon or Apple smart speaker, Siri or Alexa picked up the command and could, for example, add an item to the shopping list. People nearby did not hear anything addressed to the virtual assistant.
One of the most obvious ways to make money from such "hidden commands" is to trigger a phone call to a premium-rate line with expensive per-minute billing. A video from 2017 shows an example of this attack vector.
A phone can pick up the audio signal at a distance of up to seven and a half meters. In theory, an attacker could equip themselves with a special sound-transmitting device, go to a crowded place, and simply start broadcasting signals, inaudible to the human ear, that instruct nearby phones to call a certain number.
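The article does not explain how such inaudible commands are produced. One technique described in related academic work is to amplitude-modulate a voice command onto an ultrasonic carrier, so that all of the signal's energy sits above the roughly 20 kHz limit of human hearing. The following is a minimal, hypothetical Python sketch of that idea, not the researchers' actual method: the 400 Hz tone stands in for a real recorded command, and the 25 kHz carrier and 96 kHz sample rate are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: the study does not disclose its exact signal
# processing, so this shows a generic amplitude-modulation approach.
fs = 96_000          # sample rate high enough to represent ultrasound (Hz)
carrier_hz = 25_000  # carrier above ~20 kHz, the limit of human hearing
duration = 1.0       # seconds

t = np.arange(int(fs * duration)) / fs

# Stand-in for a recorded voice command: a 400 Hz tone with a smooth envelope.
command = np.sin(2 * np.pi * 400 * t) * np.hanning(t.size)

# Amplitude-modulate the "command" onto the ultrasonic carrier.
carrier = np.sin(2 * np.pi * carrier_hz * t)
inaudible = (0.5 + 0.5 * command) * carrier

# Check that the energy of the modulated signal sits above the audible band.
spectrum = np.abs(np.fft.rfft(inaudible))
freqs = np.fft.rfftfreq(inaudible.size, d=1 / fs)
ultrasonic_share = spectrum[freqs > 20_000].sum() / spectrum.sum()
print(round(float(ultrasonic_share), 3))
```

The modulation shifts the command into sidebands around the carrier (here about 24.6 and 25.4 kHz), which is why a speaker playing this waveform sounds silent to a person while a microphone with a nonlinear response can still recover the command.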
Follow our news in our Telegram channel, as well as in the MacDigger app for iOS.