What if your smartphone started making calls, sending text messages, and browsing malicious websites on its own, without ever asking you?
This is no fantasy, as hackers can make it happen by exploiting your smartphone’s personal assistant, like Siri or Google Now.
A team of security researchers from China’s Zhejiang University has discovered a clever way to activate voice recognition systems without speaking a word, by exploiting a security weakness that is apparently common across all major voice assistants.
Dubbed DolphinAttack, the technique works by feeding AI assistants commands at ultrasonic frequencies, which are too high-pitched for humans to hear but are readily picked up by the microphones on your smart devices.
With this technique, cyber criminals can “silently” whisper commands into your smartphone to hijack Siri or Alexa, forcing it to open malicious websites or even unlock your door if a smart lock is connected.
The attack works on every major voice recognition platform, affecting every mobile operating system including iOS and Android. So, whether you own an iPhone, a Nexus, or a Samsung, your device is at risk.
The attack takes advantage of the fact that human ears generally can’t hear sounds above 20 kHz, while the microphones on smart devices, and the software behind them, still register signals above that frequency.
So, to demonstrate DolphinAttack, the team first translated human voice commands into ultrasonic frequencies (over 20 kHz), then simply played them back from a regular smartphone fitted with an amplifier, an ultrasonic transducer, and a battery, extra hardware that costs less than $3.
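The "translation" step above is essentially amplitude modulation: the audible command is shifted onto an ultrasonic carrier so that all of the signal's energy sits above the range of human hearing. Here is a minimal sketch of that idea in Python with NumPy, using a synthetic 400 Hz tone as a stand-in for a recorded voice command (the sample rate, carrier frequency, and modulation depth are illustrative assumptions, not values from the researchers' paper):

```python
import numpy as np

FS = 192_000          # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 30_000   # carrier above the ~20 kHz limit of human hearing
DURATION_S = 0.5

t = np.arange(int(FS * DURATION_S)) / FS

# Stand-in for a recorded voice command: a 400 Hz tone.
# A real attack would use the actual speech waveform here.
command = np.sin(2 * np.pi * 400 * t)

# Amplitude-modulate the command onto the ultrasonic carrier.
# The resulting signal has no energy in the audible band, but the
# nonlinearity of a device's microphone can demodulate it back to
# the original baseband command.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
ultrasonic = (1 + 0.8 * command) * carrier

# Check that the signal's dominant frequency is above 20 kHz.
spectrum = np.abs(np.fft.rfft(ultrasonic))
freqs = np.fft.rfftfreq(len(ultrasonic), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak_hz:.0f} Hz")
```

The dominant spectral peak lands at the carrier frequency, with the command carried in sidebands around it, which is why the transmission is inaudible to bystanders while still intelligible to the assistant after demodulation.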
Source: The Hacker News