A team of security experts at the University of Florida, working with security audit company CertiK, has found that a certain class of cyberattacks could cause a smartphone to catch fire via its wireless charger. The team has posted a paper describing their research and results on the arXiv preprint server.
According to the researchers, "A charger can be manipulated to control voice assistants via inaudible voice commands, damage devices being charged through overcharging or overheating, and bypass Qi-standard specified foreign-object-detection mechanism to damage valuable items exposed to intense magnetic fields."
So if someone swaps your Qi charger for a malicious one, they can ruin your phone (or some other object the charger is supposed to detect as a foreign body) and maybe execute arbitrary voice commands... 🥱
I don't really get how they consider this a meaningful attack vector at all. Of course I can set the phone on fire if I can replace the charger - that's pretty much always going to be true, and there's no reasonable way to fix it. The only possible use I see is doing it when someone is not intentionally charging their phone, e.g. by holding a malicious charger close enough while the phone is in their pocket.
I feel this is (unintentionally) stretching the use of the word cyberattack. Rightly or wrongly, most people consider a cyberattack a form of hacking/attack that's executed via a network or the internet.
I know its true definition is any form of attack against data, a network, or a computing device (including smartphones), but this headline could easily lead people to think their phones could be set on fire by some anonymous l337 hAx0r over the internet.
While technically true, it requires a physical exploit first.
Using ultrasonic frequencies to induce vibration and transfer sound humans can't hear to voice assistants was demonstrated a few years ago. With the right equipment (nothing you can't find on AliExpress), this isn't too difficult.
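For anyone curious, the trick in those demos was amplitude-modulating normal voice audio onto an ultrasonic carrier; the microphone's nonlinearity demodulates it back into the audible band, so the assistant "hears" a command nobody in the room does. A minimal numpy sketch of just the modulation step (the carrier frequency, modulation depth, and the 1 kHz tone standing in for a voice recording are all illustrative):

```python
import numpy as np

fs = 192_000          # sample rate high enough to represent the carrier
carrier_hz = 25_000   # illustrative ultrasonic carrier, above human hearing
t = np.arange(int(fs * 0.5)) / fs  # half a second of samples

# Stand-in for a recorded voice command: a 1 kHz tone as the baseband audio.
baseband = np.sin(2 * np.pi * 1_000 * t)

# Amplitude-modulate the baseband onto the ultrasonic carrier. Played through
# an ultrasonic transducer, the mic's nonlinearity demodulates this back down.
modulated = (1 + 0.8 * baseband) * np.sin(2 * np.pi * carrier_hz * t)

# Sanity check: all the energy sits at the carrier and its sidebands,
# i.e. entirely outside the audible band.
spectrum = np.abs(np.fft.rfft(modulated))
peak_hz = np.fft.rfftfreq(len(modulated), 1 / fs)[np.argmax(spectrum)]
```

The hard part in practice isn't the math, it's the transducer hardware and aiming it - which is why "nothing you can't find on AliExpress" is doing a lot of work in that sentence.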
With modern smart assistants, you'll also need the owner's voice, though AI can clone that from a single conversation recorded at decent quality.
In practice, assistants are quite useless, though. Ask them anything dangerous, such as leaking contacts or sending files, and the phone will start showing you results from Google rather than actually doing something.
You could trick the phone into opening a website with an exploit kit, but then your target needs to be vulnerable anyway, and there are other options for doing that (e.g. buying ads with a very specific profile that only matches your target).
The physical harm of a fire is probably worse than anything you should expect out of a voice assistant attack.
Right, and Google uses those frequencies to pair Chromecasts - my point was that if they're using it (and aware of it), surely they have a way to detect (and filter) it too.