A simple design flaw makes it astoundingly easy to hack Siri and Alexa — Co.Design — “Using a technique called DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear.”
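For a sense of the mechanics, here is a minimal sketch of the modulation step: shifting an audible spoken command up onto an ultrasonic carrier via standard amplitude modulation, which a nonlinear microphone front end later demodulates back to baseband. The file names, 25 kHz carrier, and 96 kHz output rate are illustrative assumptions, not parameters taken from the paper.

```python
# Sketch of ultrasonic amplitude modulation in the DolphinAttack style.
# Assumes a WAV file of the spoken command; all names and frequencies
# here are hypothetical choices for illustration.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

CARRIER_HZ = 25_000   # above human hearing, still within many mics' response
OUT_RATE = 96_000     # output sample rate must exceed 2 * (carrier + voice bandwidth)

# Load the baseband voice command, collapse to mono, normalize to [-1, 1].
in_rate, voice = wavfile.read("command.wav")
voice = voice.astype(np.float64)
if voice.ndim > 1:
    voice = voice[:, 0]
voice /= np.max(np.abs(voice))

# Resample the command up to the high-rate timeline of the output file.
n_out = int(len(voice) * OUT_RATE / in_rate)
voice = resample(voice, n_out)

# Double-sideband AM with carrier: (1 + m(t)) * cos(2*pi*fc*t).
# The result contains no audible-band energy, but a nonlinear mic
# front end squares the signal and recovers m(t) at baseband.
t = np.arange(n_out) / OUT_RATE
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + voice) * carrier

# Scale to 16-bit PCM and write the inaudible playback file.
wavfile.write("ultrasonic.wav", OUT_RATE, (modulated / 2 * 32767).astype(np.int16))
```

Playing the resulting file through a tweeter capable of reproducing 25 kHz is what lets the command reach the assistant without anyone nearby hearing it.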
- Paris adds support for iPhone transit card top-ups
- One in three Indian households now use digital payments
- China unveils plan to build a global payment network based on central bank digital currencies
- Samsung unveils ultra-wideband SmartTags
- Hyundai to let drivers use their iPhone to unlock their vehicle