A new hack uses lasers to send inaudible commands to your Amazon Echo

A newly disclosed photoacoustic vulnerability in the MEMS microphones used by voice assistants such as Siri, Alexa, and Google Assistant allows attackers to use lasers to inject inaudible commands into smartphones and smart speakers, surreptitiously causing them to unlock doors, make purchases on e-commerce sites, and even start vehicles. The attack, dubbed Light Commands, was disclosed by researchers from the Tokyo-based University of Electro-Communications and the University of Michigan.

Read full article on The Next Web