Malicious Siri commands can be hidden in music and innocuous-sounding speech recordings
<p>A group of students from Berkeley has demonstrated how malicious commands to <a href="https://9to5mac.com/guides/siri/" target="_blank" rel="noopener">Siri</a>, Google Assistant, and Alexa can be hidden in recorded music or innocuous-sounding speech.</p>
<p>Simply playing the tracks over the radio, in a streaming music track, or in a podcast could allow attackers to take control of a smart home …</p>
<p><a href="https://9to5mac.com/2018/05/10/malicious-siri-commands/#more-532651" class="more-link">more…</a></p>