Title: Malicious Siri commands can be hidden in music and innocuous-sounding speech recordings Post by: HCK on May 13, 2018, 04:05:21 pm
<div class="feat-image">(https://9to5mac.files.wordpress.com/2018/05/homepod-siri-touch-area.jpg?quality=82&strip=all&w=1500)</div><p>A group of students from Berkeley has demonstrated how malicious commands to <a href="https://9to5mac.com/guides/siri/" target="_blank" rel="noopener">Siri</a>, Google Assistant and Alexa can be hidden in recorded music or innocuous-sounding speech.</p> <p>Simply playing such recordings over the radio, in a streamed music track or in a podcast could allow attackers to take control of a smart home …</p> <p> <a href="https://9to5mac.com/2018/05/10/malicious-siri-commands/#more-532651" class="more-link">more…</a></p> Source: Malicious Siri commands can be hidden in music and innocuous-sounding speech recordings (http://feedproxy.google.com/~r/9To5Mac-MacAllDay/~3/WWyGhTr3-GI/)