Author Topic: Siri 'distance activation hack'—what you need to know!
HCK
« on: October 16, 2015, 03:00:19 am »

Siri 'distance activation hack'—what you need to know!

<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p><a href='http://www.imore.com/siri-silent-control-hack-what-you-need-know' title="Siri 'distance activation hack'—what you need to know!"><img src='http://www.imore.com/sites/imore.com/files/styles/large_wm_blw/public/field/image/2015/10/iphone-6s-plus-hey-siri-hero.jpg?itok=8veJvpBn' />[/url]</p> <p class="intro">Hackers claim to be able to hijack Siri and Google Now from a short distance, so should you be concerned?</p> <p>Researchers from ANSSI, France's National Information System Security Agency, have demonstrated a "hack" where, using transmitters from a short distance away, they can trigger Apple's Siri and Google Now under certain specific circumstances. Wired:</p>
"The researchers' silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don't have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user's voice. (On iPhones, however, Siri is enabled from the lockscreen by default, with no such voice identity feature.) Another limitation is that attentive victims would likely be able to see that the phone was receiving mysterious voice commands and cancel them before their mischief was complete."
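That headphone requirement is the key constraint, since the cord is what acts as the receiving antenna. As a purely illustrative aside (not from the article or the research, and the function name is my own), a short Swift sketch using Apple's AVAudioSession API shows how an app can check whether a wired, microphone-equipped headset is currently connected, which is the precondition the researchers describe:

import AVFoundation

// Illustrative sketch only: the attack reportedly needs a wired headset with a
// microphone plugged in. This checks whether the current audio route includes one.
func wiredHeadsetWithMicIsConnected() -> Bool {
    let route = AVAudioSession.sharedInstance().currentRoute

    // Wired headphones show up as an output port on the current route...
    let hasWiredHeadphones = route.outputs.contains { $0.portType == .headphones }

    // ...and the headset's microphone shows up as an input port.
    let hasHeadsetMic = route.inputs.contains { $0.portType == .headsetMic }

    return hasWiredHeadphones && hasHeadsetMic
}

None of that changes the attack itself; it just illustrates how specific the required conditions are.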
Wait, why do Wired's headline and lede paragraph focus only on Apple's Siri when the demo and the rest of the article talk about Google Now?

Good question.

But my iPhone asked about Siri at setup and does have Voice ID. What gives?

Mine too, and I'm not entirely sure. There seem to be several glaring errors in the article as published.

- As of iOS 9, the iPhone absolutely does have a Voice ID feature, which is part of the setup process.
- If you buy a new iPhone that comes pre-installed with iOS 9, it will ask during setup if you want to enable "Hey Siri", and then require you to go through a setup process to enable it. (And, if you do, it defaults to Lock screen access, because that's the whole point of hands-free.)
- If you upgrade an existing iPhone to iOS 9 and it supports "Hey Siri", the first time you enable it, or toggle it off and back on, it will require you to go through the setup.
- Only the iPhone 6s and iPhone 6s Plus can do persistent "Hey Siri". Older iPhones can only do "Hey Siri" when plugged into power, and only if it's expressly enabled in Settings. While battery packs are a possibility, most iPhones connected to headphones on the go probably won't be in that state.

The article does state that, absent "Hey Siri", the "hackers" can spoof the audio signal used by the headset button to trigger Siri. But again, it's unclear how they could suppress Siri's audio replies and confirmations.

And doesn't Siri give audio replies and confirmations as well?

Indeed. You don't have to be visually attentive to catch mysterious voice commands, because Siri responds with audio replies you can hear.

While connected headphones stuffed into pockets are possible, they're probably not the most common situation.

So why does the headline call it a "silent" hack?

That is unclear.

At what distances does this "hack" work?

Wired says 16 feet in its headline, but later elaborates:
"In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack's range to more than 16 feet."
What can be done if someone activates voice control from a distance?

From earlier in the article:
"Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker's number to turn the phone into an eavesdropping device, send the phone's browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter."
Communications like calls and texts can be triggered directly from Siri. Other features, like using Siri to go to a website, require a passcode or Touch ID unlock first. And that's if you could get Siri to recognize the likely obscure and not easily rendered name of a malicious website and request it to begin with.

It's also unclear how an audio transmission could form spam or phishing messages precisely enough to be functional. (Again, try getting Siri to render a complex URL for you and see how far you get.)

But everything from podcasts to prankster friends has been triggering voice activation for years, right?

Right. More Wired:
"any smartphone's voice features could represent a security liability—whether from an attacker with the phone in hand or one that's hidden in the next room."
While true, of course, it's absolutely nothing new. When Google Now debuted, especially on Google Glass, CES pranksters loved to jump into rooms filled with early adopters and yell out search requests for... various parts of the human anatomy.

That's why Apple and other vendors have added Voice ID technology.

The difference here is a clever use of transmitters by security researchers, unfortunately wrapped in what seems like really poor reporting.

Can Apple and Google prevent this type of "hack"?

The researchers make some recommendations for mitigating the "hack", including the ability to set custom trigger words. Some Android devices let you do that already, and I've been wishing for it on iOS for a while as well.

To prevent spoofing of the button press, they also recommend enhanced shielding in headphone cords. Though no doubt an expense, even if only the most popular brands implemented it, that would reduce the potential attack surface of the "hack".

So, should I be worried about any of this?

As usual, it's something to be aware of but not overly concerned about. Once again, we should all be more concerned about the state of security reporting at mainstream publications.

Siri and Google Now are enabling and empowering technologies that help people live better lives. We should all be informed and educated about any potential security issues, but we shouldn't be sensationalized or made to feel scared in any way.

What, if anything, should I do?

The iPhone 6s implements Voice ID, which profoundly reduces the chances of third-party activations, whether accidental, prank, or malicious. Keeping your headphones on when they're plugged in mitigates the potential consequences of any third-party activations as well, because you can hear them and intervene.

Security and convenience are almost always at odds. Siri, Google Now, "Hey Siri", and "OK Google" provide increased convenience at the expense of some security. If you don't use or need voice activation or Lock screen access, by all means turn them off.

Updated 3:30pm: Further explained how "Hey Siri" Voice ID setup works.
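For anyone who does want to turn these off, the relevant switches on iOS 9 should be found under Settings (from memory, so the exact wording may differ slightly on your device): Settings > General > Siri to disable Siri or the "Allow 'Hey Siri'" toggle, and Settings > Touch ID & Passcode > Allow Access When Locked to keep Siri off the Lock screen. On Android, "OK Google" detection can likewise be limited from the Google app's voice settings.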

<a href="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/1/rc.htm" rel="nofollow"><img src="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/1/rc.img" border="0"/>[/url]

<a href="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/2/rc.htm" rel="nofollow"><img src="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/2/rc.img" border="0"/>[/url]

<a href="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/3/rc.htm" rel="nofollow"><img src="http://rc.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/rc/3/rc.img" border="0"/>[/url]

<img src="[url]http://da.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/a2.img" border="0"/>[/url]<img width="1" height="1" src="http://pi.feedsportal.com/r/241226034742/u/49/f/616881/c/33998/s/4ab4263b/sc/15/a2t.img" border="0"/><img width='1' height='1' src='' border='0'/><img src="http://feeds.feedburner.com/~r/TheIphoneBlog/~4/MNTf-wpVeZY" height="1" width="1" alt=""/>

Source: Siri 'distance activation hack'—what you need to know! (iMore: http://www.imore.com/siri-silent-control-hack-what-you-need-know)