Apple confirms "a small portion" of Siri recordings get reviewed by contractors
First Amazon, then Google, and now Apple: it seems that every company with a digital assistant to its name uses human beings to review a selection of the interactions that users are having with their smart speakers and phones.
After a whistleblower tipped off the Guardian to the practice regarding Siri, Apple confirmed that "a small portion of Siri requests" do get reviewed by contracted workers, though the recordings are not linked to an Apple ID.
Information such as location, contact details, and other app data is logged and included with the recordings, according to the anonymous tipster who contacted the Guardian.
"Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements," Apple says, adding that less than 1 percent of daily Siri requests get reviewed in this way.
Easy listening
The Guardian's source says a lot of the recordings come from accidental activations: conversations about medical details, drug deals, and sexual encounters have all apparently been captured and reviewed.
As with Amazon Alexa and Google Assistant, the aim of this review process is to improve accuracy, Apple says – workers have to grade the clips, usually just a couple of seconds long, on whether Siri dealt with the interaction appropriately.
It's still a little disconcerting that real-life human beings could be listening to your daily chit-chat if Siri is within earshot. At the moment there's no way of opting out of having your recordings reviewed in this way.
Given Apple's focus on user privacy, it may well take steps to further anonymise the recordings before they're reviewed, or give users more options over how their recordings get processed. As is often the case, though, it's taken some investigative reporting to bring the practice to light.
Via TechCrunch