
Apple contractors listening to Siri requests hear sex, drug deals, and more


Apple contractors routinely hear sensitive recordings, including confidential medical information, couples having sex, and drug deals, as part of their quality-control work on the company’s virtual assistant Siri, The Guardian reports.

The recordings are passed on to contractors who are asked to determine whether the activation of Siri was intentional or accidental and to grade Siri’s responses.

Less than 1% of daily Siri activations are sent on to a human for grading. However, Apple does not expressly tell customers that their recordings might be used in this way. The issue was brought to light by an anonymous whistleblower who spoke to The Guardian. That individual said the recordings often capture sexual encounters as well as business dealings, and that they feel Apple should make clear to users that Siri content might be reviewed by a human.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple told The Guardian in a statement. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

We reached out to Apple for additional details but have yet to receive a response. We’ll update this story if we hear back.

Siri can sometimes turn on and start listening when it thinks it has heard its wake phrase (typically “Hey Siri”) or something that sounds like it, even if you never meant to activate it.
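To see why that happens, here is a minimal sketch of threshold-based wake-phrase detection, written in Swift. This is purely illustrative and not Apple’s implementation; the detector, the scores, and the 0.80 cutoff are all invented for the example. The point is simply that any audio scoring above the cutoff activates the assistant, so speech that merely sounds like the wake phrase can trip it.

    // Hypothetical sketch, not Apple's code: a detector that fires
    // whenever an audio frame's similarity score to the wake phrase
    // crosses a fixed threshold.
    struct WakeWordDetector {
        let threshold: Double  // invented confidence cutoff

        // `score` stands in for the output of some upstream acoustic
        // model (assumed here, not shown).
        func shouldActivate(score: Double) -> Bool {
            score >= threshold
        }
    }

    let detector = WakeWordDetector(threshold: 0.80)

    // Simulated similarity scores for three utterances.
    let utterances: [(phrase: String, score: Double)] = [
        ("Hey Siri", 0.97),        // intentional: activates
        ("Hey, seriously", 0.83),  // sound-alike: accidental activation
        ("Hi Sarah", 0.41),        // clearly different: ignored
    ]

    for u in utterances {
        let fired = detector.shouldActivate(score: u.score)
        print("\(u.phrase): \(fired ? "Siri activates" : "no activation")")
    }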

The human beings who listen to these conversations (or worse) work to determine what the person who was recorded was asking for and whether Siri provided it. If not, they assess whether Siri should realistically have been able to answer the request.

If the complaints about Apple sound familiar, it’s likely because Amazon battled a similar issue earlier this year. Amazon likewise sends recordings to humans for later analysis, and it retains text data of requests even when recordings are deleted, but the company offers an option within Alexa’s settings that lets customers opt out of having their data used for that purpose.

Apple does not currently offer an opt-out option for Siri.
