After Criticism, Apple Puts an End to Third-Party Companies Listening to Siri


As useful as digital personal assistants such as Alexa, Siri, and Google Assistant are, users have come to realize that these assistants are still digital services: they retain data, and the many requests and conversations you have with them are stored on record somewhere.

Privacy-focused Apple has reversed course and stopped the practice of having humans listen to Siri recordings. The company had previously admitted it was making these recordings available to third-party contractors.

Only Siri Is Listening

Once invoked on an Apple device, Siri will send you a message that reads, “Go ahead … I’m listening.” But it really should say “we’re listening,” or maybe even “whoever wants to is listening.”

It was recently reported that some Siri questions and conversations were being listened to for “grading” purposes. The Guardian reported that the requests were being “analyzed to improve Siri and dictation.”

Apple also noted at the time that “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Yet it was also reported that the third-party contractors who were given access to the Siri recordings inadvertently heard “confidential medical information, drug deals, and recordings of couples having sex.”


Apple has long been known for its focus on privacy, so this didn’t sit well with its users. That kind of behavior may be expected from Google, but Apple loyalists don’t expect it from Apple.

The company will give users the option to opt out of the sharing practice in a future software update, but for now, it is ending the practice of humans listening to Siri recordings entirely.

“We are committed to developing a great Siri experience while protecting user privacy,” vowed Apple in a statement. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

There is much to like here, though it’s understandable that many users won’t trust Apple’s claim that it is really ending the practice. Still, Google is not known for halting any of its anti-privacy practices, and Amazon has never suspended a comparable program while it worked on a fix.

What You Can Do Now

If you don’t trust Apple when it says it is suspending the practice until a software update can fix it, we have covered several options for stopping Apple from listening in on your conversations with Siri.

Above all, no matter what Apple, Amazon, and Google promise about their digital voice assistants, it’s probably best to assume you’re always being listened to and to avoid discussing anything private around them.

Does any of this put you at ease? Or are you still bothered that Apple was doing this with Siri all along? Let us know what you think about Apple’s Siri practices and the suspension of grading in the comments below.
