
Apple Apologizes For Listening To Users’ Conversations With Siri

http://dailycaller.com/

Apple apologized Wednesday for listening to recordings of people talking to Siri, the company’s digital voice assistant.

The tech giant’s grading program used fewer than 0.2% of audio Siri requests and their transcripts “to measure how well Siri was responding and to improve its reliability,” according to Apple.

“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” reads Apple’s statement, which then lists changes it is making to Siri.


An anonymous whistleblower told The Guardian in July that there had been “countless” instances in which contractors could hear private conversations involving doctors and patients, business dealings, potential criminals and sexual partners, and that those recordings were also linked to location, contact details and app data.


“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” the apology states. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

While Apple acknowledges the importance of privacy as “a human right” in the beginning of its statement, it goes on to say that users’ personal data “makes Siri better.”

“In order for Siri to more accurately complete personalized tasks, it collects and stores certain information from your device. … Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that ‘trains’ Siri to improve,” the statement reads.

Apple plans to end its practice of hiring contractors to listen to real human conversations in an effort to improve Siri’s performance, and will instead rely on computer-generated transcripts to help the voice assistant become more accurate. Apple will also ask users for consent before their audio samples are used; for those who opt in, only Apple employees will be allowed to listen.

“Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve,” the statement concludes.


About Audrey Conklin

One comment

  1. Force Recon Marine

    Better to ask forgiveness than ask for permission??? There is no forgiving assholes who invade your privacy then profit from it!!! That is why I have no social media presence and only have a flip phone. I limit my electronic footprint as much as possible, employing proxy services for communications.

    It can be a hassle but I find it more than worth the effort
