“Improving Siri’s privacy protections” (best headline!)

We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

Apple PR

A good outcome to a not-so-good story. Apple could have come up with these “improvements” even without the public attention.

Still, it’s nice that Apple a) phrases the most important changes in plain language, b) puts them right at the top of the press release, and c) has set up a support page.

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.