If this is true, then Siri is recording me saying some not-so-nice things both 
to and about her parentage.

So I hope Apple is taking notice.

-----Original Message-----
From: [email protected] <[email protected]> On 
Behalf Of Karen Lewellen
Sent: Friday, 30 August 2019 6:13 AM
To: [email protected]
Subject: Re: Apple apologizes for Siri audio recordings, announces privacy 
changes going forward, The Verge

Half a moment.
Am I to understand that one gets recorded automatically when using Siri, or 
any of the others? Why?
If Apple or Google or Amazon wants to improve how their voice services learn, 
hire staff to do the testing in house. Even the computer transcript business 
seems, speaking personally, nonsensical.
There is a question here.
I have an unlocked iPhone. My goals will never involve using the phone for 
calls. Is there any value to me, as a consumer, in locking this phone to a 
service?
Good on The Guardian for cracking the story!
Karen



On Thu, 29 Aug 2019, Steve Matzura wrote:

> Believe any of that, and I have a bridge you might want to buy. Just 
> like Zuckerberg, they got caught with their pants down, and just like 
> Zuckerberg, this won't be the last time.
>
>
> On 8/28/2019 12:29 PM, M. Taylor wrote:
>> Apple apologizes for Siri audio recordings, announces privacy changes
>> going forward
>> By Chaim Gartenberg, Aug 28, 2019, 11:07am EDT
>>
>> Apple has issued a formal apology for its privacy practices of secretly
>> having human contractors listen to recordings of customers talking to
>> its Siri digital assistant to improve the service. "We realize we
>> haven't been fully living up to our high ideals, and for that we
>> apologize," Apple's statement reads.
>> The company also announced several changes to Siri's privacy policy:
>> First, by default, we will no longer retain audio recordings of Siri
>> interactions. We will continue to use computer-generated transcripts to
>> help Siri improve.
>> Second, users will be able to opt in to help Siri improve by learning
>> from the audio samples of their requests. We hope that many people will
>> choose to help Siri get better, knowing that Apple respects their data
>> and has strong privacy controls in place. Those who choose to
>> participate will be able to opt out at any time.
>> Third, when customers opt in, only Apple employees will be allowed to
>> listen to audio samples of the Siri interactions. Our team will work to
>> delete any recording which is determined to be an inadvertent trigger
>> of Siri.
>>
>> Apple was one of several major tech companies - including Google,
>> Amazon, Facebook, and Microsoft - that was caught using paid human
>> contractors to review recordings from its digital assistant, a fact
>> that wasn't made clear to customers. According to The Guardian's
>> report, those contractors had access to recordings that were full of
>> private details, often due to accidental Siri triggers, and workers
>> were said to each be listening to up to 1,000 recordings a day.
>> In the aftermath of that report, Apple announced that it would suspend
>> the grading program that would see those recordings reviewed. "We are
>> committed to delivering a great Siri experience while protecting user
>> privacy," an Apple spokesperson said in a statement to The Verge at the
>> time.
>> Previously, Apple policy would keep random recordings from Siri for up
>> to six months, after which it would remove identifying information for
>> a copy that it would keep for two years or more.
>> Per today's announcement, both the non-optional recording and the
>> subsequent grading policies are now being suspended for good. Apple
>> says it will no longer keep audio recordings from Siri unless a user
>> specifically opts in. And in cases where customers do choose to give
>> Apple their data, only Apple employees will have access (not, it would
>> seem to imply, hired contractors).
>> The company additionally promises that it will work to delete
>> recordings of accidental triggers, which The Guardian's report claims
>> were the main source of sensitive information.
>> According to Apple's statement, the company plans to resume grading
>> Siri recordings under those new policies later this fall, following a
>> software update that adds the new opt-in option to its devices.
>>
>> Original Article at:
>> https://www.theverge.com/2019/8/28/20836760/apple-apology-siri-audio-recordings-privacy-changes-contractors
>> 
>> 
>

-- 
The following information is important for all members of the Mac Visionaries 
list.

If you have any questions or concerns about the running of this list, or if you 
feel that a member's post is inappropriate, please contact the owners or 
moderators directly rather than posting on the list itself.

Your Mac Visionaries list moderator is Mark Taylor.  You can reach mark at:  
[email protected] and your owner is Cara Quinn - you can reach Cara at 
[email protected]

The archives for this list can be searched at:
http://www.mail-archive.com/[email protected]/
--- 
You received this message because you are subscribed to the Google Groups 
"MacVisionaries" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/macvisionaries/SY2PR01MB29873F24CEBBFD5A2E5C7B9C8ABD0%40SY2PR01MB2987.ausprd01.prod.outlook.com.
