Apple has apologised for allowing its contractors to listen to Siri audio recordings in order to test the assistant's reliability. The contractors would then assign a grade to each request after completing a "thorough review" of the recording. However, Apple said that it will be making changes to this practice and has since suspended human grading of Siri requests.
The grading programme was part of Siri's quality evaluation process, and the review came after multiple media outlets, including The Guardian, reported in July that Apple's contractors regularly heard confidential information as part of their quality control work.
“As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologise,” Apple said in a statement. Apple added that it plans to resume the Siri grading programme when software updates are released to its users, but only after making changes such as using computer-generated transcripts to help Siri improve. As part of the changes, Apple will no longer retain audio recordings of Siri interactions.
“Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve,” the company said in the statement.
Users can opt in to help Siri improve by learning from the audio samples of their requests. Those who choose to participate will be able to opt out at any time. When consumers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. The team will work to delete any recording which is determined to be an inadvertent trigger of Siri, Apple added.
“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” the statement read.
While Apple is trying to ensure that there are measures in place to protect users' data, the company also explained that consumers' data will enable it to make Siri work better. For example, in order for Siri to more accurately complete personalised tasks, it collects and stores certain information from users' devices.
“Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that ‘trains’ Siri to improve,” the company said in the statement.
Apple also explained in the statement that Siri has been engineered to protect user privacy “from the beginning” by minimising the amount of data Siri collects. In fact, Siri uses as little data as possible to deliver an accurate result.
Apple added that it does not use the Siri data stored in its servers to build a marketing profile and “never [sells] it to anyone”.
It added, “We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”
Siri also uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to users’ identity through their Apple ID or phone number.
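This pattern — keying in-flight data to a random per-device token rather than to an account identity — can be sketched in a few lines of Python. This is an illustrative sketch only, not Apple's actual implementation; all names (`new_device_identifier`, `RequestStore`) are hypothetical.

```python
import secrets

def new_device_identifier() -> str:
    """Generate a long random string of letters and numbers.

    Hypothetical stand-in for a per-device random identifier; not
    tied to any user ID or phone number.
    """
    return secrets.token_hex(16)  # 32 hex characters

class RequestStore:
    """Keys request data by random identifier, never by user identity."""

    def __init__(self) -> None:
        self._data: dict[str, list[str]] = {}

    def record(self, device_id: str, transcript: str) -> None:
        # Data is grouped under the opaque token while being processed.
        self._data.setdefault(device_id, []).append(transcript)

    def forget(self, device_id: str) -> None:
        # Discarding the token unlinks the stored data from the device.
        self._data.pop(device_id, None)

store = RequestStore()
device_id = new_device_identifier()
store.record(device_id, "What's the weather today?")
```

Because the identifier is random and revocable, discarding or rotating it severs any link between the processed requests and the device that made them.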
(Photo courtesy: 123RF)