Apple is attempting to assuage privacy concerns about its digital assistant Siri with some sweeping changes to the program’s default privacy settings.
In a blog post titled “Improving Siri’s privacy protections,” the company pledged to delete Siri recordings by default and to allow users to choose human review for some of their voice recordings, though it didn’t say when the change would go into effect.
“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,” a spokesperson wrote.
On Wednesday, Apple also announced that it would only allow employees to review recordings going forward, rather than relying on contractors for the work.
In July, The Guardian revealed that Apple evaluates the quality of Siri’s responses via human contractors, who listen to snippets of recordings and grade the relevance of the action Siri took to fulfill a user’s request—a revelation that outraged users. The activities captured in those snippets were often intimate, including drug deals, hookups, and hospital visits, and users had frequently activated Siri by accident.
The tech giant paused human review in early August in response to the controversy but plans to resume it this fall, according to the blog post. TechCrunch reported that the switch to opt-in audio review would happen as part of an upcoming software update.
The same day it announced the changes to Siri’s default privacy settings, the company laid off more than 300 contractors in Europe who were responsible for evaluating the quality of Siri’s responses, according to The Guardian. One of the most widely used voice-activated assistants in the world, Siri comes pre-installed on iPhones, iPads, Mac computers, HomePods, and Apple Watches.
Apple’s decision follows months of reports documenting how various tech companies use human contractors—rather than solely artificial intelligence systems, as most users might assume—to train voice-activated programs like Amazon’s Alexa and the Google Assistant. Apple did not immediately respond to a request for comment.