
Google Assistant voices new privacy actions for devices

Jason Schott

Google has responded to the outrage over privacy issues involving its Google Assistant-powered products by adjusting how it stores audio recordings.

The concerns arose from Google’s process in which language experts can listen to and transcribe audio data from the Assistant to help improve speech technology for different languages.

“It's clear that we fell short of our high standards in making it easy for you to understand how your data is used, and we apologize,” Nino Tasca, Senior Product Manager for Google Assistant, wrote in a Google blog post this week. “When we learned about these concerns, we immediately paused this process of human transcription globally to investigate, and conducted a full review of our systems and controls.”

Google claims that, by default, it has not retained people’s audio recordings, and that this will continue to be the case.

If users want to store audio data, they can opt in to the Voice & Audio Activity (VAA) setting when their Assistant is set up. Opting in helps the Assistant better recognize the user’s voice over time, and it helps improve the Assistant for everyone by allowing Google to use small samples of audio to understand more languages and accents. People can view their past interactions with the Assistant and delete any of those interactions whenever they like.

Google is also updating its settings to highlight that when VAA is turned on, human reviewers may listen to audio snippets to help improve speech technology. Existing Assistant users will have the option to review their VAA setting and confirm their preference before any human review process resumes. Audio will not be included in the human review process unless the user has reconfirmed that the VAA setting is on.

“We take a number of precautions to protect data during the human review process,” wrote Tasca. “Audio snippets are never associated with any user accounts and language experts only listen to a small set of queries (around 0.2 percent of all user audio snippets), only from users with VAA turned on. Going forward, we’re adding greater security protections to this process, including an extra layer of privacy filters.”

Google further claims that the Assistant already deletes any audio data immediately when it realizes it was activated unintentionally, for example, by a noise that sounds like “Hey Google.” The company says it understands the importance of getting this right and will continue to focus on this area, including implementing additional measures to better identify unintentional activations and exclude them from the human review process. Google will also soon add a way to adjust how sensitive Assistant devices are to prompts like “Hey Google,” giving users more control to reduce unintentional activations or, if they prefer, to make it easier to get help, especially in noisy environments.

The company says it strives to minimize the amount of data it stores, and it is applying this approach to the Google Assistant as well. Google's policy will be updated to vastly reduce the amount of audio data stored: for users who have opted in to VAA, Google will soon automatically delete the vast majority of audio data that is more than a few months old. This new policy will come to VAA later this year.

The announcement concludes with a promise that big technology companies have often struggled to keep: “We believe in putting you in control of your data, and we always work to keep it safe.”
