
We’re a privacy-first voice company

One of the first conversations our team will have with new clients — whether you’re in the gaming, media, toy, or education industry — is about voice data privacy. Children’s need for digital and data privacy protections far exceeds that of adults, and SoapBox is a privacy-first company.

Our privacy-by-design approach informs every process and system in the company, from how we build and deploy our speech recognition solutions, to how we generate, present, and contextualize the voice data we return to our education and entertainment clients.


Six things to know about voice data privacy at SoapBox

  • We are externally audited for compliance with COPPA and GDPR.
  • All voice data sent to our system is anonymized and de-identified.
  • All voice data is encrypted in our system (in transit and at rest).
  • Our clients decide on the jurisdiction for data processing and whether data is retained post-processing.
  • Any voice data retained in our system is only used to improve the accuracy of our voice engine and is never used for marketing or shared with third parties for any other purpose.
  • The fundamental digital rights of end users are protected by our approach (e.g., right of deletion).

PRIVO has been working with SoapBox Labs since its founding in 2013. We’re proud to count them as one of our esteemed customers and partners.

Denise G. Tayloe, Co-founder & CEO of PRIVO – Safe Harbor Service

What is privacy-by-design?

SoapBox believes in a proactive approach to voice data privacy. That approach, called privacy-by-design, means we build safeguards into our processes from the outset to ensure we never identify an individual child when processing their voice data.

In this video, CEO Dr. Martyn Farrows explains further.

Frequently asked questions

Why is voice data privacy so important for kids?

If there’s a privacy concern when it comes to adult users of speech recognition technology, that concern is much greater when it comes to kids. Here are three reasons why:

  1. Voice gives kids agency and allows them to learn and play in the most natural way possible – using their voices. But voice is another entry point to the internet, and we must be mindful of what we let our kids access, play, and digitally control when using their voices.
  2. Voice data is a form of biometric data. It conveys more than just what was said. Voice data conveys our identity, our emotions, our intentions, our environment, and certain health conditions. Even our socioeconomic and educational background can be inferred from our accents and dialects.
  3. The data kids share on consumer platforms, be it voice data or otherwise, can be used to create profiles of them as individual consumers, and these profiles can follow them throughout their lifetime.

To learn more about the importance of protecting kids’ data privacy, read our interview with children’s data privacy expert Dr. Veronica Barassi.

How does SoapBox’s approach to privacy differ from other speech recognition companies?

When SoapBox Labs entered the voice technology market in 2013, it still felt like the wild west in terms of data privacy compliance for kids. Even so, a privacy-by-design approach to data and technology was part of our DNA from the very beginning. It wasn’t faster or cheaper to design-in privacy. Still, as a deep-tech, kid-focused company, we wanted to set ourselves up for long-term success – and that meant respecting every kid’s fundamental right to privacy.

To this day, other speech recognition software providers have competing business models that process and sell voice data for its commercial value. Not SoapBox. Here are the steps we take — and encourage all players in the voice industry to take — to ensure voice technology does not threaten kids’ rights:

  1. Deliver transparency on how kids’ voice data is used once it is stored.
  2. Commit to treat kids’ data differently to adults’ data, even with consent.
  3. Commit to identify voice data captured without consent (e.g., from a playmate or visitor to the home whose parent has not given consent) and delete it.
  4. Develop kid/adult voice classifiers to protect kids from adult-centered digital environments and to recognize when parental consent is needed before storing voice data.
  5. Develop voice solutions where processing of voice data is embedded on-device.

What questions should I ask my voice or speech technology company to understand how they treat privacy?

Kid-focused companies need to partner with a voice provider that takes all necessary steps to protect kids’ data privacy. Ask your voice provider these questions to ensure privacy is one of their top priorities:

  • How do you store voice data?
  • What advertising, marketing, and other revenue models are associated with your data?
  • Do you reuse or sell voice data?
  • Who owns the kids’ voice data on your system?
  • Do you employ a safe harbor company to certify your approach to voice data privacy? 
  • What rights do I have to the voice data processed by your system?

What other resources would you recommend to learn about kids’ privacy?

Great question! There are lots of fantastic organizations doing the important work of promoting kids’ digital privacy rights. We recommend checking out these resources to learn more:

  • Sign up for the SoapBox newsletter and newsletters from other leading organizations in the kids and safety space like PRIVO and SuperAwesome.
  • Read the latest privacy reports and AI toolkits from the World Economic Forum.
  • Learn about the UK’s and California’s age-appropriate design codes. 
  • Understand the impacts of regional data protection legislation, like COPPA and GDPR.
  • Check out the UK’s new Online Safety Bill, which further enhances protections for children.
  • Read the EU’s new strategy to deliver a better internet for children (an addition to their existing GDPR legislation).
  • Keep up to date with the new US Kids Online Safety Act, introduced in early 2022, which requires online platforms to provide parents and minors with “easy-to-use” tools to keep them safe, limit screen time, and protect their data.

Have a privacy-related question for our in-house privacy experts?