This Just In: Emerging Threats for Kids’ Digital Privacy

April 20, 2022


This year is set to be an eventful year for kids’ digital privacy.

In February, California’s state legislature unveiled the California Age-Appropriate Design Code Act. The proposed legislation comes on the heels of similar efforts in the UK to require online services (apps, search engines, online games, etc.) to protect kids’ data by, for example, offering strong privacy protections by default.

At the beginning of March, President Biden emphasized the need for better privacy protections for kids in his State of the Union address. “It’s time to strengthen privacy protections, ban targeted advertising to children, demand tech companies stop collecting personal data on our children,” he stated clearly. 

And recently, a UK judge ruled that a lawsuit claiming TikTok uses children’s data unlawfully can proceed. If the action succeeds, TikTok could have to pay billions of dollars in damages for misusing millions of kids’ online data.

Regulations and efforts like these signal strong momentum in the kids’ digital privacy space, and represent a step forward for privacy advocates like us here at SoapBox. 

The work to fully protect children’s data privacy, however, is only just beginning. 

New challenges on the horizon for kids’ digital privacy

According to the European Commission, kids make up approximately one third of internet users. They’re accessing the internet at younger and younger ages — scrolling through social media, playing games, etc. — often without adult supervision. 

With so many children online, it’s important to be aware of the next generation of emerging privacy threats. 

A growing number of technologies can capture voice prints, biometrics, and psychosocial data

Recent technological advances have made it possible to collect and parse children’s biometric and psychosocial information. This could be used to create unique ID profiles that follow children across a lifetime.

As we’ve discussed before, human voices — a type of biometric data — are as unique as fingerprints and can communicate a wealth of data about us, from our age, gender, and health status to our educational and socioeconomic background.


And as the popularity of voice-enabled experiences grows, so too do the related privacy concerns, especially when it comes to kids. Gaming is one example of an online experience that may be especially problematic from a kids’ digital privacy perspective in the coming years, and one we’ll explore in more depth in future SoapBox articles on privacy.

Current digital privacy laws and the metaverse

From a regulatory perspective, the metaverse is new and uncharted territory. But the more we see of these digital-world experiences and of emerging technologies like eye- and gait-tracking, the greater and more urgent the need to get kids’ data privacy right.

Every new innovation cycle for the last 30 years has pitted the tech industry against privacy — from ecommerce and online banking, to AI and facial recognition, all the way back to the invention of the internet itself. 

As this VentureBeat article explains, the aims of the metaverse and of data privacy are fundamentally at odds: the metaverse wants to dive deeper into personal data, whereas data privacy wants to protect it.

As Peter Evans, CEO of threat detection systems provider Patriot One Technologies, explains: “We see these issues repeating themselves over and over again, with governments and data privacy often lagging. […] With each new iteration of innovation, we see an order-of-magnitude jump in both business benefits as well as the complexity of data privacy issues.” 

Our solution: A two-pronged approach to kids’ digital privacy

Since our founding in 2013, SoapBox Labs has been committed to building voice technology that fundamentally protects kids’ voice data privacy. It wasn’t faster or cheaper to build privacy into our technology and processes, but we’re on a mission to create more immersive, fun, and magical experiences for kids, and protecting their voice data privacy is an intrinsic part of that mission. 

Privacy by design

SoapBox Labs is a privacy-first company. We respect children’s privacy rights at all stages of product design and engineering. This enables us to power privacy-first voice-enabled experiences for our education and entertainment clients and their end users from 2 to 12 years old.

Unlike companies whose voice offerings co-exist alongside core advertising or marketing business models, we never share or sell the data gathered by our voice engine, and we never use it to identify an individual child or build a profile of them over time.


Voice technology built for kids

The vast majority of voice technologies on the market were trained on older teen and adult speech. Not SoapBox.

We developed our proprietary speech recognition system using the voice data of kids ages 2–12, recorded in real-life environments. Our acoustic models are trained on children’s voices from 193 countries, representing a broad spectrum of accents and dialects.

This makes our technology a uniquely powerful tool that can ensure younger children are protected in all of their digital experiences.

Want to learn more about kids’ digital privacy?

Watch this space for more articles on emerging metaverse technologies, experiences and related privacy issues. Read Voice Prints and Children’s Rights, our submission in conjunction with Dr. Veronica Barassi to the Office of the UN High Commissioner for Human Rights, or contact us directly to find out how our technology helps protect kids’ digital privacy.
