March 29, 2022

Are We Ready for the Dangers of Smart Toys?

Carsten Rhod Gregersen


Forget teddy bears and plush animals: the toys of today are seriously hi-tech. Dolls with facial recognition and robots that learn their owner’s name and behavior are not uncommon, yet parents and tech commentators alike wonder about the dangers of this evolution.

The truth is that these “toys” are more like Internet of Things (IoT) devices, complete with sensors, cameras, and companion apps. This raises the question: are we ready for the dangers of smart toys?

The Rise of Smart Toys

Toys have certainly come a long way from when I was a boy. Today, toys can provide a personalized play experience with embedded software that offers speech and image recognition, app integration, RFID functionality, and web searching functions. Moreover, this segment of the toy market is growing almost 30% every year.

Hello Barbie, for example, works by recording and processing users’ voices. Pressing a button on her belt prompts the toy to ask a question, and then record the response with an embedded microphone. Voice-recognition software in the cloud saves and decodes the content, and then uses it to formulate an appropriate response from Barbie.

Hello Barbie transcribed and collected its owners’ conversations before being discontinued by Mattel

As the toy “converses” with the child over time, it “learns” their name, interests and conversational habits, which Mattel says is intended to improve the quality of responses. The fine print of the toy, however, notes that the company may use, transcribe, and store such recordings and that they can be shared with third parties.
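The record–decode–respond loop described above can be sketched in a few lines. This is a purely illustrative stand-in, not Mattel’s or ToyTalk’s actual system: the topic list, responses, and the stubbed transcription step are all hypothetical, since the real cloud service’s API is not public.

```python
import random

# Hypothetical canned responses keyed by detected topic -- illustrative only.
RESPONSES = {
    "dog": ["I love dogs! What's your dog's name?"],
    "school": ["School sounds fun. What's your favorite subject?"],
}
FALLBACK = ["Tell me more!"]


def transcribe(audio: bytes) -> str:
    """Stand-in for the cloud speech-to-text step (stubbed for illustration)."""
    return audio.decode("utf-8")  # pretend the recorded 'audio' is already text


def pick_response(transcript: str, profile: dict) -> str:
    """Choose a reply and update the toy's stored profile of the child."""
    # This stored profile is exactly the kind of data a vendor may retain.
    profile.setdefault("utterances", []).append(transcript)
    for topic, replies in RESPONSES.items():
        if topic in transcript.lower():
            profile.setdefault("interests", set()).add(topic)
            return random.choice(replies)
    return random.choice(FALLBACK)


profile = {}
reply = pick_response(transcribe(b"I walked my dog today"), profile)
```

Even this toy version makes the privacy point concrete: every utterance and inferred interest accumulates in a profile that lives outside the child’s control.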

Interestingly, this doll is now discontinued, and the maker maintains that no collected data can or will be used for marketing, advertising, or publicity purposes. However, permissive data collection policies appear to be the status quo for smart toys. Broad permissions like this raise important questions: How is this data stored? Can consumers opt out of sharing information with third parties? What protections are in place to prevent this information from falling into the wrong hands?

This last question gives pause. After all, compromised toys with sensors and cameras inside the home could essentially snoop without the owners even realizing it.

Data and Digital Footprints

Beyond the hackability of these devices, however, is the impact that smart toys could have on the way children interact with the world.

The ability to personalize the play experience of Hello Barbie, for example, is intended to deepen the bond between child and toy. In this sense, I wonder about “emotion AI” – which gives a computer the ability to interpret and respond to our emotions by analyzing our language, gestures and facial expressions – and how this could be used in smart toys. Just think how effective basic Tamagotchi digital pets were back in the late 1990s at making their child owners feel obliged to “feed” them. Add AI into the mix and the emotional pull between child and toy could be that much greater.

And then there is the fact that these toys are essentially creating a digital footprint for our children. Location, age, gender, language: these are core metrics that advertisers leverage for messaging. Theoretically, smart toys could also gather this information on our children.

In Australia, 84% of parents believe children should have the right to grow up without being profiled and targeted, and 70% are uncomfortable with businesses tracking the location of a child without permission. Similarly, more than two-thirds are uncomfortable with businesses obtaining personal information about a child and selling it to third parties. The rise of smart toys, however, essentially creates digital footprints from the earliest of ages.

Playing It Safe

From bad actors infiltrating your home to private companies selling your child’s personal information, there are evident dangers in smart toys. While the long-term effects of allowing these toys into the home remain to be seen, there are precautionary steps that parents can take to protect their privacy.

Toys today function like IoT devices (Prostock-studio/Shutterstock)

For households with connected toys, particularly those with AI capabilities, parents should take basic safety steps: limit time on the devices, monitor their child’s experience with the toy, ensure all software patches are applied, and talk to their child about internet safety.

I’d also argue that toy developers who collect sensitive information about our children must be held accountable for its use and misuse. One way for developers to meet IoT security and privacy challenges is to encrypt data in transit and at rest and to stay current with the latest attack methods. This will not only help them guard against data breaches but also build consumer confidence in these toys.
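Alongside encryption (which in practice would use a vetted crypto library), a related safeguard a developer might apply is pseudonymization: replacing a child’s raw identifier with a keyed hash before anything is stored or uploaded. The sketch below uses only the Python standard library; the identifier, record shape, and per-device key scheme are illustrative assumptions, not any vendor’s actual design.

```python
import hashlib
import hmac
import secrets

# Illustrative sketch: pseudonymize a child's identifier before storage, so a
# leaked record can't be tied back to the child without the device-held key.
# This complements -- and does not replace -- encrypting data in transit and at rest.


def pseudonymize(child_id: str, device_key: bytes) -> str:
    """Replace a raw identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(device_key, child_id.encode("utf-8"), hashlib.sha256).hexdigest()


device_key = secrets.token_bytes(32)  # per-device secret, never uploaded

record = {
    "user": pseudonymize("emma_age_7", device_key),  # hypothetical raw ID, stored hashed
    "transcript": "I walked my dog today",
}
```

The design choice here is that the key stays on the device: the vendor’s servers can correlate a child’s own sessions but cannot recover the name from a breached database, because HMAC cannot be reversed without the key.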

As adults, we have a clear duty of care to protect children and childhood, so any use of data about children and systems that judge and interact in new ways must be entered into with utmost care. This isn’t to say that new forms of entertainment, pleasure and education are off-limits, just that as technologists, educators, business leaders, policymakers and engaged citizens, we need to get it right.

About the author: Carsten Rhod Gregersen is the CEO and founder of Nabto, a company providing a peer-to-peer (P2P) platform for IoT devices. Carsten has almost two decades of experience leading software and innovation companies, with an aim to create technology that makes the world a better place, one line of code at a time.

Related Items:

Security, Privacy, and Governance at the Data Crossroads in ‘22

To Bridge the AI Ethics Gap, We Must First Acknowledge It’s There

Governance, Privacy, and Ethics at the Forefront of Data in 2021


Datanami