Tick tock goes the child data clock: The result of not playing nice
With the growing use of technology and social media, we have a duty to advocate for our children and protect their safety and privacy. Are the tech giants doing enough to ensure this? We are not aiming to deny children access to technology and social media platforms, but rather to control and manage that access so that children can use these platforms safely and to their advantage. We challenge organisations to step up the way they process children’s personal data.
The mainstream media and law-making authorities worldwide have placed social media platforms like TikTok and Facebook under the spotlight, examining how they process children’s personal data. There have also been several recent legal developments concerning children’s rights and the processing of their personal information on these platforms, including new Bills and enforcement action taken by supervisory authorities.
Recent developments
America – Social Media Safety Act
In April 2023, the Governor of Arkansas signed a Bill that requires social media companies to engage third-party service providers to conduct age-verification screenings on new users. The Bill aims to give parents control over their children’s use of apps that feature short entertainment videos and gaming-focused social apps. A key requirement of the Act is ID verification: before a child can set up a social media account, an ID must be produced. The Act also requires parental consent before a child can set up such an account, so even if a child under 13 years old uses a fake ID, an account cannot lawfully be created without a parent’s consent. This new legislation takes effect in September 2023.
Meta is misbehaving
In early May 2023, the Federal Trade Commission (‘FTC’) proposed changes to its privacy order with Facebook (now Meta), including a complete ban preventing Meta from monetising the children’s personal data it processes. The FTC put the proposal forward in response to concerns that Facebook had failed to keep its privacy promise that children using the Messenger Kids app would only be able to communicate with contacts their parents had pre-approved. The FTC’s investigations revealed that, in certain instances, children were able to communicate with unapproved contacts (potential strangers) in group text chats and group video calls.
The FTC’s proposed Order would also apply to Meta’s other services, such as Instagram and WhatsApp. This means that Meta could no longer collect and use children’s data for commercial gain, even after a child turns 18. Furthermore, Meta would not be able to release new or modified products or services without written confirmation from a third-party assessor that its privacy programme complies with the Order and contains no “material gaps or weaknesses”.
TikTok is ticking many people off
TikTok is a very popular social media app, especially among children, but it has lately faced scrutiny over how it protects children’s data. Data protection authorities, such as the UK’s Information Commissioner’s Office (‘ICO’), have fined TikTok for failing to comply with data protection laws governing the processing of children’s personal data.
United Kingdom
In April 2023, the ICO fined TikTok £12.7 million. The ICO investigated breaches of data protection law committed between May 2018 and July 2020 and found that TikTok had collected the personal information of approximately 1.4 million children, including names, email addresses and geolocation data, without parental consent. The data may therefore have been used to monitor and track children, potentially exposing them to harmful and age-inappropriate material.
America
In February 2019, the FTC fined TikTok $5.7 million for violating the Children’s Online Privacy Protection Act (‘COPPA’). The FTC found that TikTok had illegally collected children’s names, email addresses and geolocation data without parental consent, and that it had knowingly collected personal information from children under the age of 13, despite its own rules prohibiting children under 13 from creating an account.
TikTok’s CEO was grilled on Capitol Hill in March 2023, around the same time that several US states banned the use of TikTok on all government devices. The State of Arkansas also announced in March that it had filed lawsuits against TikTok and Meta for failing to secure children’s personal data when processing it and for misleading consumers.
TikTok has since implemented new policies and become more transparent about its data collection practices. It now requires users to verify their age when creating an account and has made it easier for parents to manage their child’s account. We can only hope that it has learnt its lesson and follows its own policies and terms this time around.
Montana
On 18 May 2023, Montana’s Governor signed an Act banning TikTok from operating within Montana’s territorial jurisdiction from 1 January 2024. The ban covers both TikTok’s own operations in the state and the availability of the TikTok mobile app for download on any mobile app store.
Regulators are taking a stand against tech giants that do not take the necessary steps to protect children’s privacy when processing their personal data.
What are the solutions?
Many companies provide age-verification services in the form of video or selfie identification. The technology uses facial analysis to estimate a person’s age without recognising the individual: it merely identifies patterns, so it does not need to cross-check against vast databases of faces. The AI is trained to estimate a person’s age while accounting for factors such as skin tone and gender.
Companies such as Meta have put three age-verification methods to the test: uploading identity documents, recording video selfies, and asking mutual friends to confirm a child’s age. Meta has also partnered with a company that specialises in online age verification, to help ensure that users’ privacy is maintained. By implementing stringent security protocols, data encryption and age-verification mechanisms, platforms can create a safer online environment for children and for the processing of their personal data.
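To make the gating logic concrete, here is a minimal Python sketch of how a platform might combine a verified age with a recorded parental consent flag before allowing an account to be created. It is purely illustrative, not any platform’s actual implementation: the function names, thresholds and structure are our own, and it assumes the age has already been established by one of the upstream verification methods described above (an ID check or facial age estimation) rather than self-declaration.

```python
from datetime import date

# Illustrative thresholds only: COPPA treats under-13s as children, and the
# Arkansas Act requires parental consent before a minor opens an account.
CHILD_AGE = 13
ADULT_AGE = 18

def age_on(birthdate: date, today: date) -> int:
    """Whole-year age on a given date, computed from a verified date of birth."""
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    return today.year - birthdate.year - (0 if had_birthday else 1)

def may_create_account(verified_age: int, parental_consent: bool) -> bool:
    """Gate account creation on a verified age and recorded parental consent.

    This sketch models only the compliance decision, not the verification
    step itself, which is assumed to have happened upstream.
    """
    if verified_age >= ADULT_AGE:
        return True  # adults are not subject to the consent gate
    if verified_age < CHILD_AGE:
        # COPPA-style rule: no collection from under-13s without consent
        return parental_consent
    # Arkansas-style rule: older minors also need parental consent
    return parental_consent

if __name__ == "__main__":
    age = age_on(date(2012, 6, 15), date(2023, 9, 1))        # an 11-year-old
    print(may_create_account(age, parental_consent=False))   # False: no consent on file
    print(may_create_account(age, parental_consent=True))    # True: consent recorded
```

The value of separating the two steps is that the verification method (ID document, video selfie, vouching) can be swapped out or strengthened over time without touching the compliance decision itself.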
In doing so, organisations not only protect the rights of their users but also strengthen their credibility and ensure sustainable growth in this digital age.
The way forward
Technology continues to evolve, and protecting children’s data and privacy should therefore remain a top priority for organisations. We have endless resources at our disposal; they merely need to be assessed and implemented. At the end of the day, it is up to us to remain vigilant about protecting our children, to scrutinise how their personal data is processed, and to advocate for responsible and lawful collection and usage practices, so as to create a safe and responsible digital landscape.
Actions you can take
As parents, caregivers and organisations processing children’s personal data, we must remain committed to creating and maintaining ethical, age-appropriate technical environments and practices from which our children can learn, play and work. Privacy by design is key. Everyone has a social and ethical responsibility to protect children. We need to skill ourselves in creating safer digital environments for our children, and we need to apply pressure and speak up when we notice organisations acting unlawfully, or when we feel they are not doing enough to protect children from the harm that unlawful technology and social media use can pose to their safety and privacy.
If you would like to know more about this topic or require the help of legal experts, please reach out to us.
Michelle Jonker and Minette Esterhuizen are the authors of this post.