The UK’s data protection regulator has launched an investigation into TikTok’s handling of teenagers’ personal data and how that data is used to shape content recommendations on the platform.
The Information Commissioner’s Office (ICO) expressed concerns about social media platforms using data generated by children’s online activity to power recommendation algorithms. The watchdog is particularly worried about the potential exposure of young users to harmful or inappropriate content.
John Edwards, the Information Commissioner, stressed the importance of having robust safety procedures in place for teenagers aged 13 to 17.
“It’s what they’re collecting, it’s how they work,” he stated. “I will expect to find that there will be many benign and positive uses of children’s data in their recommender systems. What I am concerned about is whether they are sufficiently robust to prevent children being exposed to harm, either from addictive practices on the device or the platform, or from content that they see, or from other unhealthy practices.”
As part of the inquiry, the ICO will also assess how Reddit and Imgur process children’s personal data and verify users’ ages.
TikTok, owned by Chinese technology giant ByteDance, defended its policies in a statement, asserting its dedication to child safety.
“Our recommender systems are designed and operate under strict and comprehensive measures that protect the privacy and safety of teens, including industry-leading safety features and robust restrictions on the content allowed in teens’ feeds,” the company said.
This latest scrutiny follows a £12.7 million fine imposed on TikTok in 2023 for mishandling children’s data and breaching safeguards for young users’ privacy. The ICO found that TikTok had not done enough to identify and remove underage users, allowing up to 1.4 million children under 13 in the UK to access the platform in 2020, despite its rules prohibiting them from having accounts.
The ICO’s ongoing investigation aims to determine whether TikTok and other platforms are taking sufficient steps to protect young users from potential risks associated with their recommender algorithms and data collection practices.