TikTok Reportedly Directs Child Accounts to Pornographic Content in Just a Few Taps
A new investigation has found that TikTok steers the accounts of minors towards explicit material within just a few clicks.
How the Study Was Conducted
A campaign group set up fake profiles using the birthdate of a 13-year-old and switched on the app's "restricted mode", which is designed to limit exposure to sexually suggestive content.
The researchers found that TikTok suggested sexualised search terms to the fake accounts, even though they were created on new devices with no previous activity.
Troubling Search Prompts
The terms recommended under the "suggested searches" feature included "extremely revealing clothing" and "inappropriate female imagery", later escalating to terms such as "explicit adult videos".
For three of the accounts, the inappropriate search terms were suggested immediately.
Quick Path to Pornography
After a "small number of clicks", the researchers encountered pornographic videos, including graphic sexual acts.
Global Witness said the content attempted to evade moderation, typically by embedding the clip within an otherwise innocuous picture or video.
For one account, the process took just two taps after logging in: one on the search bar and one on the suggested search term.
Compliance Requirements
Global Witness, a campaign group that examines big tech's impact on society, said it carried out multiple rounds of testing.
Initial tests took place before child protection rules under the UK's Online Safety Act came into force on 25 July, with further tests conducted after the rules took effect.
Concerning Discoveries
Researchers said several of the clips appeared to feature a minor and had been reported to the Internet Watch Foundation, which works to remove child sexual abuse material from the internet.
The campaign group said the app was in breach of the Online Safety Act, which requires social media firms to prevent children from encountering harmful material such as pornography.
Official Reaction
A spokesperson for Ofcom, the UK media regulator responsible for enforcing the act, said: "We acknowledge the effort behind this study and will analyze its findings."
Guidance on complying with the legislation says online services that pose a significant risk of showing harmful content must "modify their programming" to keep harmful content out of children's feeds.
The app's policies ban pornographic content.
TikTok's Statement
TikTok said that after being alerted by the research group, it removed the offending content and made changes to its search recommendations.
"Upon learning of these claims, we acted promptly to investigate them, delete material that contravened our rules, and implement enhancements to our search suggestion feature," a spokesperson said.