A human rights group claims TikTok recommends pornography and sexualised videos to children. Researchers created fake child accounts, enabled safety settings, and were still shown explicit search prompts that led to clips of simulated masturbation and pornographic sex. TikTok says it acted immediately after being alerted and insists it remains committed to safe, age-appropriate experiences for young users.
Fake child accounts reveal risky content
In July and August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds with false birth dates; the platform did not request any additional age verification. Investigators enabled TikTok’s “restricted mode”, which the company markets as a filter against sexual or mature material. Despite this, the accounts received sexualised search suggestions in the “you may like” section, which led to videos of women flashing underwear, exposing breasts, and simulating masturbation. At the most extreme, explicit pornography was embedded in otherwise ordinary-looking clips to evade moderation.
Campaign group issues warning
Ava Lee from Global Witness described the findings as a “huge shock”, saying TikTok not only fails to protect children but actively recommends harmful content to them. Global Witness usually investigates how technology affects democracy, human rights, and climate change; it first came across explicit material on TikTok during unrelated research in April.
TikTok defends safety measures
Global Witness reported its findings to TikTok earlier this year, and the company said it removed the flagged content and implemented fixes. But when the group repeated the test in late July, sexualised videos appeared again. TikTok says it offers more than 50 safety features for teenagers and claims that nine out of ten violating clips are removed before anyone views them. After the latest report, the company said it upgraded its search tools and removed additional harmful content.
Children’s Codes heighten platform responsibility
On 25 July, the Children’s Codes under the Online Safety Act came into force. Platforms must now enforce strict age verification and prevent children from accessing pornography, and their algorithms must block content linked to self-harm, suicide, or eating disorders. Global Witness conducted its second study after the codes took effect. Ava Lee urged regulators to act, stressing that protecting children online is now a legal duty that must be enforced.
Users question sexualised recommendations
During the investigation, researchers observed reactions from TikTok users. Some expressed confusion over sexualised search prompts. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”