December 5, 2025

TikTok’s Algorithm Under Fire: How a 13-Year-Old’s Search Became a Shortcut to Porn

LONDON — A damning new report has revealed that TikTok’s own search algorithm is steering underage users toward pornographic and highly sexualized content — even when parental controls are switched on.

The investigation, conducted by the UK-based nonprofit Global Witness, found that accounts registered as belonging to 13-year-olds were offered explicit search suggestions immediately after sign-up. Even more troubling, the tests were performed on factory-reset phones with no prior search or viewing history — meaning the results reflected what the platform itself was generating.

Shocking Search Suggestions

Researchers discovered that just clicking inside TikTok’s search bar prompted suggestions like “very rude babes,” “very very rude skimpy outfits,” and “hardcore pawn [sic] clips.” In multiple tests, selecting one of these suggestions led to sexually explicit videos within two clicks — including simulated sexual acts and revealing imagery.

Possible Breach of UK Law

The findings could place TikTok in direct violation of the UK’s Online Safety Act, whose child-safety duties came into force in July 2025. The legislation requires tech platforms to ensure minors are shielded from harmful content, including pornography and violent material. Legal experts quoted in the report said TikTok’s failure to prevent such exposure may constitute a serious breach of those obligations.
