TikTok’s ‘For You’ feed exposes Irish teens to harmful content, study finds


Darragh Mc Donagh

Passive scrolling on the social media platform TikTok is exposing children to harmful content, including hateful ideologies, depictions of suicide, and disordered eating, a new study has found.

Researchers at Trinity College Dublin and the HSE’s Department of Public Health found that almost one in 20 clips suggested for teenagers on the platform’s “For You” feed violated TikTok’s own safety policies.

They said there was an “urgent need” for stronger regulatory enforcement and digital health education to protect children from exposure to harmful content, finding that voluntary compliance was proving insufficient.

The findings were based on an experiment in which the researchers created four “dummy” TikTok accounts representing two male and two female users aged between 13 and 15.

Each account scrolled the “For You” feed for three hours, pausing to view content relating to one of four popular themes: conflict, mental health, drugs and alcohol, and diet and body image.

The scrolling was passive as none of the users conducted any searches, followed any accounts, commented on any videos, or ‘liked’ any of the content that was recommended for them.

A total of 12 hours of screen recordings were reviewed, comprising 3,023 videos from the “For You” feed. The study found that 128 of these (4.2 per cent) violated TikTok’s published safety policies.

The most frequent types of harmful content suggested to the four users included depictions or glamorisation of suicide and self-harm, disordered eating or extreme dieting, and the promotion of alcohol consumption.

The recommendations also featured content relating to weapons and gun violence, and to extremist or hateful ideologies, according to the study, published in the latest edition of the Irish Journal of Medical Science.

“These reels appeared without any search conducted by the users – the platform’s algorithm promoted potentially harmful content based solely on passive viewing time within themed content areas,” the authors explained.

“This study provides evidence that TikTok’s algorithm exposes underage users to harmful and policy-violating content, even without active engagement.”

The researchers noted that this phenomenon was not unique to TikTok, stating that similar algorithmic risks had been reported across other major social media platforms.

“While these platforms offer entertainment and peer connection, growing concerns exist about children’s exposure to harmful content,” they added.

The authors said problematic use of social media had been linked to negative mental health outcomes, including anxiety, depression and self-harm.

“Ireland, as the regulatory base for several major technology companies, is uniquely positioned to lead on protecting children’s digital wellbeing,” they wrote, claiming that the study had shown voluntary compliance to be insufficient.

The study found that TikTok’s recommendation system could expose underage users to harmful content even without active engagement, underscoring the urgent need for stronger regulatory enforcement.

“Protecting young people online will require both structural accountability from technology platforms and coordinated public health strategies that address adolescents’ developmental susceptibility to emotionally charged and addictive content.”
