'My daughter took her own life after seeing self-harm content online - and it's got even worse'


TikTok and Instagram have been accused of targeting teenagers with suicide and self-harm content - at a higher rate than two years ago.

The Molly Rose Foundation - set up by Ian Russell after his 14-year-old daughter took her own life after viewing harmful content on social media - commissioned research into hundreds of posts on the platforms, using accounts of a 15-year-old girl based in the UK.


The foundation claimed videos recommended by algorithms on the For You pages continued to feature a "tsunami" of clips containing "suicide, self-harm and intense depression" to under-16s who have previously engaged with similar material.

One in 10 of the harmful posts had been liked at least a million times. The average number of likes was 226,000, the researchers said.

Mr Russell told Sky News the results were "horrifying" and showed online safety laws are not fit for purpose.


Image: Molly Russell died in 2017. Pic: Molly Rose Foundation

'This is happening on PM's watch'

He said: "It is staggering that eight years after Molly's death, incredibly harmful suicide, self-harm, and depression content like she saw is still pervasive across social media.

"Ofcom's new child safety codes do not match the sheer scale of harm being suggested to vulnerable users and as yet do little to prevent more deaths like Molly's.

"The situation has got worse rather than better, despite the actions of governments and regulators and people like me. The report shows that if you strayed into the rabbit hole of harmful suicide and self-injury content, it's almost inescapable.

"For over a year, this wholly preventable harm has been happening on the prime minister's watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay."


Image: Ian Russell says children are viewing 'industrial levels' of self-harm content

After Molly's death in 2017, a coroner ruled she had been suffering from depression, and the material she had viewed online contributed to her death "in a more than minimal way".

Researchers at Bright Data looked at 300 Instagram Reels and 242 TikToks to determine if they "promoted and glorified suicide and self-harm", referenced ideation or methods, or "themes of intense hopelessness, misery, and despair".

They were gathered between November 2024 and March 2025, before new children's codes for tech companies under the Online Safety Act came into force in July.


What are the new online rules?

Instagram

The Molly Rose Foundation claimed Instagram "continues to algorithmically recommend appallingly high volumes of harmful material".

The researchers said 97% of the videos recommended on Instagram Reels to the account of a teenage girl, who had previously looked at this content, were judged to be harmful.

Some 44% actively referenced suicide and self-harm, they said. They also claimed harmful content was sent in emails containing recommended content for users.

A spokesperson for Meta, which owns Instagram, said: "We disagree with the assertions of this report and the limited methodology behind it.

"Tens of millions of teens are now in Instagram Teen Accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.

"We continue to use automated technology to remove content encouraging suicide and self-injury, with 99% proactively actioned before being reported to us. We developed Teen Accounts to help protect teens online and continue to work tirelessly to do just that."

TikTok

TikTok was accused of recommending "an almost uninterrupted supply of harmful material", with 96% of the videos judged to be harmful, the report said.

Over half (55%) of the For You posts were found to be suicide and self-harm related, with a single search yielding posts promoting suicide behaviours, unsafe stunts and challenges, it was claimed.

The number of problematic hashtags had increased since 2023, with many shared on highly-followed accounts which compiled 'playlists' of harmful content, the report alleged.

A TikTok spokesperson said: "Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through Family Pairing.

"With over 99% of violative content proactively removed by TikTok, the findings don't reflect the real experience of people on our platform, which the report admits."

According to TikTok, it does not allow content showing or promoting suicide and self-harm, and says banned hashtags lead users to support helplines.

Read more:
Backlash against new online safety rules
Musk's X wants 'significant' changes to OSA


Why do people want to repeal the Online Safety Act?

'A brutal reality'

Both platforms allow young users to give negative feedback on harmful content recommended to them. But the researchers found they can also give positive feedback on this content and be sent it for the next 30 days.

Technology Secretary Peter Kyle said: "These figures show a brutal reality - for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.

"But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. 45 sites are already under investigation."

An Ofcom spokesperson said: "Since this research was carried out, our new measures to protect children online have come into force.

"These will make a meaningful difference to children - helping to prevent exposure to the most harmful content, including suicide and self-harm material. And for the first time, services will be required by law to tame toxic algorithms.

"Tech firms that don't comply with the protection measures set out in our codes can expect enforcement action."


Image: Peter Kyle has said opponents of the Online Safety Act are on the side of predators. Pic: PA

'A snapshot of rock bottom'

A separate report out today from the Children's Commissioner found the proportion of children who have seen pornography online has risen in the past two years - also driven by algorithms.

Rachel de Souza described the content young people are seeing as "violent, extreme and degrading", and often illegal, and said her office's findings must be seen as a "snapshot of what rock bottom looks like".

More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape - specifically involving someone who was asleep.

The survey of 1,020 people aged between 16 and 21 found that they were on average aged 13 when they first saw pornography. More than a quarter (27%) said they were 11, and some reported being six or younger.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
