
More children seeing violent and degrading pornography online, says commissioner

19/08/2025


The proportion of children saying they have seen pornography online has risen in the past two years, according to a report which also found most are likely to have stumbled upon it accidentally.

Children’s Commissioner Dame Rachel de Souza said her research is evidence that harmful content is being presented to children through dangerous algorithms, rather than them seeking it out.

She described the content young people are seeing as “violent, extreme and degrading” and often illegal, and said her office’s findings must be seen as a “snapshot of what rock bottom looks like”.

More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape, specifically of someone who was asleep.

Made up of responses from 1,020 people aged between 16 and 21 years old, the report also found that while children were on average aged 13 when they first saw pornography, more than a quarter (27%) said they were 11, and some reported being aged “six or younger”.

The research suggested four in 10 respondents felt girls can be “persuaded” to have sex even if they say no at first, and that young people who had watched pornography were more likely to think this way.

The report, a follow-on from research by the Children’s Commissioner’s office in 2023, found a higher proportion (70%) of people saying they had seen online pornography before turning 18, up from 64% of respondents two years ago.

Boys (73%) were more likely than girls (65%) to report seeing online pornography.

A majority (59%) of children and young people said they had seen pornography online by accident – a rise from 38% in 2023.

The X platform, formerly Twitter, remained the most common source of pornography for children, with 45% saying they had seen it there compared with 35% seeing it on dedicated pornography sites – a gap which has widened in the past two years.

Dame Rachel said: “This report must act as a line in the sand. The findings set out the extent to which the technology industry will need to change for their platforms to ever keep children safe.

“Take, for example, the vast number of children seeing pornography by accident. This tells us how much of the problem is about the design of platforms, algorithms and recommendation systems that put harmful content in front of children who never sought it out.”

The research was done in May, ahead of new online safety measures coming into effect last month including age checks to prevent children accessing pornography and other harmful content.

Dame Rachel said the measures “provide a real opportunity to make children’s safety online a non-negotiable priority for everyone: policymakers, big tech giants and smaller tech developers”.

Some 44% of respondents agreed with the statement “girls may say no at first but then can be persuaded to have sex”, while a third (33%) agreed with the statement “some girls are teases and pretend they don’t want sex when they really do”.

For each statement, young people who had seen pornography were more likely to agree.

The commissioner’s report comes as a separate piece of research suggested dangerous online algorithms were continuing to recommend suicide, self-harm and depression content to young people “at scale” just weeks before the new online safety measures came into effect.

The Molly Rose Foundation – set up by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – analysed content on Instagram and TikTok from November until June this year on accounts registered as a 15-year-old girl based in the UK.

The charity said its research found that, on teenage accounts which had engaged with suicide, self-harm and depression posts, algorithms continued to “bombard young people with a tsunami of harmful content on Instagram Reels and TikTok’s For You page”.

Mr Russell, the foundation’s chairman, said: “It is staggering that, eight years after Molly’s death, incredibly harmful suicide, self-harm and depression content like she saw is still pervasive across social media.”

The foundation has previously been critical of the regulator Ofcom’s child safety codes for not being strong enough and said its research showed they “do not match the sheer scale of harm being suggested to vulnerable users and ultimately do little to prevent more deaths like Molly’s”.

Mr Russell added: “For over a year, this entirely preventable harm has been happening on the Prime Minister’s watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”

TikTok and Meta, which owns Instagram, have been contacted for comment.

Published by Radio News Hub

