Children as young as three ‘tricked into producing online sexual images’

Child on smartphone

Children as young as three are being tricked into producing online sexual images of themselves, a report has found, as the Security Minister warns “no child is safe if unsupervised online”.

Three- to six-year-olds are being manipulated by “opportunistic” predators who strike while the children are online on phones and devices often used within the family home, according to the research, the first of its kind, by the Internet Watch Foundation (IWF).

The findings will fuel demands for a Government ban on the sale of smartphones to children under the age of 16 after research by the regulator Ofcom found nearly a quarter of five- to seven-year-olds now have their own smartphone.

The IWF, a charity whose experts work to identify and remove child abuse content, has uncovered thousands of images self-generated by children that are now being found on the open internet. In 2023, it found 275,655 webpages containing child sexual abuse.

In an exclusive article for The Telegraph, Tom Tugendhat, the Security Minister, said the findings “proved two things beyond doubt: more needs to be done to protect children online, and no child is safe if unsupervised online”.

Mr Tugendhat said the Government would take further, tougher action if tech firms failed to remove sexual abuse images and did not prevent children from accessing harmful content on their sites – as they will be required to do under the Online Safety Act.

“Tech companies should not wait for Ofcom fines before they protect their users. It’s not acceptable for tech executives to make vast profits from their youngest users while failing to protect them. They have the expertise and the resources. Now they need to step up and take action,” he said.

“If tech companies do not do more to stop this activity, the Government will go further. We will not stand by and let any company – no matter how big and powerful – put children at risk.”

Tom Tugendhat says more needs to be done to protect children online - ISABEL INFANTES/AFP via Getty Images

Ministers are currently considering whether children under 16 could be banned from buying mobile phones in an attempt to protect young people from the harmful effects of social media.

Data from Ofcom last week showed that around a third of children aged between five and seven used social media without parental supervision.

Miriam Cates, co-chair of the New Conservative group of MPs, said: “We have rigorous safeguarding checks for anyone who wants to work in a school and yet smartphones allow predators and paedophiles directly into the bedrooms of our very youngest children.

“It’s not good enough to pretend that six-year-olds just need to be ‘better educated’ as if they are somehow responsible for allowing themselves to be abused. Smartphones are not safe for children and the Government needs to act swiftly.”

The IWF said opportunistic internet predators were directing children remotely and often recording them without their knowledge before sharing the footage on dedicated child sexual abuse websites.

The online safety organisation said it showed the need for more protections online and that platforms needed to act immediately rather than waiting for new regulations, such as the Online Safety Act, to take effect. In its annual report, the IWF said it had discovered more Category A child sexual abuse material online than ever before. This material contains the most severe kinds of sexual abuse.

In 2022, it took action to remove 51,369 webpages containing Category A child sexual abuse material, double the number from 2020.

Proportionally, Category A material now accounts for 20 per cent of all the content the IWF sees – up from 18 per cent in 2021, and 17 per cent in 2020.

Susie Hargreaves, chief executive of the IWF, said: “The opportunistic criminals who want to manipulate your children into disturbing acts of sexual abuse are not a distant threat – they are trying to talk to them now on phones and devices you can find in any family home.

“If children under six are being targeted like this, we need to be having age-appropriate conversations, now, to make sure they know how to spot the dangers. A whole-society approach is needed.”


All adults must do our part to protect children

By Tom Tugendhat

Child sexual abuse is horrific. The thought of someone violating a child makes the blood of any parent – indeed any decent person – run cold. As Security Minister I am full of admiration and gratitude for those heroic professionals who fight predators, save lives, and help victims in the aftermath of abuse.

All adults must do our part to protect children. The Independent Inquiry into Child Sexual Abuse, which published its final report in 2022, documented decades of horrendous harm visited upon a great many children throughout England and Wales.

It also revealed that public bodies and organisations had routinely failed to protect them. That only happens when individuals do not stand up and do the right thing. Each of us must do all we can to avoid repeating such failures.

And we face a moment of huge significance right now. The online world – so often a force for good – also gives predators opportunities to facilitate, organise, and conduct child sexual abuse and exploitation. People are often surprised to learn that most of this does not happen on the Dark Web.

Most of it happens on apps, video-conferencing technology, gaming machines, and other platforms which you likely use every day. This is happening on a heart-breaking, industrial scale – and it is getting worse.

The Internet Watch Foundation, based in Cambridge, is a global charity dedicated to reducing the availability of child sexual abuse images and videos. The report it has just published is devastating. What it calls the “most extreme year on record” has seen the widespread abuse of children – including pre-school children – in their own bedrooms by predators. From the supposed safety of their own homes, they are groomed or coerced into sexual activity via webcam.

Overall in 2023, the IWF discovered 275,655 webpages containing child sexual abuse. Every page can contain thousands of images or videos. There was a 22 per cent increase in Category A material, the most extreme kind. It proves two things beyond doubt: more needs to be done to protect children online, and no child is safe if unsupervised online.

The Government has prioritised this issue, including through the Online Safety Act. Significant parts of the Act are in force, with cyber-flashers already being put behind bars. Once fully in place, Ofcom will have the power to require platforms – including those employing end-to-end encryption – to put in place systems and processes designed to prevent child abuse material occurring on their platforms.

Companies will also need to remove illegal material and prevent children from accessing harmful content on their sites. If they fail in their duties, they could face fines running into the billions.

But the Government cannot do this alone. Tech companies need to take responsibility too and they should not wait for Ofcom fines before they protect their users. It’s not acceptable for tech executives to make vast profits from their youngest users while failing to protect them. They have the expertise and the resources. Now they need to step up and take action. It doesn’t matter that most activity on popular platforms is normal and decent.

The fact remains that there is a colossal amount of child sexual abuse online and that if companies turn a blind eye to this, then the good work they do will be fatally undermined. Nobody can be considered a heroic tech pioneer if they don’t do enough to stop paedophiles from preying on children on their platforms. And some things matter more than profits.

If tech companies do not do more to stop this activity, the Government will go further. We will not stand by and let any company – no matter how big and powerful – put children at risk. Ideally, though, we want to work with big tech – and the UK will continue to corral other governments so that we can find solutions to what is an international problem. The US, Canada, Australia, and New Zealand, for example, are all resolved that tech companies must do better.

The UK Government has already funded the development of tools to detect child sexual abuse in end-to-end encrypted environments. The experts are clear: it is perfectly possible to take the fight to predators while ensuring privacy for the rest of us. All of us need to send a clear message to the tech companies who depend upon our patronage about what we expect from them. And that starts with protecting our children from terrible harm.