Facebook's apps used in more than half of online child sex crimes

Charles Hymas
BRAZIL - 2020/07/25: In this photo illustration the social media icons (Messenger, WhatsApp, Instagram and Facebook) seen displayed on a smartphone. (Photo Illustration by Rafael Henrique/SOPA Images/LightRocket via Getty Images) - Rafael Henriques/LightRocket via Getty Images

Facebook’s apps were used in more than half of online child sex crimes from October 2019 to October 2020, new NSPCC data reveals.

Police recorded more than 9,477 instances where social media or other communication platforms were used in offences involving child sex abuse imagery or online child sex offences.

Of these, 52 per cent were on Facebook-owned apps, with Instagram accounting for the biggest proportion, 50 per cent more than in the previous year.

A third of the total, or 3,212, were on Instagram, with a further 13 per cent, or 1,191, on Facebook or Facebook Messenger, and 568 on WhatsApp.

The remaining 4,464 were on other platforms.

The disclosure prompted fresh demands for Facebook to rethink plans to introduce end-to-end encryption on its platforms, which the NSPCC warned would prevent law enforcement agencies from investigating child sex abuse online.

Andy Burrows, NSPCC head of child safety online policy, said: “Facebook is willingly turning back the clock on children’s safety by pushing ahead with end-to-end encryption, despite repeated warnings that its apps will facilitate more serious abuse more often.

“This underlines exactly why Oliver Dowden must introduce a truly landmark Online Safety Bill that makes sure child protection is no longer a choice for tech firms and resets industry standards in favour of children.

“If legislation is going to deliver meaningful change it needs to be strengthened to decisively tackle abuse in private messaging, one of the biggest threats to children online.”

The NSPCC said end-to-end encryption should only be rolled out if and when platforms could demonstrate it would not compromise children’s safety.

It said the problem was highlighted by WhatsApp's encrypted messaging, which accounts for one in ten instances recorded by police in which Facebook's apps were involved in online child sexual abuse, according to the new data.

However, last year WhatsApp accounted for only 1.3 per cent of child abuse tip-offs from Facebook to the National Crime Agency (NCA), because the company cannot see the content of encrypted messages to report abuse.

Private messaging is a major source of risk as it is the most common avenue for abusers to contact children.

Last month, the Office for National Statistics revealed children are contacted via direct message in nearly three quarters of cases when they are approached by someone they don’t know online.

A Facebook spokesman said: “Child exploitation has no place on our platforms and we will continue to lead the industry in developing new ways to prevent, detect and respond to abuse. For example, last week we announced new safety features on Instagram including preventing adults from messaging under 18s who don't follow them.

“End-to-end encryption is already the leading security technology used by many services to keep people, including children, safe from having their private information hacked and stolen. Its full rollout on our messaging services is a long-term project and we are building strong safety measures into our plans."