NSPCC calls for action on the risks of end-to-end encryption as Facebook’s apps were used to abuse children


Facebook’s apps were used in more than half of recorded online child sex crimes, new NSPCC data reveals, as the charity calls on the Government to deliver meaningful change in the Online Safety Bill to tackle the biggest threat to children online.

In a single year, police across the UK recorded more than 9,470 instances of child sexual abuse image and online child sex offences where the means of communication was known – 52% of which took place on Facebook-owned apps.

In the 12 months between October 2019 and October 2020, police forces in the East of England (not including Thames Valley) recorded 1,075 online child sex crimes where the method of communication was known – 48% of these involved Facebook-owned apps. (A full force-by-force breakdown is available in the notes to editors.)

This data has been obtained from 34 police forces in England, Wales and the Channel Islands by the NSPCC via Freedom of Information requests.

The charity fears many of these offences could go unreported if Facebook proceeds with end-to-end encryption without the necessary safeguards in place.

The charity urges the Government to give Ofcom the power to take early and meaningful action against firms whose dangerous design choices put children at risk.

It is crucial that private messaging is in scope of the legislation, but the charity argues the current plans for regulating it, released by the Government in December, need to be strengthened.

End-to-end encryption offers a range of benefits, but child protection experts, law enforcement agencies worldwide and Facebook itself have said it will hinder the ability to identify and disrupt child abuse on the company’s services.

The NSPCC has repeatedly demanded that it should only be rolled out if and when platforms can demonstrate it won’t compromise children’s safety.

The issue is brought into sharp focus by WhatsApp, which accounts for one in ten instances recorded by police where Facebook’s apps were involved in online child sexual abuse, according to the new data.

But last year WhatsApp made up just 1.3% of child abuse tip-offs from Facebook to the NCA, because the platform cannot see the content of messages to report abuse.

Private messaging is a major source of risk as it is the most common avenue for abusers to contact children.

Last month, the Office for National Statistics revealed children are contacted via direct message in nearly three quarters of cases when they are approached by someone they don’t know online.

Andy Burrows, NSPCC Head of Child Safety Online Policy, said: “Facebook is willingly turning back the clock on children’s safety by pushing ahead with end-to-end encryption despite repeated warnings that their apps will facilitate more serious abuse more often.

“This underlines exactly why Oliver Dowden must introduce a truly landmark Online Safety Bill that makes sure child protection is no longer a choice for tech firms and resets industry standards in favour of children.

“If legislation is going to deliver meaningful change it needs to be strengthened to decisively tackle abuse in private messaging, one of the biggest threats to children online.”

The NSPCC are calling on the Government to:

· Shift the onus onto tech firms to show they are identifying and mitigating risks in their products before rolling them out. The current onus is on Ofcom to prove risk, rather than on companies to show they are taking steps to protect children. But Ofcom won’t be able to prove that risk with end-to-end encryption in place, because the majority of child abuse reports will disappear, creating a catch-22 for the regulator.

· Give Ofcom the power to force tech firms to act before harm has happened rather than after. Under the current plans, the regulator needs to demonstrate persistent and prevalent child abuse before it can force platforms to act. But the thrust of the legislation should be to catch harm at the earliest stage to prevent it.

· Make Ofcom consider the interplay of risky design features and whether their combination is likely to exacerbate risk. End-to-end encryption is likely to present particularly severe risks if it can be exploited by abusers in conjunction with other high-risk design choices, for example algorithmic friend suggestions and livestreaming functionality. This is why the NSPCC is particularly concerned about the proposals to introduce end-to-end encryption on Facebook.

Earlier this month, the NSPCC published its ‘Delivering a Duty of Care’ report, which assessed the plans for legislation against its six tests for achieving bold and lasting protections for children online.

The NSPCC has been the leading voice for social media regulation, and the charity set out detailed proposals for a Bill in 2019. The draft Bill is expected in the spring.
