Emma Motherwell from the NSPCC says future generations would be protected from online sexual exploitation

We are currently standing on the precipice of important change when it comes to the future of online safety.

Right now, politicians are scrutinising the draft Online Safety Bill.

If passed, this law could determine how safely our children explore the internet.

And if executed correctly, it could prevent future generations from suffering the horrific outcomes of grooming and online sexual exploitation, while punishing tech giants that fail in their duty of care to protect children from online abuse.

This is an important moment that could alter our online experiences for the better and protect future generations from the fallout of online abuse, which we know can have lasting adverse effects on mental health.

The NSPCC recently released police force data from Freedom of Information requests which found that there were 9,742 online child sexual offences recorded by 41 forces across the UK last year.

And over the last four years, crimes have surged by more than three-quarters, from 5,458 in 2016/17 to 9,736 in 2020/21, when comparing the 39 forces that provided data for that time period.

Early last week the NSPCC published Duty to Protect - a new assessment of the draft legislation. The charity believes that in its current form the Government’s plan to regulate social media falls significantly short when it comes to protecting children from preventable online abuse.

In response, the NSPCC has relaunched its 2018 Wild West Web campaign, which asked people to petition the government to bring in laws to make social media platforms protect young users from sexual abuse online.

In less than 12 months, 46,000 people signed the petition. In April 2019 the government launched the Online Harms White Paper, and in December 2020 it announced that a future Online Safety Bill would place a legal duty of care on tech companies to protect young people on their platforms.

The government's stated goal was to make the UK the safest place in the world to be a child online.

Now the NSPCC’s online safety experts warn that the current draft legislation must be significantly strengthened.

It must impose a duty on tech firms to tackle cross-platform risks, including the way groomers often target children on social networks then move across platforms to encrypted messaging and livestreaming sites. Companies must be required to assess cross-platform risks when designing their sites, and work together to proactively share information about offender behaviour, threats to children’s safety, or new features that could lead to child abuse.

Additionally, the Bill must disrupt abuse at the earliest possible stage. As drafted, it fails to effectively tackle how abusers use these platforms and leave ‘digital breadcrumbs’ that signpost the way to child abuse images.

The Bill must also fix major gaps in its child safety duties. Sites such as Telegram and OnlyFans could be excluded from duties to protect children from harmful content because of their smaller audiences.

Currently, only companies with a significant number of children using their apps would be required to offer protection, giving abusers an opportunity to move to less populated platforms to continue their abuse.

Furthermore, it is imperative that companies face criminal sanctions if they fail to protect children on their platforms, and there should be a Named Persons Scheme that makes named individuals at tech companies personally liable where they fail to uphold their duty of care.

Despite the NSPCC campaigning for this for some time, the current draft Bill still does not propose these sanctions, which would include fines, censure and disbarment.

This is a crucial opportunity to change the face of online abuse for the better, and the charity is urging the public to write to Secretary of State Nadine Dorries asking her to amend the Bill to place child safety at its heart.

Visit www.nspcc.org.uk/support-us/campaigns/end-child-abuse-online for further information.