What the Online Safety Act is – and how to keep children safe online

The way people in the UK might navigate the internet is changing.

Under the Online Safety Act, platforms must take action – such as carrying out age checks – to stop children seeing illegal and harmful material from July.

Services face large fines if they fail to comply with the UK’s sweeping online safety rules. But what do they mean for children?

Here’s what you need to know.

What is the Online Safety Act and how will it protect children?

The Online Safety Act’s central aim is to make the internet safer for people in the UK, especially children.

It is a set of laws and duties that online platforms must follow, implemented and enforced by Ofcom, the media regulator.

Under its Children’s Codes, platforms must prevent young people from encountering harmful content relating to suicide, self-harm, eating disorders and pornography from 25 July.

This will see some services, notably porn sites, start checking the age of UK users.

Ofcom’s rules are also designed to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Firms which wish to continue operating in the UK must adopt measures including:

  • changing the algorithms which determine what is shown in children’s feeds to filter out harmful content
  • implementing age verification methods to check whether a user is under 18
  • removing identified harmful material quickly and supporting children who have been exposed to it
  • identifying a named person who is “accountable for children’s safety”, and reviewing annually how they manage risk to children on their platforms

Failure to comply could result in businesses being fined up to £18m or 10% of their global revenues – whichever is greater – or their executives being jailed.

In very serious cases Ofcom says it can apply for a court order to prevent the site or app from being available in the UK.

What else is in the Online Safety Act?

The Act also requires firms to show they are committed to removing illegal content, including:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • promoting suicide or self-harm
  • selling illegal drugs or weapons
  • terrorism

The Act has also created new offences, such as:

  • cyber-flashing – sending unsolicited sexual imagery online
  • sharing “deepfake” pornography, where artificial intelligence is used to insert someone’s likeness into pornographic content

Why has it been criticised?

A number of campaigners want to see even stricter rules for tech firms, and some want under-16s banned from social media completely.

Ian Russell, chairman of the Molly Rose Foundation – which was set up in memory of his daughter who took her own life aged 14 – said he was “dismayed by the lack of ambition” in Ofcom’s codes.