The Online Safety Bill is on the cusp of becoming law.
It is expected to be one of the government’s flagship pieces of legislation this term, but it arrives after several delays caused by controversy over its potential privacy implications.
Sky News understands it will finally pass its last parliamentary hurdle on Tuesday, going through the House of Lords without further amendments to head for royal assent.
Ahead of its long-awaited passing, here’s what you need to know about the Online Safety Bill.
What does the Online Safety Bill aim to do?
The government never shies away from an opportunity to present the UK as a global leader, and has said the bill will make this country “the safest place in the world to be online”.
It aims to do this by imposing rules upon companies like Meta, Apple, and even Wikipedia, with the goal of keeping inappropriate and potentially dangerous content away from vulnerable eyes.
This includes things like self-harm material, which a coroner ruled last year contributed to teenager Molly Russell taking her own life.
The bill also aims to hold platforms responsible for illegal content such as child sexual abuse images, make adult websites properly enforce age limits, and stop underage children being able to create social media accounts.
Perhaps most controversially, one of the proposals would force platforms like WhatsApp and Signal to undermine messaging encryption so private chats could be checked for illegal content.
I’ve been reading about this for ages – why’s it taken so long?
As that last section indicates, this is a very wide-ranging piece of legislation.
Other illegal content it wants to crack down on includes selling drugs and weapons, inciting or planning terrorism, sexual exploitation, hate speech, scams, and revenge porn.
Then there’s the potentially harmful but not illegal material, like eating disorder content and alleged bullying.
There have been concerns within the Tory Party that it is simply too far-reaching, potentially to the point of threatening free speech online.
Those worries weren’t enough to deter the bill’s former chief advocate, the then culture secretary Nadine Dorries.
Indeed, the proposals got even tougher between the bill’s first pitch in 2019 and its eventual parliamentary debut in 2022, with measures such as criminalising cyber-flashing added along the way.
That already long three-year gap was blamed on the pandemic, and subsequent delays were exacerbated by prime ministerial downfalls – first Boris Johnson’s and then Liz Truss’s.
The bill now falls under the watch of Michelle Donelan, the technology secretary, who’s made some changes to alleviate criticism while still satisfying its supporters.
Who’s in favour?
Among the bill’s backers have been charities like the NSPCC, safety group the Internet Watch Foundation (IWF), bereaved parents who say harmful online content contributed to their child’s death, and sexual abuse survivors.
Ahead of the bill facing its final stages in parliament this week, a woman who suffered years of abuse on an encrypted messaging app was one of more than 100 people who signed a letter to big tech bosses aimed at highlighting the need for action.
The NSPCC’s recent campaigning cited reports of a rise in online child grooming cases, which the charity said showed the legislation is “desperately needed”.
And the IWF released new figures a day before the bill’s expected passage through the House of Lords warning of “unprecedented” numbers of children falling victim to online sexual extortion.
The father of Molly Russell is one of several parents who have voiced their support for the bill, and welcomed an amendment filed during its committee stage that could grant coroners and bereaved families access to data on deceased children’s phones.
Four in five UK adults are also said to support making senior managers at tech firms legally responsible for children who are harmed by what they see on their platforms.
Who has opposed it?
Aside from Tory MPs, the main opposition has unsurprisingly come from tech companies.
They had long expressed concerns about the rules around “legal but harmful” content, arguing they would make platforms unfairly liable for material posted by their users.
Ms Donelan acknowledged the issue and removed the requirement, but the bill still tasks them with protecting children from damaging content like that which promotes suicide and eating disorders.
The update also saw material encouraging self-harm made illegal.
Much of the recent criticism from tech firms has centred on messaging encryption, with major platforms like WhatsApp even threatening to leave the UK if they are forced to scan users’ messages.
Encryption protects messages from being seen by people outside the chat.
Advocates of the technology say any attempt by government to mandate a “backdoor” would compromise people’s privacy and could let bad actors exploit the same weakness.
Ministers have sought to downplay the chances of this measure ever actually being used, but it remains in the bill.
How will the bill be enforced?
Enforcement will fall to the communications regulator, Ofcom.
Companies found to be in breach of the bill can be fined up to £18m or 10% of their annual global turnover, whichever’s greater (and in the case of a company like Meta, it’s comfortably the latter).
Firms and senior managers could also be held criminally liable if found not to be doing enough to protect children.
In extreme cases, platforms may even be completely blocked from operating in the UK.