
Hi,

The government says the Online Safety Act is about protecting children — but campaigners warn it’s really about expanding state and corporate control over what we see online.

As new Ofcom regulations take effect, sex workers and other marginalised groups say they’re being censored, deplatformed and forced to give up their anonymity, even as the law fails to keep young people safe.

Today, we unpack how a law sold as safeguarding is becoming a tool for rights erosion.

Read more below.

- openDemocracy

 
EDITOR'S PICKS
 
1. Israel’s biggest security threat has never been Hamas. It’s Netanyahu

The catastrophic war inflicted on Gaza over the past two years has left Israel less secure than ever. Read more...

2. Ten years on, Venezuelans still face precarity in Colombia

Displaced Venezuelans found support and protection in Colombia. But more must be done to prevent their exploitation. Read more...

3. PODCAST: Now that we have to say 'genocide' | With Lila Hassan

Did Western media manufacture consent for Israel's ongoing genocide of Palestinians in Gaza? Read more...

 

 

FEATURED STORY

Online Safety Act: ‘Protect the kids’ is a pretext for rights erosion

Marin Scarlett

New Ofcom regulations, introduced in July, are forcing a reckoning for online content. This overhaul of the digital landscape is part of the second phase of the Online Safety Act (OSA), which ostensibly targets content harmful to children.

Passed in October 2023, the OSA had a stated objective of protecting young people that garnered widespread support from children’s charities. This was echoed by digital safety campaigners frustrated by the lack of accountability placed on Big Tech and desperate to see its power reined in.

But while these new rules may aim to protect one group, others are being put in harm’s way. Sex workers and other marginalised communities are now less able to access content on harm reduction, face huge income losses due to their accounts being targeted, and risk losing precious anonymity due to identification rules.

Minimising online harms and holding powerful tech companies to account are worthy goals. But despite fixating on explicit content, the OSA isn’t even delivering on its key aims: a recent study by the Children’s Commissioner for England showed that children’s exposure to porn is higher now than before the act came into effect, and algorithms continue to bombard young people with content promoting suicide and self-harm. When that lack of progress comes at the expense of other vulnerable groups, such as sex workers, it is clear the law is not fit for purpose.

 

Will you help defend democracy?

A world in turmoil needs fearless, independent investigative journalism that can overcome censorship and hold power to account.
That’s the kind of media you deserve – and you can support it by donating to openDemocracy. When you give today, you can:
  • Keep openDemocracy free to read for everyone
  • Provide our team with the support they need to work safely in a dangerous world
  • Deliver the reporting that matters to you – and that reaches as many people as possible
Please support independent non-profit journalism by donating today.
Please donate now
 

We’ve heard ‘protect our children’ before

Blanket calls to ‘protect the children’ should always be treated with caution. Safeguarding young people often serves as a flimsy pretext for discriminatory legislation, and the emotive language provides useful cover for lawmakers seeking to shield themselves from criticism.

There are many examples of this in action. In the US, racial segregation during the Jim Crow era was justified in part as promoting the “best interests” of children. The US’s 1996 Communications Decency Act, which aimed to protect minors from "indecent" internet content, also has marked similarities with the OSA, although most of its original provisions were struck down by the Supreme Court as unconstitutional.

In the UK, the notorious Section 28 prohibited the promotion of homosexuality in schools under the guise of giving children “a sound start in life”, in the words of then prime minister Margaret Thatcher. Section 28 was repealed across the UK between 2000 and 2003, and David Cameron, by then Conservative leader, offered a public apology for it in 2009. In 2018, the law’s key architect, Baroness Knight, also apologised – though she maintained that her intention had only been the “wellbeing of children”.

The following decade saw significant gains for LGBTQ+ communities and, until 2015, the UK was considered among the most progressive countries in the world for queer and trans people. However, it has since crashed to 22nd place, with hate crime up by 112% against gay people and 186% against trans people in the last five years. Amid this rollback of LGBTQ+ rights, the language of protecting children is once again never far away. Last year, statutory guidance was updated to require schools to teach about biological sex, advising against materials that encourage pupils to question their gender. Politicians claimed to be safeguarding children from “disturbing”, “inappropriate” and “contested views”.

Even the 2023 Illegal Migration Act, which provided for draconian measures to tackle irregular dinghy crossings of the English Channel, was framed by the government as “a way to protect vulnerable people, including children” from criminal gangs. This rhetoric has featured heavily in a recent surge of far-right demonstrations, with Tommy Robinson telling a London rally earlier this month that migrants have made “our daughters scared to walk the streets”.

Children need protecting, but not everything that claims to protect them does what it says on the tin. While the OSA has not leveraged child safety for such openly xenophobic purposes, it is causing profound harms in the name of protection.

 

Join our FREE online event on 15 October

“We are all arrestables now.” 

Teenagers, grandparents, lawyers, doctors and vicars have all been arrested for demanding an end to the climate crisis, to genocide, to police brutality. And anti-protest laws introduced by the Tories have continued under Labour.

Join openDemocracy for our online event on 15 October where we will discuss the impact of the anti-protest laws on free speech in the UK – and learn how you can keep safe while standing up for your rights.

You will hear from activists and lawyers, have a chance to ask questions about the issues raised, and make connections with other activists who care deeply about creating a fairer and more just society.

Sign up now
 

OSA phase two

The OSA is a behemoth piece of legislation with three distinct phases of implementation. The first was completed in mid-March 2025, requiring services to conduct risk assessments and implement safety measures to tackle illegal content. The second phase focuses specifically on content that is harmful to children but not necessarily illegal.

At the start of 2025, Ofcom, the UK’s regulator for online and offline communications, published a statement outlining the requirements of this phase: in-scope user-to-user services (where content generated, uploaded or shared by one user can be seen or "encountered" by another user) and search engines had to complete an assessment of the likelihood of children accessing their service by April 2025, then conduct risk assessments and implement safety measures by July 2025, while pornography services had to introduce age checks by July 2025.

April 2025 also saw the release of Ofcom’s Guidance on Content Harmful to Children, which details content that platforms must act against. Included in the highest priority tier of harm is any content that is pornographic or that depicts sexual activity. In its Codes of Practice, Ofcom recommends algorithmic filtering and age verification checks to prevent underage viewers’ exposure to adult content.

Pornography isn’t the sole focus of the OSA – the highest tier of harm also includes content that promotes suicide, self-harm and eating disorders, which should be filtered and removed entirely. However, placing all explicit content in the highest possible tier of harm, along with the distinct, extensive guidance exclusively focused on it, underscores the preoccupation of legislators and campaigners. Content that depicts or encourages serious violence is ranked lower, as merely ‘priority content’. And other content that should concern any parent – promoting conspiracy theories or medical misinformation, for example – doesn’t get a look-in.

Compliance and enforcement

Failure to comply comes with a hefty price tag. Ofcom has the power to issue fines of up to £18m or 10% of a company’s annual global revenue, whichever is higher. For major tech companies, this unprecedented penalty could amount to billions of pounds, and is designed to ensure that even the largest platforms with significant revenue streams outside the UK are held accountable. Severe cases of non-compliance can lead to the government blocking access to the site from the UK, and criminal liability for a company’s senior managers and executives.
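
To make the ‘whichever is higher’ rule above concrete, here is a minimal illustrative sketch in Python (not part of the original piece); the revenue figures and the helper name max_osa_fine are hypothetical examples.

    # Maximum OSA penalty: the higher of a fixed £18m ceiling
    # or 10% of a company's annual global revenue.
    FIXED_CAP_GBP = 18_000_000
    REVENUE_SHARE = 0.10

    def max_osa_fine(annual_global_revenue_gbp: float) -> float:
        return max(FIXED_CAP_GBP, REVENUE_SHARE * annual_global_revenue_gbp)

    # A platform with £50m in global revenue: the £18m floor applies.
    print(f"£{max_osa_fine(50_000_000):,.0f}")        # £18,000,000
    # A major platform with £100bn in global revenue: up to £10bn.
    print(f"£{max_osa_fine(100_000_000_000):,.0f}")   # £10,000,000,000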

Read the full opinion piece here.

 

COMMENTS

Sign in 💬

Our award-winning journalists can now respond directly to your comments underneath the articles on our site!

Just sign in or register underneath any of our articles to start leaving your thoughts and questions today.

Sign in and join the conversation

MORE FROM OPENDEMOCRACY

Weekly Newsletter
The Dark Arts
Beyond Trafficking and Slavery 
Bluesky Facebook X / Twitter Mastodon Instagram YouTube


openDemocracy, 18 Ashwin Street, London E8 3DL, United Kingdom

Unsubscribe