
Hi,

Technology is opening up a disturbing new front in domestic abuse. From hidden cameras in children’s toys to coercion into online sex work, perpetrators are exploiting digital tools to harass, monitor and control women and girls.

Refuge, the UK’s largest specialist domestic abuse charity, warns that without urgent action to address this rise in tech-enabled abuse, the UK government’s pledge to halve violence against women and girls will remain out of reach.

Read the full article below.

- openDemocracy

 
EDITOR'S PICKS
 
1
Reparations for racial injustice: Black fathers must be first in line

Racial inequities mean Black kids increasingly grow up without fathers in the US. Reparations could break the cycle. Read more...

2
Israel’s growing pariah status is Gaza’s best hope after UN confirms genocide

Netanyahu’s finance minister this week described the development opportunities in Gaza as a ‘bonanza’ for Israel. Read more...

3
PODCAST: Sri Lanka, Bangladesh, Nepal: Is This A South Asia Spring? | With Roman Gautam

Journalists Roman Gautam and Aman Sethi discuss whether we are witnessing a South Asian version of the Arab Spring. Read more...

 

FEATURED STORY

How tech became the new frontier of domestic violence against women and girls

Emma Pickering

From hiding spycams in children’s toys to coercing partners into online sex work on platforms such as OnlyFans, abusers are increasingly weaponising technology to perpetrate new and insidious forms of violence against women and girls (VAWG).

According to research that we at Refuge, the UK’s largest specialist domestic abuse charity, carried out in 2021, one in three women in the UK has experienced online abuse or harassment, and almost one in five of those women reported that the perpetrator was a partner or former partner. For young women, the scale of the abuse was even higher, with two in three reporting online abuse or harassment.

Perpetrators of domestic abuse often use technology to extend existing patterns of coercive control. Location-tracking tools, instant messaging, and social media have made it easier than ever to monitor, harass and intimidate partners or ex-partners under the guise of ‘normal’ online behaviour. We’re increasingly seeing the use of stalkerware, hidden trackers and social media ‘maps’ to surveil survivors – part of a broader pattern of control over what survivors do and who they see.

The issue is now so great that this summer, a little-publicised Home Affairs Select Committee report found that the government is unlikely to fulfil its pledge to halve VAWG unless it introduces effective strategies to tackle online misogyny and tech abuse and increases coordinated funding to do so.

 

FOLLOWING THE FLOTILLA

openDemocracy stands with the Global Sumud Flotilla as it sails to Gaza carrying vital medical and food supplies.
 
We will be tracking its progress every day in our newsletters, and sharing photos from its journey.
 
To find out more about the flotilla and see how you can get involved and support its efforts, click the button below.
Find out more about the flotilla here
 

At Refuge, we have seen this crisis unfold firsthand. In 2017, in response to soaring demand, we established a specialist team to support survivors of tech-facilitated and economic abuse. Referrals to this team increased by 205% between 2018 and 2024, highlighting an urgent need for specialist support.

Stalking was one of the most common harms reported to this team last year, with perpetrators using social media to repeatedly message and keep tabs on their victims. One survivor, who was stalked and harassed on social media by a man she met on a dating app, told us: “Somehow he managed to find all of my social media accounts and even those of my friends and family. The messages just kept coming, and no matter how many times I blocked him, he would manage to create new profiles and continue to harass me.

“At first, the police didn’t take the situation seriously and it took me filing an official complaint for them to press charges,” she said. “Just because abuse is happening online, it doesn’t mean that the effects of it aren’t serious. My anxiety and depression spiralled because of the abuse and I had to take time off work. I was terrified that he was behind every corner.”

Stories like this have led us, along with others in the sector, to urge Ofcom, the UK’s independent internet watchdog, to include stalking as a standalone ‘key harm’ in its upcoming guidance for technology companies on minimising VAWG, which it is required to publish under the Online Safety Act 2023. This would ensure perpetrators are held accountable and survivors are better protected.

We also urgently need stronger regulation around surveillance and intimate image abuse, which perpetrators commonly use to humiliate, distress and psychologically harm survivors. Many survivors tell us their abusers have used hidden spycams – embedded in everything from furniture to children’s toys – to film them without their knowledge, and then used the footage to blackmail and control them.

While it is illegal to share intimate images without consent, or even to threaten to do so, there is a loophole for perpetrators who create intimate images without their victim’s consent but do not share them, despite this still being a serious violation of survivors’ autonomy and privacy. Proposals in the Crime and Policing Bill to make it an offence even to install equipment intended to capture intimate images without consent are a welcome first step towards closing this loophole, but the government must go further still.

Ministers must urgently follow through on their commitment to criminalise the taking and making of all intimate images without consent – including the use of AI to generate deepfake nudes from real images. So-called ‘nudification’ apps, which are specifically designed to produce these images and videos, have boomed in popularity over recent months, leading Australia to announce plans to ban them earlier this month.

This issue of perpetrators weaponising intimate images is being exacerbated by a lack of safeguards on popular sex work platforms, which not only allow abusers to flourish but also let them profit from the abuse. Survivors report being coerced into creating content or having intimate photos or videos of themselves uploaded without consent.

 
READERS DISCUSS
 
Share your stories of togetherness
 
Around the world, many of us are living through times of heightened political discontent, and populists are capitalising on it. Here in the UK, last weekend saw the country’s largest-ever far-right rally take place in London, where speakers including Elon Musk addressed crowds to spread hatred.
 

We seem more divided than ever, and too many politicians are capitulating to the far right rather than standing up to it. It’s easy to feel despondent, but we know that away from the headlines, many communities are working hard to fight for a better world.

We’d love to celebrate some of these groups’ efforts, so please do write to us about the work you’re doing (and send photos, too) so we can prove there are more of us than them.

I have something to share (email supporters@opendemocracy.net)
 

Even where laws are supposedly in place to protect survivors, awareness and reporting remain dangerously low – and when intimate image abuse crimes are reported, they are rarely prosecuted. The gap between the legislation and survivors’ lived experiences is yet another example of policy failing to keep pace with technology.

A recent UK-wide poll commissioned by Refuge found that fewer than one in three people would report certain forms of digital coercion, such as location tracking or a partner demanding access to their phone, if they happened to them or someone they knew. Just 58% said they would report the non-consensual sharing of intimate images, falling to 44% among 18- to 24-year-olds. This exposes a worrying lack of awareness among Gen Z; abuse thrives when it is minimised or dismissed, and this lack of recognition only compounds survivors’ trauma.

Ofcom’s draft guidance on how online platforms should respond to VAWG should be celebrated, particularly its inclusion of online domestic abuse as a ‘key harm’ that tech companies must respond to. Refuge also strongly supports the watchdog’s recommendation that platforms scan for duplicates of all non-consensual explicit material and ensure such content is delisted from search results, a welcome step towards tackling the spread of intimate images shared or made without consent.

At the same time, we are deeply concerned that once the guidance is rolled out, after the consultation process has concluded, it will be hamstrung by the fact that tech companies will not be legally bound to comply.

Time and time again, we see companies prioritise profits over women’s safety – and it would not be surprising to see such behaviour continue. This is why Refuge and others in the sector are calling for the guidance to be elevated to a legally binding Code of Practice, backed by a commitment across government departments to tackling tech abuse in the forthcoming VAWG strategy.

A piecemeal approach is not enough; only a whole-system response can confront the systemic weaponisation of technology against women and girls. That response must include policy frameworks that are effective at ensuring companies take action to prevent VAWG and are held to account where they fail to do so.

More generally, we need a fundamental policy shift towards ‘safety by design’, building protections into technology from the outset rather than bolting them on as an afterthought. In internal research carried out in March and April 2025, we at Refuge found that certain AI chatbot models gave inappropriate responses to prompts about survivors seeking help – including advising them to stand up to their abuser, a suggestion that could put women at serious risk.

This is just one example of the potentially fatal consequences of treating safety as optional. All AI systems and social media platforms must be safety-tested from the outset, with survivor input and consultation with VAWG experts like Refuge.

Meaningful change will come only when survivor voices are embedded in both technology development and regulation, and when tech companies are held fully accountable for the harm they enable.

 

COMMENTS

Sign in 💬

Our award-winning journalists can now respond directly to your comments underneath the articles on our site!

Just sign in or register underneath any of our articles to start leaving your thoughts and questions today.

Sign in and join the conversation

MORE FROM OPENDEMOCRACY

Weekly Newsletter
The Dark Arts
Beyond Trafficking and Slavery 


openDemocracy, 18 Ashwin Street, London E8 3DL, United Kingdom

Unsubscribe