Dawn Butler is the Labour Party MP for Brent East
Abuse on the social media platform X has taken an even darker turn lately. Users began exploiting its AI tool, Grok, to create sexually explicit “nudified” images of women, without their consent, from innocent photographs they had posted.
This vile trend exposed just how fragile human protections are, especially in the social media world, in an age of rapidly advancing artificial intelligence. It has been a chilling glimpse of the dystopian future we face if we fail to act.
Shockingly, a recent study by The New York Times and the Center for Countering Digital Hate found that, in just nine days, Grok created 4.4 million images, and estimated that at least 41% of those posts contained sexualised images of women.
It took more than a week for X to agree to remove Grok’s ability to generate these images. Even then, the platform initially suggested that paid subscribers would retain access to this immoral capability – making exploitation a premium service.
That decision alone should alarm us: one man making a choice that affects X’s 600 million users and potentially billions of people worldwide. It underlines the grave dangers we all face – women and girls especially – as AI develops at an eye-watering and largely unregulated pace, unchecked and driven by profit rather than public safety. It is what is now called techno-capitalism.
In some cases, women reported dozens of explicit AI-generated images created from their photographs. Let me be clear: this is violence against women and girls. It causes real harm and real trauma.
So how do we protect women and girls and the wider public in a world where technology is evolving faster than the law? We must legislate to place firm guardrails around the individual, ensuring that our rights are protected regardless of how technology changes. We need to bolster our civil rights.
I’m pleased the Government has acted swiftly, fast-tracking legislation that could see users who create sexually explicit deepfake images face up to six months in prison. But we must go further and faster.
As parliamentarians, we cannot keep chasing the next app, update, or AI model. Legislation that targets specific technologies will always be obsolete before the ink is dry.
Instead, we need a digital bill of rights – one that centres human dignity, bodily autonomy, and consent. Your image, your voice, and your identity must sit at the heart of this debate and be protected in law. That way, we will not be relying on these companies choosing to be ethical.
While X claims this particular issue has been resolved with another rule change, users continue to report the production of sexualised images even now. The uncomfortable truth is that technology alone is not to blame. AI tools do not act independently – not yet, at least – they are directed by people.
Grok is not the only tool that can do this, so we must also confront who is using these tools and why. Predominantly, it is men and boys exploiting AI to enact misogynistic or incel fantasies. This cannot be dismissed as mere online behaviour or harmless experimentation. It is abuse. It is also a disturbing sign of the future of violence against women and girls, and of how it is being normalised.
Alongside enforcement, there must be education, because we have a responsibility to make clear that violence against women and girls is never entertainment, never acceptable, and never without consequence. Harm done online is as dangerous as harm done offline.
Without a two-pronged approach – robust legal protections alongside serious prevention and education – we will remain trapped in a cycle of reaction, always one step behind the next technological harm or update.
We must also be honest about how this violence is racialised. Black women are often the first and most viciously targeted. I have personally received racist AI-generated images of myself on X, reproducing dehumanising tropes that Black women know all too well.
Our first Black female MP, Diane Abbott, has suffered more abuse than any other MP. Racist, misogynistic, and sexually violent images have been posted repeatedly beneath her name, normalised by platforms that fail to act. A new digital bill of rights would impose clear duties on users, platforms, and AI developers alike – ensuring accountability where there is currently none. This should also include revisiting the requirement for all users to have profile verification measures in place.
The fact that Elon Musk was willing to allow paying users continued access to this abusive capability speaks volumes about the priorities of these techno-capitalist platform owners: monetising tools of violence against women and girls, and prioritising profit over women’s dignity and consent. And the fact is, they only act when they are forced to, or when it affects their bottom line.
When vast wealth and unchecked power collide, it is women and girls who pay the price. This is not innovation. We are witnessing the automation of unchecked sexual abuse.
Pandora’s box has already been opened. There is no going back. But there is still a choice about how we respond. It is our responsibility, as politicians, to protect our citizens, not by centring our laws on technology, but by centring them on people – so join me in the fight to secure a digital bill of rights for human beings.
If we fail to act now, we are not just permitting harm. We are legitimising it. It’s time to put guardrails around human beings. This, in the end, is the only thing we can control.