
Congleton councillor demands action after X's AI chatbot Grok creates sexualised images of women and children

By Matthew Hancock-Bruce   15th Jan 2026

Equality Party leader Kay Wesley has demanded action on X's AI chatbot, Grok (Credit: Equality Party/Grok)

A Congleton councillor has led the call for Grok to be taken down.

The Equality Party is demanding action be taken after the AI chatbot, part of the social media platform X, generated and displayed sexualised images of women and children to users.

These include images of real people being undressed, placed in sexual positions and, in some cases, subjected to violence; the images have been widely shared on X.

Leader of the Equality Party, Congleton town councillor Kay Wesley, said: "If an AI system accessible in the UK is producing sexualised images of women and children, that is not a hypothetical risk — it is a live safeguarding failure and a crime taking place in real time. The government and Ofcom's response is inadequate. The technology has been available for months and no action was taken until the media ran the story.

"Some public figures, including Nigel Farage, have sought to frame opposition to banning Grok as a matter of 'free speech', and Elon Musk has similarly attacked regulatory efforts as censorship. Free speech does not permit sexual or child abuse and these arguments show their total disregard for the safety of women and children.

 "The UK law is clear on this point, and enforcement should happen immediately where image abuse or child sexual exploitation is taking place. If people distributed sexualised images of women and children in the high street, they would be arrested. Anyone helping to facilitate this would also be brought to justice. Why is this not happening online?

 "Where a platform cannot demonstrate that it can operate such technology safely and lawfully, it should be required to remove it. If it refuses, prosecution or exclusion from the UK market must be the next step."

No enforcement action has taken place so far.

The technology remains active and accessible to UK users, and researchers estimate it is generating more than 6,700 'undressed' images a minute.

Ofcom has opened an investigation and asked X for a response, promising to expedite the process and noting that legal responsibility sits with platforms to ensure they do not host illegal content.
