Congleton councillor demands action after X's AI chatbot Grok creates sexualised images of women and children
By Matthew Hancock-Bruce 15th Jan 2026
A Congleton councillor has led the call for Grok to be taken down.
The Equality Party is demanding action be taken after the AI chatbot, part of the social media platform X, generated and displayed sexualised images of women and children to users.
These include images of real people being undressed, placed in sexual positions and, in some cases, subjected to violence; the images have been widely shared on X.
Leader of the Equality Party, Congleton town councillor Kay Wesley, said: "If an AI system accessible in the UK is producing sexualised images of women and children, that is not a hypothetical risk — it is a live safeguarding failure and a crime taking place in real time. The government and Ofcom's response is inadequate. The technology has been available for months and no action was taken until the media ran the story.
"Some public figures, including Nigel Farage, have sought to frame opposition to banning Grok as a matter of 'free speech', and Elon Musk has similarly attacked regulatory efforts as censorship. Free speech does not permit sexual or child abuse and these arguments show their total disregard for the safety of women and children.
"The UK law is clear on this point, and enforcement should happen immediately where image abuse or child sexual exploitation is taking place. If people distributed sexualised images of women and children in the high street, they would be arrested. Anyone helping to facilitate this would also be brought to justice. Why is this not happening online?
"Where a platform cannot demonstrate that it can operate such technology safely and lawfully, it should be required to remove it. If it refuses, prosecution or exclusion from the UK market must be the next step."
No enforcement action has taken place so far.
The technology remains active and accessible to UK users, and researchers estimate that it is producing more than 6,700 undressed images a minute.
Ofcom has opened an investigation and asked X for a response, promising to expedite the process and pointing out that legal responsibility sits with platforms to ensure they do not host illegal content.