Grok is removing women’s clothes and it’s really creepy
It’s another year, and here we are with yet another reason to steer away from X-owned AI bot Grok, which has swapped spreading misinformation (and then lying about spreading it) for straight-up creepiness.
It’s only been a few months since our article about how Elon Musk’s AI bot was not only amplifying fake news stories but also denying that it had, demonstrating that AI bots really can tell some pretty brazen whoppers.
Of course, for anyone remotely concerned with limiting the amount of misinformation circulating on social media, this was hardly welcome news. But Grok has now lowered the bar once again: it is sexualising images of women – and children – at the behest of X users. That isn’t just creepy; it’s also illegal.
It seems some questionable corners of the X community have figured out a way of luring Grok to the dark side by “tricking” the bot into using genuine photos of women to generate fake images in which their clothes have been removed and replaced with skimpy bikinis or – more worryingly – “clear tape”. And Grok appears all too happy to oblige.
While Grok stops short of generating fake nude photos of women – most likely because that is now illegal in most countries – the images it produces are still highly sexualised, and they are prompting many complaints from the unwitting women who are targeted.
The X users commanding Grok to do their voyeuristic bidding seem to have worked out that, while Grok won’t sexualise images of third parties, all a user has to do is pretend that the photo they want edited is actually of themselves; Grok then happily gets to work, with little due diligence as to whether the user is telling the truth. An example is below (which we’ve had to blur extensively to protect the identity of the woman as well as to obscure Grok’s sexualised output).

Another example also blurred out is below.

In fact, during our research for this article, we uncovered at least one instance of Grok removing so many clothes from a woman (or perhaps a girl, since we could not ascertain her age) that the fake image was essentially an AI-generated nude, and we consequently reported the output to law enforcement, including the FBI.
There is an element of irony here: X has previously pledged to remove from the platform users who digitally undress women, seemingly overlooking the inconvenient fact that the most prolific offender is its own AI bot, Grok.
Many of the women targeted by Grok in this manner have also taken to X to complain.

It is perhaps only a matter of time before X will have to intervene – and if it doesn’t, law enforcement may – as Grok continues to cement its reputation as more than just the “bad boy” of AI bots.
Thanks for reading, we hope this article helped, but before you leave us for greener pastures, please help us out.
We're hoping to be totally ad-free by 2025 - after all, no one likes online adverts; all they do is get in the way and slow everything down. But of course we still have fees and costs to pay, so please, please consider becoming a Facebook supporter! It costs only 0.99p (~$1.30) a month (you can stop at any time) and ensures we can keep posting cybersecurity-themed content to help keep our communities safe and scam-free. You can subscribe here
Remember, we're active on social media - so follow us on Facebook, Bluesky, Instagram and X