For the past few weeks, Elon Musk’s Grok AI bot has been generating pornographic images of women and underage girls, without their consent, at an astounding rate. A recent Bloomberg analysis found that Grok creates 6,700 such images per hour, or nearly two every second. On Friday, X at last put some minor guardrails on the tool, with a new policy that only paying subscribers can use Grok to generate or alter images. On the standalone Grok app, however, anyone can prompt Grok to generate new images, meaning the deepfaked porn continues.
Grok has long been one of the more suggestive of the major AI models, with “spicy” and “sexy” settings that can be toggled on and off. Even as employees have warned that the bot is being used to generate child sexual abuse material, Musk has remained committed to the idea that Grok should be the sexiest AI model. On X, Musk has defended the choice on business grounds, citing the famous tale of how VHS beat Betamax in the 1980s after the porn industry put its weight behind VHS, which offered longer recording times. “VHS won in the end,” Musk posted, “in part because they allowed spicy mode 😉.”
There is some truth to Musk’s take. The porn industry tends to reward the technologies it adopts early, and the money to be made in porn gives it real leverage whenever two competing, incompatible formats fight it out.
Yet the idea that porn as an industry, neutral and amorphous, settles tech wars doesn’t tell the whole story. It would be more accurate to say that the technologies we use to generate and share images are, more often than not, shaped by people distributing images of women’s bodies, often with dubious consent from the women themselves. In that sense, Grok’s abilities are par for the course.
Porn didn’t only help VHS win out over Betamax. The industry has also been linked to the mainstreaming of Super 8 film (easy and convenient for amateur filmmakers), the development of streaming video (private and easily accessible), the development of web payments (a prerequisite for paywalled streaming video), the development of web analytics (useful for the complex business of adult streaming), and the victory of Blu-ray over HD DVD. (Like VHS, Blu-ray discs held significantly more data than their HD DVD rivals, which is especially attractive in the porn market.)
Then there were the systems of image distribution that developed outside of porn as an industry. A surprising number of them revolved around people trying to share sexualized images of women’s bodies as quickly as possible; only in these cases, the people whose images were being distributed weren’t necessarily consenting adults being paid for their trouble.
Sometimes the innovation was more or less harmless. Google Images was developed because so many people went searching for pictures of Jennifer Lopez in her famously low-cut Versace gown in 2000, a distinction Lopez has treated as a feather in her cap. In this case, Lopez wore the dress to a high-profile event and wanted to be seen and talked about, so it’s reasonable to assume consent.
Other times it got cloudier. The impetus for YouTube came when its founders wanted to watch Janet Jackson’s 2004 Super Bowl wardrobe malfunction and were frustrated by how hard it was to find the video online. Jackson has always maintained that she did not intend for her breast to be seen on national TV, so here, we’re dealing with nonconsensual nudity.
Meanwhile, the progenitor of Facebook was Mark Zuckerberg’s Facemash, a Hot or Not rip-off built to compare the women of Harvard University’s student body to farm animals. The intent was less to create nonconsensual pornography than to stage a sexualized humiliation of nonconsenting women, an act that proved so popular it overwhelmed Harvard’s servers the night it launched.
So Musk is not entirely wrong when he says that technologies with what he euphemistically calls “spicy mode” tend to do well. A more precise framing, however, is that in our misogynistic society, objectifying and humiliating the bodies of nonconsenting women is so lucrative that the fate of world-altering technologies can hinge on how well they facilitate it.
AI was always going to be used for this, one way or another. But only someone as brutally uncaring and willing to cut corners as Elon Musk would allow it to go this wrong.

