For the past few months, Elle Simpson-Edin, a scientist by day, has been working with her wife on a novel, due out late this year, that she describes as a “grimdark queer science fantasy.”
As she prepared a website to promote the book, Simpson-Edin decided to experiment with illustrating its content using one of the powerful new artificial intelligence-powered art-making tools, which can create eye-catching and even photo-real images to match a text prompt. But most of these image generators are designed to restrict what users can depict, banning pornography, violence, and pictures showing the faces of real people. Every option she tried was too prudish. “The book is quite heavy on violence and sex, so art made in an environment where blood and sex is banned isn’t really an option,” Simpson-Edin says.
Happily, Simpson-Edin discovered Unstable Diffusion, a Discord community for people using unrestricted versions of a recently released, open-source AI image tool called Stable Diffusion. Users share illustrations and simulated photographs that might be considered pornographic or horror-themed, as well as plenty of images featuring nude figures made grotesque by the software’s lack of any real understanding of how bodies are put together.
Simpson-Edin was able to use the unfiltered tool to create some suitably erotic and violent images for her book. Although the images were relatively tame, featuring only limited nudity, other image generators would not have made them. “The big selling point of the uncensored Stable Diffusion variants is that they allow so much more freedom,” Simpson-Edin says.
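The “uncensored” variants are possible because Stable Diffusion’s model weights are openly published: the content filter is a separate component bolted onto the image-generation pipeline, and anyone running the model on their own hardware can simply leave it out. Here is a minimal sketch of the general idea, assuming the Hugging Face diffusers library and an illustrative checkpoint name; it is not the specific setup the Unstable Diffusion community uses.

```python
import torch
from diffusers import StableDiffusionPipeline

# Load openly published Stable Diffusion weights. The checkpoint name
# below is illustrative; any compatible checkpoint would work the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
    safety_checker=None,  # drop the bundled NSFW filter entirely
)
pipe = pipe.to("cuda")

# Generate an image from a text prompt, with no filter in the loop.
image = pipe(
    "a dark fantasy book cover, dramatic lighting, oil painting",
    num_inference_steps=30,
).images[0]
image.save("cover_concept.png")
```

Hosted services keep the filter server-side where users cannot touch it; with open weights, the same one-line choice belongs to whoever runs the model.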
The world’s most powerful AI projects remain locked inside large tech companies, which are reluctant to provide unfettered access to them, whether because the systems are so valuable or because they might be abused. Over the past year or so, however, some AI researchers have begun building and releasing powerful tools for anyone to use. The trend has sparked concern about how freely available AI technology might be misused. Some users of the notorious image board 4chan have discussed using Stable Diffusion to generate celebrity porn or deepfakes of politicians as a way to spread misinformation, though it is unclear whether anyone has actually attempted this.