If you’ve scrolled through any social media feed recently, you’ve probably seen at least 32 AI-generated portraits that look like cartoonish versions of your friends. It’s all because of Lensa, an AI portrait app that surged in popularity almost overnight. And while it’s all fun and games on the outside, the hyped app has a dark side that we’re slowly beginning to discover.
As some users have noticed, Lensa seems to be inherently misogynistic and makes women’s portraits overly sexualized, even when they’re created from just photos of a face. But it gets worse. It appears to be easy to trick the app into generating NSFW content. This leads to realistic, nonconsensual nudes of pretty much anyone, including both adults and children.
I came across an article on Jezebel that sent shivers down my spine when I read about the content Lensa is capable of creating. And from there on, I fell into a rabbit hole of reading about AI-generated nudes, child abuse content, sexualized portraits, copyright issues, and other “delights” of living in the 21st century. But let’s try and sort all these problems out somehow, shall we?
Sexualized portraits of women
I’ll start with an article by Olivia Snow, a research fellow at UCLA’s Center for Critical Internet Inquiry. She wrote for Wired about her experience with Lensa, calling the app “a nightmare waiting to happen.” And rightfully so, considering that the app returned several fully nude results “despite uploading only headshots.” Olivia writes that many women have noticed that the app adds “cartoonishly sexualized features” to their portraits, including “sultry poses and gigantic breasts.” Brandee Barker gave a perfect illustration:
Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face. pic.twitter.com/rUtRVRtRvG
— Brandee Barker (@brandee) December 3, 2022
Funnily enough, Lensa’s terms of service instruct users to only submit appropriate content containing “no nudes.” Still, the app adds overly sexualized features even to plain portraits of faces. “I’m desensitized enough to the horrors of technology that I decided to be my own lab rat,” Olivia writes in her article.
“I ran a few experiments: first, only BDSM and dungeon photos; next, my most feminine photos under the “male” gender option; later, selfies from academic conferences—all of which produced spectacularly sized breasts and full nudity.”
All of this leads us to two other issues: nonconsensual nudes of adults and children.
Nonconsensual nudes
Haje Jan Kamps over at TechCrunch confirmed Olivia’s observations in a test of their own. They uploaded two sets of images to Lensa: one based on 15 photos of a famous actor, and the other with the same 15 photos plus five photos of the same actor’s face, but Photoshopped onto topless models’ bodies.
“The second set, however, was a lot spicier than we were expecting. It turns out the AI takes those Photoshopped images as permission to go wild, and it appears it disables an NSFW filter. Out of the 100-image set, 11 were topless photos of higher quality (or, at least with higher stylistic consistency) than the poorly done edited topless photos the AI was given as input.”
Speaking with TechCrunch, the Prisma Labs team behind Lensa admitted that if you specifically provoke the AI into generating NSFW images, it might, but said that it is implementing filters to prevent this from happening accidentally.
“To enhance the work of Lensa, we are in the process of building the NSFW filter. It will effectively blur any images detected as such. It will remain at the user’s sole discretion if they wish to open or save such imagery.”
Of course, those who want to generate fake nude images of someone else will do it on purpose, so I don’t see how this solves the problem. Imagine someone trying to destroy a celebrity’s reputation, or your sociopathic ex-boyfriend wanting to harm you. All they need to create their “nudes” is an app and a few photos, and basically, anyone can generate nudes of anyone. It’s a terrifying thought, but it gets even scarier.
Child abuse content
After experimenting with photos of her grown-up self, Olivia embarked on a nightmare journey of feeding Lensa her childhood photos. She decided to test the app’s other restriction: “No kids, adults only.” And the results were terrifying.
“In some instances, the AI seemed to recognize my child’s body and mercifully neglected to add breasts. This was probably not a reflection of the technology’s personal ethics but of the patterns it identified in my photo; perhaps it perceived my flat chest as being that of an adult man. In other photos, the AI attached orbs to my chest that were distinct from clothing but also unlike the nude photos my other tests had produced.”
Olivia tried again, this time mixing her childhood photos and selfies. And this is when sh*t really hit the fan! The app returned “fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body.”
“Similar to my earlier tests that generated seductive looks and poses, this set produced a kind of coyness: a bare back, tousled hair, an avatar with my childlike face holding a leaf between her naked adult’s breasts.”
I can’t even begin to think where this could go. These nudes could make children subject to all sorts of harassment and abuse: from their peers, but also from adults. And all it takes is a few photos and a few bucks. It’s a chilling thought that it’s become so easy to create content like this. While I have a particularly soft spot for children, this isn’t where the problems with AI apps like Lensa end.
Racial bias
Another thing many female Lensa users noticed is that the app makes them look more Caucasian. A few artists I follow on Instagram complained about it, and Olivia Snow noticed it as well. “Nearly a dozen women of color told me that Lensa whitened their skin and anglicized their features,” she writes, “and one woman of Asian descent told me that in the photos ‘where I don’t look white they literally gave me ahegao face.’”
While artificial intelligence is, well, artificial, it’s still trained by humans. And humans give it their values and biases, which apparently happened with Lensa as well. Olivia cites scholars Ruha Benjamin and Safiya Noble, who explained that “machine-learning algorithms reproduce the cultural biases of both the engineers who code them and the consumers who use them as products.” And Lensa apparently sees female “beauty” as highly sexualized and white.
Copyright issues
Last but definitely not least is the set of potential copyright issues. And it’s not only apps like Lensa: any AI image generator can raise them. After all, this is why major stock sites like Getty and Unsplash ban AI-generated images from their platforms.
To generate your AI portraits, Lensa uses Stable Diffusion. Prisma Labs states that “Lensa learns to create portraits just as a human would – by learning different artistic styles.” But who are these styles copied from? That’s right, from real artists. Sydney-based artist Kim Leutwyler talked about this with The Guardian. She used Have I Been Trained to check whether her art was used to train AI models. There was an unpleasant surprise waiting for her: the dataset included almost every portrait she had ever painted, “every painting I have shared on the internet.”
“When I started seeing all of these Lensa app-generated portraits posted by some of my friends, even some other artists, I was instantly sceptical,” Leutwyler said. “Some of the work is distinctly recognisable to other artists’ work. They are calling it a new original work but some artists are having their exact style replicated exactly in brush strokes, colour, composition – techniques that take years and years to refine.”
“Companies like Lensa say they’re ‘bringing art to the masses’,” artist Karla Ortiz told The Guardian, “but really what they’re bringing is forgery, art theft [and] copying to the masses.”
Text-to-image generators and AI-generated art are here, and there’s not much we can do about it. We can embrace them and play with them, creating fun and quirky “artworks.” But unfortunately, we live in a nasty world filled with nasty people, and the uses of AI-generated imagery can go way beyond fun and artsy. Like all technology, it can be a nightmare if it ends up in the wrong hands – and this is what terrifies me the most.