
I just discovered a site that I wish did not exist – smashorpass.ai*. This site lets men rate AI-generated images of women. If you are familiar with Mark Zuckerberg’s infamous “hot-or-not” website, FaceMash, you will not be surprised by the similarities. (If you are not familiar with it, watch The Social Network.)
Like Zuckerberg’s original site, Smashorpass.ai invites users to rate images of women using “Smash” or “Pass” buttons. The key difference is that all the “women” featured on the site are AI-generated. As a result, the images exhibit the biases commonly found in AI-generated photos of women.
Exaggerated features
Predictably, the AI generates photos of women with exaggerated physical features. You can count on unrealistically large breasts, huge Disney-like eyes, and airbrushed skin. Some images even depict headless female figures, taking objectification to its extreme.
The site’s developer, Emmet Halm, apparently introduced it as a “generative AI party game” without further explanation. Halm’s tweet promoting the project generated significant attention, with over 500 retweets and 1,500 likes. In a subsequent tweet, he claimed that the top three images on the site had approximately 16,000 “smashes.”
AI experts have expressed dismay and disbelief at the project. Sasha Luccioni, an AI researcher at HuggingFace, told Motherboard it was “truly disheartening” that technology is still being used for objectification and clicks.
Backlash to Smashorpass
In response to the sexist app, developer Rona Wang created a nearly identical parody website called ‘Friend or Foe’. Instead of rating women’s attractiveness, it rates men on how likely they are to be dangerous predators of women.
Wang’s parody plays on the oft-cited contrast between men’s and women’s fears when meeting a date. Men typically worry that their date will be less attractive than they hoped. Women, on the other hand, worry about their personal safety.
Sexist and racist biases in AI systems are well documented. Unfortunately, some developers still ignore (if not embrace) these biases in harmful ways. If you are looking for more information about the over-sexualisation of women in AI-generated images, this link is a great resource.
It seems that some men never quite grow up. They will always see women as hyper-sexualised objects, whether those women exist in real life or only as figments of their fantasies.
You may think this is all rather harmless. However, studies link the objectification of women, the rising number of incels (involuntary celibates), and increases in violence against women.
* You may have noticed that I did not link to the site, only named it. This is intentional: I will not give any Google juice to sites like this.