An Overly Sexualized Picture is Worth a Thousand AI Think Pieces - Metaphors Are Lies

A popular program that creates AI-generated avatars has a tendency to produce nudes when you are a woman, especially a non-white woman:

When I tried the new viral AI avatar app Lensa, I was hoping to get results similar to some of my colleagues at MIT Technology Review. The digital retouching app was first launched in 2018 but has recently become wildly popular thanks to the addition of Magic Avatars, an AI-powered feature which generates digital portraits of people based on their selfies.

But while Lensa generated realistic yet flattering avatars for them—think astronauts, fierce warriors, and cool cover photos for electronic music albums— I got tons of nudes. Out of 100 avatars I generated, 16 were topless, and in another 14 it had put me in extremely skimpy clothes and overtly sexualized poses.

Lensa’s fetish for Asian women is so strong that I got female nudes and sexualized poses even when I directed the app to generate avatars of me as a male. 

The viral AI avatar app Lensa undressed me—without my consent | MIT Technology Review

The last bit was a nice touch, I thought. “I am a man! Nope, you look like an Asian female, so into the porn pictures you go!” Just a perfect distillation of where we are with so-called AI. No notes.

And to be clear, this is not the fault of the so-called AI. Because there is no such thing as artificial intelligence. Human beings, almost certainly overwhelmingly men in this case, wrote a bit of code to take some pictures and turn them into other pictures. They then trained that code on images they selected. But because the code was written to follow its initial rules and use patterns from the training data to generate remixed images, we have allowed its makers to pretend that the machine, the code, is somehow responsible, that it has some measure of intelligence.

Bullshit.

It is just code. It is an algorithm like any other, a set of rules and constraints ultimately coded by human beings. Human beings decided what training data was used. Human beings decided not to include anti-porn rules in the code. Human beings decided not to correct for biases against women and women of color. Not machines. People.

And when this thing or something like it is inevitably used to generate revenge porn or child pornography, we won’t hold the people who made it responsible. Oh no. Because we pretend that the internet is somehow special, a place where every product is speech and every line of code can be made into a quasi-person if we chant the magical phrase “AI” at it enough times, no harm will come to the irresponsible jackasses who released the broken product onto the world. Just their victims.

