Why is this AI app undressing female users without their consent?

Lensa has been spitting out hypersexualized images of women and lightening skin tones.

If your social feed is currently inundated with stylized portraits, blame Lensa.

The four-year-old AI app turns selfies into illustrations, starting at $3.99 for 50 images. Since introducing the “magic avatars” feature in November, the app has seen 17m+ downloads.

Does everyone like their portraits?

Def not. Women have noticed a disturbing trend: Even when they upload above-the-shoulder headshots, or ask the app to depict them as men, they get hypersexualized — or straight-up pornographic — images in return.

People of color have also reported that the app altered their features, lightening skin tones and eye colors.

Like most problems, humans are ultimately to blame:

  • The app uses Stable Diffusion, an AI model that generates images from text.
  • Stable Diffusion is trained on LAION-5B, an open-source data set of images scraped from across the internet.

And what does the internet have a lot of? Degrading photos of women.

If AI really is the future…

… we gotta fix this. Lensa isn’t alone in pulling from the dark corners of the web: Google’s Imagen and OpenAI’s DALL-E rely on similarly scraped training data.

That means AI could come preloaded with all the human biases we hoped it would eliminate.

Plus, this isn’t the first time we’ve had to reckon with sexism in AI. Right, Siri and Alexa?
