Everyone’s talking about Lensa: The viral app that turns you into a superhero also has racist and sexist biases

Lensa AI is one of those apps that everyone has been talking about on social media lately. With it, users can generate their own ‘magic avatars’ using artificial intelligence. However, there is a downside: while men get animated versions of their faces as warriors, astronauts, superhero inventors or politicians, women get hypersexualized drawings of their faces, especially if they are women with Asian features.

Lensa’s ‘magic avatars’ have racist and sexist biases

Journalist Melissa Heikkilä complained in an MIT Technology Review article: “My avatars were cartoonishly vulgar, while my male colleagues became astronauts, explorers and inventors.” She also says the AI replicates this problem even more when the person is a woman of Asian descent.

Some results from Melissa Heikkilä on Lensa AI. (MIT Technology Review)

Heikkilä is of Asian descent, and Lensa seemed to register only that trait in her selfies: “I got pictures of generic Asian women.” Most of the results were nude, clearly hypersexualized avatars, some appearing to be crying. She ran the same test on colleagues: a white colleague also received some nude and cleavage-baring avatars, but fewer; another colleague, also of Asian descent, got results similar to hers.


Lensa’s problem is replicated in other AIs

Although Lensa’s case has gone viral, it is not the first AI to show racist and sexist bias. These systems are typically trained on human-generated data, so they end up replicating the problems that already exist in our society.

Lensa uses an open-source AI model called Stable Diffusion to generate its avatars. This in turn was trained on LAION-5B, a huge open-source dataset of images scraped from the Internet.

A 2021 study examined datasets similar to those used to create Stable Diffusion and found that the training data is riddled with racist stereotypes, pornography and even explicit images of rape. The researchers could audit the LAION dataset because it is open source, but models built on closed datasets (such as DALL-E or Google’s Imagen) show similar patterns.


Beyond Lensa’s biases

Lensa is an artificial intelligence-powered app for creating custom cartoon images, launched in 2018. Its most viral feature right now is ‘Magic Avatars’, a tool that creates pictures from the user’s face, albeit with the controversies already mentioned.

According to Sensor Tower, in the first five days of December more than 4 million people downloaded the app and spent a total of $8 million on it. You may be one of the users who installed and tested it, but did you read the privacy policy before accepting it?

David Leslie, director of research for ethics and responsible innovation at the Alan Turing Institute and professor at Queen Mary University of London, reminds us that “we should always be careful about how our biometric data is used, for any purpose.”

In Lensa’s case, the selfies users upload are deleted once the avatars are generated, says Andrey Usoltsev, CEO and co-founder of Prisma Labs, the company behind the app.

Even so, other similar apps have compromised their users’ privacy, often in ways their privacy policies and terms of use disclosed but users never read. For this reason, on every platform, and especially those that collect your biometric data, it is best to check all the fine print to prevent your information from being sold to third parties.
