Lensa AI is one of those apps everyone has been talking about on social media lately. It lets users generate their own ‘magic avatars’ using artificial intelligence. However, there is a downside: while men get animated versions of their faces as warriors, astronauts, superheroes, inventors or politicians, women get hypersexualized drawings of their faces, especially if they are women with Asian features.
Lensa’s avatars have racist and sexist biases
Journalist Melissa Heikkilä complained in an MIT Technology Review article: “My avatars were cartoonishly vulgar, while my male colleagues became astronauts, explorers and inventors.” She adds that the AI reproduces this problem even more markedly when the person is a woman of Asian descent.
Heikkilä is of Asian descent, and Lensa clearly picked up on that trait from her selfies: “I got pictures of generic Asian women.” Most of the results were nude avatars, clearly hypersexualized, some appearing to be crying. She ran the same test with two colleagues: a white woman, whose results included some nude and cleavage-baring avatars, but fewer of them; and another woman, also of Asian descent, whose results were similar to her own.
Lensa’s problem is replicated in other AIs
Although Lensa’s case has gone viral, it is not the first AI to show racist and sexist biases. These systems are trained on human-generated data, so they end up replicating problems that already exist in our society.
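A toy sketch can make this mechanism concrete. The “training data” below is entirely hypothetical (the labels and counts are invented for illustration, not taken from any real dataset): a model that simply samples in proportion to what it has seen will reproduce whatever skew the data contains.

```python
from collections import Counter
import random

# Hypothetical caption-style training data with a deliberate skew:
# "man" is mostly paired with "astronaut", "woman" mostly with "model".
captions = (["man, astronaut"] * 8 + ["man, model"] * 2 +
            ["woman, astronaut"] * 2 + ["woman, model"] * 8)

def sample_role(subject, data, rng):
    """Pick a role for `subject` with probability proportional to
    how often that role co-occurs with the subject in the data."""
    roles = Counter(c.split(", ")[1] for c in data if c.startswith(subject))
    return rng.choices(list(roles.keys()), weights=list(roles.values()), k=1)[0]

rng = random.Random(0)
samples = [sample_role("woman", captions, rng) for _ in range(1000)]
# The output distribution mirrors the 80/20 skew in the training data.
print(Counter(samples))
```

Nothing in the sampler itself is “biased”; it faithfully learns the statistics it is given, which is exactly why skewed web-scraped data produces skewed generations.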
Lensa uses an open-source AI model called Stable Diffusion to generate its avatars, which in turn was trained on LAION-5B, a huge open-source dataset of images scraped from the internet.
A 2021 study looked at datasets similar to those used to create Stable Diffusion and found that the training data was riddled with racist stereotypes, pornography and even explicit images of rape. The researchers could examine the LAION dataset because it is open source, but AIs whose training data is not open (such as DALL-E or Google’s Imagen) repeat a similar pattern.
Beyond Lensa’s biases
Lensa is an artificial intelligence-powered app for creating custom cartoon images, launched in 2018. Its most viral feature right now is ‘Magic Avatars’, a tool that creates images from the user’s face, though not without the controversy already described.
David Leslie, director of research for ethics and responsible innovation at the Alan Turing Institute and professor at Queen Mary University of London, recalls that “we should always be careful about our biometric data being used for any purpose.”
In Lensa’s case, the selfies users upload are deleted once the avatars have been generated, says Andrey Usoltsev, CEO and co-founder of Prisma Labs, the company behind the app.