The colorful AI self-portraits from the hit app Lensa have taken social media by storm. A clever algorithm is behind them. But it is not without problems.
The colorful pictures have been impossible to avoid in recent days: whether on Instagram or as a WhatsApp profile picture, countless social media accounts are suddenly adorned with colorful paintings and drawings of their owners. Visually they may be reminiscent of old masters or comic characters, but behind them is the smartphone app Lensa. The hype around the automatically generated self-portraits, however, also exposes the problems of artificial intelligence.
The principle is very simple: after installing Lensa on a smartphone, you feed the app a few selected pictures of a person. Ten photos are enough. Within minutes, the app then spits out dozens of AI-generated paintings of that person in a wide variety of compositions, looks and art styles, which can then easily be shared on social networks.
Huge hype about AI paintings
And that is exactly what countless users of the app have done in recent weeks. A look at the app store charts shows how popular Lensa is: where TikTok or apps from Google and Facebook's parent company Meta usually sit at the top, Lensa has clearly dominated the field for two weeks. And this despite the fact that the app is not free to use.
Only the installation is free. Creating 100 portraits costs just under four euros, but only if you are quick: there is an initial free trial week, and forgetting to cancel it gets expensive. The annual subscription that then kicks in costs 49 euros, and generating images still costs extra. No wonder other, free AI painting apps have climbed the app store charts alongside Lensa.
This is how Lensa works
The app has actually existed for a long time, but previously it could only be used to edit pictures, for example to retouch blemishes. The hype was triggered a few weeks ago by the introduction of the so-called "Magic Avatars", which portray people as fantasy creatures, science-fiction characters or pop-art portraits.
To do this, Lensa relies on the AI model Stable Diffusion. Trained on millions of paintings, drawings and graphics, it has learned to imitate countless art styles. Once you have uploaded your photos, the app analyzes the face visible in them and uses the learned art styles to construct a newly generated painting. Each painting is created from scratch, so it is unique. However, because the underlying training data is the same, clear patterns can still be seen across the images of different people.
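Lensa's actual pipeline is closed source, but the process described above can be roughly sketched with Hugging Face's open `diffusers` library, which drives Stable Diffusion. In a Lensa-style workflow, the model is first fine-tuned on the user's roughly ten photos (for example with a DreamBooth-style method) so that a placeholder token refers to that person's face; afterwards, prompts combining that token with an art style produce the stylized portraits. The model directory, the `"sks person"` token, the prompt wording and the helper function below are illustrative assumptions, not Lensa's real code.

```python
# Illustrative sketch only -- not Lensa's actual implementation.
# Assumes a Stable Diffusion checkpoint has already been fine-tuned
# on ~10 photos of the user so that "sks person" denotes their face.

def build_avatar_prompt(style: str, subject_token: str = "sks person") -> str:
    """Compose a text prompt asking for the subject rendered in a given art style."""
    return f"portrait of {subject_token}, {style}, highly detailed, digital painting"

def generate_avatars(styles, model_dir="./finetuned-stable-diffusion"):
    """Generate one avatar per style. Heavy: needs a GPU and a downloaded model."""
    from diffusers import StableDiffusionPipeline  # pip install diffusers
    pipe = StableDiffusionPipeline.from_pretrained(model_dir)
    return [pipe(build_avatar_prompt(style)).images[0] for style in styles]

if __name__ == "__main__":
    # Print the prompts that would drive the "Magic Avatars" styles.
    for style in ["fantasy oil painting", "pop art", "science fiction concept art"]:
        print(build_avatar_prompt(style))
```

Because every user's prompts draw on the same fine-tuned base model and similar style phrases, recurring visual patterns across different people's avatars are to be expected.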
Unwanted nude pictures
These patterns have put Lensa's operator, Prisma, under pressure to explain itself. The depiction of men and women is not only modeled on the heroic poses of video game characters or film heroes; the app also frequently and clearly sexualizes the people portrayed. "Lensa AI turned me into a fictitious template," is how "Futurezone" author Barbara Wimmer summed it up. In the AI images, Lensa had depicted her lasciviously and sometimes very lightly dressed, even though her source photos showed her in everyday clothes such as a hoodie. According to her, there were no sexy photos among them.
Wimmer is not alone. Twitter user Chapin Langenheim even complained that one of the images generated by the AI showed her completely naked, although she was clothed in all of her source photos. Because she had taken a nude photo on the same day, she feared the app might also be drawing on other images stored on her device.
Sexism allegations against the AI
The data set used by Stable Diffusion is likely to blame. Because many depictions of men and women on the internet are sexually charged, the AI treats such depictions as normal or even desirable, and then transfers these clichés to the images it generates. What might please some people is bitter for others. "Is it just me, or are these AI selfie generators perpetuating existing misogyny?" human rights activist Brandee Barker asked on Twitter. She had only uploaded pictures of her face, but the AI paintings created from them showed a lot of cleavage.
Images created by the tech portal "Techcrunch" are even more troubling. To probe the app for weaknesses, its reporters crudely pasted the faces of prominent actresses onto nude photos and uploaded them together with ordinary portraits of the celebrities. The result: the AI apparently took the nude pictures as a guide and spat out much higher-quality variants of the naked actresses. Of course, this does not only work with celebrities: in theory, anyone could be turned into automatically generated nude photos. And without their consent.
Confronted by Techcrunch, Prisma blames Stable Diffusion's training data. The model was trained "with an unfiltered set of data from the internet," Prisma boss Andrey Usoltsev explained to the platform. The app would therefore also create salacious material if the source images allowed it. The company says it wants to install filters to prevent accidental nudity. Usoltsev sees the sexualized depictions themselves as unavoidable, however: they reflect an "existing tendency of mankind" that shows up in the pictures. You have to accept that, he emphasizes. "At least we do."
Sources: Futurezone, Techcrunch