Companies steal art and people stopped caring a long time ago

by: Artur Dziedziczak

December 12, 2022

I got sick a couple of days ago at a company party, and that led me to do some research into the current state of AI tools built to generate images.

If you read my microblog, you might already know that I’m really critical of companies using AI. In this blog post, I’ll try to give you a small update on the current situation with Lensa AI [1] and other tools that allow you to generate images.


After DALL·E [2] and Lensa AI were released, many artists started asking themselves questions like: "Why do the images generated by this stupid AI look almost exactly like my work? Will I lose my job soon? Why don’t I get money when the AI clearly copied my style?! Or maybe it copied my work directly? That signature there looks close to mine!"

All these questions need answers, and to get them we need to dive a bit deeper into how this AI works and, most importantly, where its training data comes from.

Current cases

First, I’ll summarize some articles that describe the current situation with Lensa AI.

Article [6] argues that Lensa AI sexualizes and racializes photos of its users, and claims that Lensa as a company uses

"… a legal loophole to train their AI using hundreds of thousands of copyright images and artwork, by claiming to be a non-profit …"

The article references an Instagram account that made this claim, but since I don’t have an Instagram account, I can’t verify whether this information is true.

The next article mentions an Australian artist who accuses Lensa of stealing her content and calls for stricter copyright laws.

Lensa’s response was:

"Developer of Lensa denies allegations, saying software learns to create portraits just as a human would – by learning different artistic styles"

It’s worth noting that this article mentions that Lensa uses the LAION-5B [3] dataset for its AI. The artist who claims her work was stolen used a tool [4] that allows you to search whether your art was used to train AI models. This tool also indexes the LAION-5B dataset.

The most important quote from this article is:

"Once the training is finished, AI does not refer to the original images but applies what it has learned about the styles to the new image," they said.

It suggests that after the learning phase, the AI only applies a learned "style". This quote from the spokesperson continues:

"In a similar way to a human being is capable of learning and self-training some elementary art principles by observing art, exploring imagery online and learning about artists and ultimately attempting to create something based on these aggregated skills," the spokesperson said.

I’ll come back to this later.

The third article [7] suggests that Lensa AI leaves artists’ signatures on some of its output. Later, Andrey Usoltsev, CEO of Prisma Labs, wrote:

“The notion of ‘remains of artists’ signatures’ is based on the flawed idea that neural networks might combine existing images. The actual process is different,”

He also mentions that the AI learned that some styles characteristically include signatures, so it simply makes one up; this made-up signature cannot be the same as that of any artist whose work was used for training.

The last article [8] I want to bring up covers the general problems with the Lensa app. It also mentions the signature issue and brings up an interesting point of view: AI should only be a tool in the hands of an artist, not a replacement. In general, I think it’s the best article so far. It touches on multiple issues and has quotes from sources and experts in different domains.

Some facts

So, first some facts and clarifications.

No, it has not been proven that the leftover signatures come from artists, at least at the time of writing this article.

Yes, the AI made by Lensa does not refer back to images made by artists once training is finished.

From what I can see, LAION-5B uses data from 2014–2021 [5] that was later labeled by LAION for machine-learning purposes, and Lensa AI did not break any law by using the content of Common Crawl and LAION commercially. It seems to be fully legal.

Lastly, comparing how AI learns to how human beings learn should only be taken as a metaphor. AI and humans have too many differences, and saying things like "AI learns just like us" is a huge oversimplification that pisses off artists. To give you an example, imagine a human artist who copied someone’s style so closely that they also replicated that artist’s signature.

What can artists do?

Well, sadly, not much for now. Everything made by Lensa AI seems to be completely legal, and LAION and Common Crawl have the right to create labeled datasets from publicly available data. But I think it’s all legal only because of legislative gaps and the lack of a proper structure that would prevent commercial companies from benefiting from open datasets.

Examples to consider

Let’s say you make a website with your own content: photos of artwork you made on paper, each with a description that allows users to buy the physical painting. Now imagine there is a tool online that takes your images and teaches an AI your style of painting. Should this be legal?

For another example, imagine someone takes a photo of your artwork in a gallery and later posts it on social media. That social network then uses your painting to train an AI to create art in a style similar to yours. Should this be legal?

Lastly, imagine another artist who now uses AI to generate art in the same style as yours, with no skills at all, and claims that he or she used AI as a tool. Should this be legal?

We really need to push governments to regulate this area one way or another. I don’t think a single person can regulate it; the issue is complex and requires a lot of collaboration between artists, developers, researchers, politicians, and big tech companies. We cannot give up, though. If you are an artist who feels it’s not fair for your art to be used as input for AI, please FIGHT FOR YOUR RIGHTS, even if you don’t have them now. Think about the women who, a couple of hundred years ago, did not have the right to vote. Just because something is not regulated today does not mean it cannot be changed.

Common solution

This part is mostly for "us", the consumers of art. Feel free to use OpenAI, Lensa AI, or other image-generation tools, but don’t let them replace artists. Today I watched an interesting video by a Polish tech journalist. He mentioned that images generated by AI can look good, but mostly they are just average in their beauty. Generating yourself a portrait with AI is like going to Ikea and buying some photo art: it looks good, but it’s certainly not something you would show off to your friends.


To conclude: companies don’t really steal art from artists, and we as art consumers stopped caring about the value of art a long time ago. But both of these things can change, and it depends only on us how we regulate one and learn to appreciate the other.

As always, my words are negative, but I’m positive that change will happen and that we will resolve the current issues for the greater good.


[1] “Lensa - Prisma Labs.” Accessed: Dec. 12, 2022. [Online]. Available:

[2] “DALL·E 2.” Accessed: Dec. 12, 2022. [Online]. Available:

[3] “LAION.” Accessed: Dec. 12, 2022. [Online]. Available:

[4] “Have I Been Trained?” Accessed: Dec. 12, 2022. [Online]. Available:

[5] “Common Crawl.” Accessed: Dec. 12, 2022. [Online]. Available:

[6] A. & E. Desk, “Is AI Stealing from Artists?” The Daily Star, Dec. 08, 2022, Accessed: Dec. 12, 2022. [Online]. Available:

[7] S. E.-D. Mattei, “Artists Voice Concerns Over The Signatures In Viral Lensa AI Portraits.”, Dec. 09, 2022, Accessed: Dec. 12, 2022. [Online]. Available:

[8] C. Mello-Klein, “The AI Portrait App Lensa Has Gone Viral, but It Might Be More Problematic than You Think.” News @ Northeastern, Dec. 09, 2022, Accessed: Dec. 12, 2022. [Online]. Available: