Overnight, the photo-editing app Lensa AI became massively successful after rolling out its Magic Avatar feature, which uses AI and 10 to 20 user-submitted photos to create AI-assisted portraits in a variety of styles.
However, artists already suspicious of AI image generators grew concerned when they noticed what looked like signatures scrawled in the corners of many of these AI portraits. Some artists contend that this could prove Lensa AI was stealing work that bore the watermarks or signatures artists use to deter theft.
An illustrator who goes by Lauryn Ipsum on Twitter pointed out this phenomenon in what is now a viral Tweet: “These are all Lensa portraits where the mangled remains of an artist’s signature is still visible,” Ipsum wrote, with attached pictures. “That’s the remains of the signature of one of the multiple artists it stole from.”
The signatures the AI created were all illegible and, according to Prisma Labs (the parent company of Lensa AI), do not constitute proof of theft.
“The notion of ‘remains of artists’ signatures’ is based on the flawed idea that neural networks might combine existing images. The actual process is different,” wrote Andrey Usoltsev, CEO of Prisma Labs, in an email to ARTnews.
He explained that while neural networks are trained on pre-existing images, once the training is done, the AI does not refer back to the vast dataset of images it was trained on. Rather, it has now learned how to mimic particular styles. According to Usoltsev, the AI has learned that a key characteristic of the category “painting” is a signature, so it makes one up.
“On this occasion, it mimics the paintings, the subset of the images that generally come with sign-offs. AI understands the sign-offs as an inherent style feature and imitates them,” wrote Usoltsev, who added that “The details pointed out don’t use any existing language and do not represent the particular artist’s signature.”
According to Usoltsev, this mimicry is not theft, as no particular artist had their signature distorted or, in Ipsum’s word, “mangled.”
However, the issue may not be that any one artist is being stolen from, but that, en masse, their work might have been used to train the tech that threatens to replace them.