
Many artists have complained about tech giants using their work to train AI models. Some have even filed lawsuits against OpenAI, Stability AI, and other companies for using their work without permission; writer George R. R. Martin, for instance, is among the authors suing OpenAI.
OpenAI is now offering artists a chance to have their creative works removed from the training data of DALL-E 3, the company’s newest image generator. But there’s a catch: the process is so detailed, complicated, and time-consuming that artists have called it “enraging.”
The problematic opt-out process
This is the first time OpenAI has allowed creators to remove their work from training data, which is hardly surprising given that the lawsuits keep coming. However, the company seems to have found a way to ensure that not many people actually go down this route.
The thing is, artists or rights holders must submit each piece of work separately, with a detailed description, if they want it excluded. That’s cumbersome enough for individuals. But imagine large entities like museums: they hold thousands of artworks, so the process becomes almost unmanageable for them.
Naturally, this complicated opt-out method has sparked debate. The artistic community suspects it was intentionally designed to preserve as much data as possible for OpenAI’s AI models. Honestly, I tend to agree: such an overly cumbersome process is bound to deter many people from opting out.
Another issue is that, essentially, what’s done is done. Granted opt-out requests apply only to future training data. Any artwork submitted for removal will already have been used to train DALL-E 3, not to mention OpenAI’s previous models.
The possible consequences
This situation has ignited conversations around online content and the safety of posting creative works online. Greg Madhere, an IT consultant and photography hobbyist, wrote on Threads that he wanted to start sharing his photos online. “Where is it safe to even post online anymore?” he wondered, considering that tech giants scrape online content to train AI models.
Artist Toby Bartlett shared his frustration on Threads, calling the process “enraging.” “Now artists are going to have to almost ruin their work with watermarks of epic proportions in the hopes that their work doesn’t get used… if that even works!” Bartlett added.
OpenAI’s response
Commenting on the new opt-out method, an OpenAI spokesperson said:
“We’ve heard from artists and creative content owners that they don’t always want their content to be used for training and so we’re offering them the ability to opt out their images from future training of models”
The company says it also has a potential solution for those with extensive creative collections or a “high volume of images from specific URLs”: using robots.txt to block OpenAI’s web crawler, GPTBot, from accessing those URLs. However, this raises another set of concerns. The effectiveness of the approach is questionable, as it requires artists to have access to, and the ability to modify, the configuration of every website hosting their images. Yup, another nearly impossible task!
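For context, OpenAI documents “GPTBot” as the user-agent string its crawler honors in robots.txt. A minimal sketch of what such a file might look like (the /images/ path is purely illustrative, and the file must live at the root of each site hosting the work):

```
# Block OpenAI's crawler from a hypothetical images directory
User-agent: GPTBot
Disallow: /images/

# Or block it from the entire site
# User-agent: GPTBot
# Disallow: /
```

This only works for sites the artist controls, and it only affects crawlers that choose to respect robots.txt; it does nothing for copies of the images hosted elsewhere.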
This unfolding scenario heightens the already high tension between artists and tech companies over intellectual property rights and the unauthorized use of creative content. Copyright protection in the AI realm is still a gray area, and artists’ concerns and anger are clearly growing. This is yet another case highlighting the need for clearer, more manageable ways for artists to control how their work is used.