Adobe Firefly arrived in March 2023 and was extended to Photoshop and Google Bard shortly after. The company bragged about not using unlicensed photos for training, but is that as good as it sounds? A group of Adobe Stock creators says no! They recently spoke up about how they felt after learning that Adobe used their content to train its AI model – which, to make matters worse, is aimed at replacing them.
When Adobe Firefly was launched, the company said that it would “focus on images and text effects and is designed to generate content safe for commercial use.” They emphasized that the AI model was trained on Adobe Stock images, openly licensed content, and public domain content where the copyright has expired. Wait, Adobe Stock images? This is where we get to the core of the issue.
The issues with Adobe Firefly
AI-generated images replacing photographers
An issue that only makes the previous one worse is that AI-generated images are replacing real photos on stock websites. While using Adobe Stock photos to train its algorithm, the platform also allows creators to upload AI-generated content and make money off of it. And for now, it's the only major platform that does, as the others have banned AI-generated images. I guess not many platforms want to take the risk while such content is still in the murky areas of copyright law.
On the other hand, Adobe Stock welcomes AI content and enables creators to license it. Paired with training its algorithm on Adobe Stock photos, it seems as if the company all of a sudden favors AI creators over real photographers.
Commercial use of AI-generated images
As I mentioned, when Adobe first launched Firefly, the company said it would focus on content that's safe for commercial use. This is another potential issue Adobe Stock creators could face. It reminded me of a lawsuit several artists filed against Stable Diffusion and Midjourney. They, too, found out that their artwork had been used to train algorithms (surprise, surprise), and they reflected on what further AI generation could lead to.
The artists’ attorney Matthew Butterick noted at the time that the resulting images (“collages”) from AI image generators “may or may not outwardly resemble the training images.” Okay, they’re not exactly collages, but I thought he was onto something. There is a small possibility of AI-generated art resembling previously made artwork – and the same goes for photos. If not exact images, it could closely resemble someone’s distinctive style. With the right prompts and artistic style applied, you might actually generate an image that’s too similar to someone else’s work.
“It’s legal, but it’s not ethical”
UK-based creator and digital artist Dean Samed spoke with VentureBeat about Adobe using images from its stock platform to train the algorithm. “They’re using our IP to create content that will compete with us in the marketplace,” he said. “Even though they may legally be able to do that, because we all signed the terms of service, I don’t think it is either ethical or fair.”
Furthermore, Adobe Stock contributors weren’t notified that Adobe would use their photos for training purposes. “I don’t recall receiving an email or notification that said things are changing, and that they would be updating the terms of service,” Samed added.
“Back then, no one was thinking about AI,” said Eric Urquhart, who joined Adobe Stock in 2012 and has contributed thousands of images. “You just keep uploading your images and you get your residuals every month and life goes on — then all of a sudden, you find out that they trained their AI on your images and on everybody’s images that they don’t own. And they’re calling it ‘ethical’ AI.”
Like Samed, Urquhart agrees that Adobe didn't do anything illegal and that the training was indeed within the company's rights. However, it was unethical not to notify Adobe Stock artists about the Firefly AI training in advance and offer them a way to opt out.
Rob Dobi, a Connecticut-based photographer, commented:
“I’m probably not adding anything new because they will probably still try to train their AI off my new stuff. But is there a point in removing my old stuff, because [the model] has already been trained? I don’t know. Will my stuff remain in an algorithm if I remove it? I don’t know. Adobe doesn’t answer any questions.”
VentureBeat got a response from Adobe regarding the artists’ concerns. The company said that its goal was “to build generative AI in a way that enables creators to monetize their talents.” An Adobe spokesperson said that it was important to note that Firefly is still in beta.
“During this phase, we are actively engaging the community at large through direct conversations, online platforms like Discord and other channels, to ensure what we are building is informed and driven by the community,” the Adobe spokesperson said. They added that Adobe remains “committed” to compensating creators and that they would “provide more specifics on creator compensation once these offerings are generally available.”
Hi there, thanks for your question. Adobe’s use of Stock content is covered by our Stock Contributor license agreement. We are developing a compensation model for Stock contributors and will share details once Firefly is out of beta. ^BT
— Adobe (@Adobe) March 21, 2023
Here at DIYP, we had a heated discussion about Adobe Firefly’s “ethical” AI when it was first launched. Some of my colleagues argued that it was okay: Adobe has a license over the images, and it’s all there in the contributor agreement, so the training was legal. I stubbornly stuck with the SJW attitude I sometimes have: it might be legal, but it was still not ethical. Now I see that I’m not the only one who holds this viewpoint. I’d like to hear from you – do you think it’s okay for Adobe to train its algorithms on Adobe Stock images? And was it okay not to offer contributors a way to opt out?