Scientists have developed a way for photographers and artists to fight back against AI harvesting their work without permission. The tool, dubbed Nightshade, works by subtly altering an image’s pixels so the image poisons any model trained on it. In bulk, these poisoned samples can render an AI model useless.
University of Chicago professor Ben Zhao told MIT Technology Review that Nightshade would poison any model that uses images to train AI. This poisoned data can then damage future images created by the AI generator, causing it to “break in chaotic and unpredictable ways.”
The team introduced poisoned data samples into Stability AI’s Stable Diffusion XL (SDXL). This caused the model to interpret “car” as “cow” and “dog” as “cat.” Even art styles became distorted, with a “cartoon” prompt yielding results reminiscent of 19th-century impressionist art.
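The poisoning idea can be sketched in miniature. This is a hypothetical simplification, not Nightshade’s actual algorithm: the real tool optimizes perturbations against a text-to-image model’s feature extractor. But the core constraint it describes, a small per-pixel change budget that keeps the edit invisible to humans while steering the image toward a mismatched concept, looks roughly like this (all function and variable names are illustrative):

```python
def poison_pixels(image, target, epsilon=0.03):
    """Toy data-poisoning sketch (hypothetical, not Nightshade's method).

    Nudge each pixel of `image` (values in 0..1) toward `target`, the
    pixel statistics of a mismatched concept, but cap every change at
    `epsilon` so the picture still looks unchanged to a human viewer.
    """
    poisoned = []
    for px, tx in zip(image, target):
        # Bound the per-pixel change to the imperceptible budget.
        delta = max(-epsilon, min(epsilon, tx - px))
        # Keep the result a valid pixel value.
        poisoned.append(min(1.0, max(0.0, px + delta)))
    return poisoned

# A "dog" image nudged toward "cat"-like pixel statistics:
dog_pixels = [0.5, 0.5, 0.5]
cat_stats = [1.0, 0.0, 0.5]
poisoned = poison_pixels(dog_pixels, cat_stats, epsilon=0.1)
```

A model trained on many such (poisoned image, original caption) pairs gradually learns the wrong association, which is the kind of drift the SDXL experiment above exhibits: enough “car” images nudged toward “cow” features and the model’s notion of “car” breaks.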
Artists’ styles protected
Nightshade also helps protect individual artists’ styles. For example, when asked to generate art in the style of a particular well-known artist, the poisoned model produces images that bear far less resemblance to that artist’s work.
To have a significant impact, hundreds or even thousands of poisoned images are needed, depending on the AI model’s size. This tool could encourage AI developers to think twice before using scraped training data from the internet.
Stability AI, one of the companies facing backlash from artists, emphasized its commitment to equitable representation and bias reduction. However, Nightshade poses a significant threat to companies using artists’ work without consent.
Part of the Glaze tool
Nightshade is set to be integrated into the Glaze tool, which allows artists to mask their styles to prevent AI scraping. Furthermore, Nightshade will be open-source, allowing other developers to create similar protective tools.
While some methods exist to protect images from direct manipulation by AI, Nightshade stands out as a powerful weapon against unauthorized use. Tools like Nightshade could become essential safeguards against AI data scraping.
Until now, the only recourse for individual artists against these AI behemoths has been class action lawsuits, and even giants such as Getty have so far had little success in court.
This tool could be a game-changer for artists in their ongoing battle to protect their work from unauthorized AI usage, and a vital addition to their arsenal against image theft by AI companies.