Nightshade is designed to be an offensive tool that poisons images, causing AI models to act in unpredictable ways if they use enough poisoned images as training data.
Artists are getting access to Nightshade, a new tool to deter AI creators from using their work without their consent.
The tool was announced a few months ago amid concerns that artists' work is being used to train AI models without their consent. Nightshade is now publicly available.
The tool aims to make images unsuitable for AI models by turning them into “poison” samples. These samples would cause the AI to act in unpredictable ways – provided it uses enough poisoned images as training data.
“Used responsibly, Nightshade can help deter model trainers who disregard copyrights, opt-out lists, and do-not-scrape/robots.txt directives,” the Nightshade team said in a blogpost. “It does not rely on the kindness of model trainers, but instead associates a small incremental price on each piece of data scraped and trained without authorisation.
“Nightshade’s goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”
Some of the team behind Nightshade also designed Glaze, an app that adds subtle changes to artworks to interfere with an AI model’s ability to learn an artist’s style.
The researchers said Glaze is a defensive tool to prevent style mimicry, while Nightshade is designed to be an “offensive tool” that disrupts AI training data but makes only minimal visible changes to the original image.
“While human eyes see a shaded image that is largely unchanged from the original, the AI model sees a dramatically different composition in the image,” the team said. “For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass.
“Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper.”
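The mechanism the team describes is a form of perturbation-based data poisoning: an image’s pixels are nudged so that a model’s feature extractor “sees” a different target concept, while the overall change stays within a small pixel budget. The snippet below is only a minimal sketch of that general idea, not Nightshade’s actual algorithm; the PyTorch resnet18 feature extractor, the mean-squared-error loss and the parameter values are placeholder assumptions for illustration.

# Minimal sketch of perturbation-based image poisoning (illustrative only;
# this is not Nightshade's algorithm). A small perturbation "delta" is
# optimised so the feature extractor's output for the image moves towards
# the features of a decoy concept, while pixel changes stay within epsilon.
import torch
import torchvision.models as models

# Placeholder feature extractor: Nightshade targets text-to-image models,
# but resnet18 is used here purely so the sketch is self-contained.
extractor = models.resnet18(weights=None).eval()

def poison(image, decoy, epsilon=0.03, steps=100, lr=0.01):
    with torch.no_grad():
        target_feat = extractor(decoy)              # features of the decoy concept
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        loss = torch.nn.functional.mse_loss(extractor(image + delta), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)         # keep the change visually subtle
    return (image + delta).clamp(0, 1).detach()

# Usage: a "cow" image perturbed towards "handbag" features (random tensors
# stand in for real images in this sketch).
cow = torch.rand(1, 3, 224, 224)
handbag = torch.rand(1, 3, 224, 224)
shaded_cow = poison(cow, handbag)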
While Nightshade promises to be another tool to help artists protect themselves from the rapid rise of AI, it has limitations. The team warns that it can cause more noticeable changes to art with flat colours or smooth backgrounds, but a “low intensity” setting is available to mitigate this issue.
The team also noted that Nightshade is unlikely to stay “future proof” for long. Meanwhile, Matthew Guzdial, assistant professor at the University of Alberta, claimed Nightshade only works on certain AI models and that millions of images would have to be “poisoned” to have a significant impact on models trained on LAION (Large-scale Artificial Intelligence Open Network) datasets.
As AI and art continue to collide, an episode of For Tech’s Sake discussed with Beta festival director Aisling Murray what it means to work at this intersection. Meanwhile, a report last November claimed many large language models are trained on copyrighted content from news organisations.