Nightshade is a new, free software tool that lets artists "poison" AI models that train on their work.
Nightshade was developed by computer scientists at the University of Chicago under Professor Ben Zhao.
It uses the popular open-source machine learning framework PyTorch to identify what is in a given image, then subtly alters the image at the pixel level so that other AI programs see something entirely different from what is actually there.
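Nightshade's exact method is not public in code form here, but the core mechanic resembles a targeted adversarial perturbation. The minimal PyTorch sketch below is an illustration of that general idea, not Nightshade itself: the stock `resnet18` classifier, the `cow.jpg` filename, and the `epsilon`/`steps` values are all assumptions chosen for the example. It nudges an image's pixels toward the "wallet" class while keeping the change nearly invisible to human eyes:

```python
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained classifier standing in for the "what's in this image" step.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
to_tensor = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
normalize = T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

image = to_tensor(Image.open("cow.jpg")).unsqueeze(0)  # hypothetical input file
target = torch.tensor([893])  # ImageNet class 893: "wallet"

delta = torch.zeros_like(image, requires_grad=True)    # the pixel-level change
epsilon, steps, lr = 8 / 255, 40, 0.01                 # illustrative values, not Nightshade's

for _ in range(steps):
    # Push the classifier's prediction toward the target class.
    loss = F.cross_entropy(model(normalize(image + delta)), target)
    loss.backward()
    with torch.no_grad():
        delta -= lr * delta.grad.sign()   # gradient step toward "wallet"
        delta.clamp_(-epsilon, epsilon)   # keep the change subtle to human eyes
        delta.grad.zero_()

poisoned = (image + delta).clamp(0, 1)    # still looks like a cow to a person
```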
An AI model trained on many such Nightshade-altered images would likely miscategorize objects for all of its users, even when working with images that were never altered.
As the team behind Nightshade explains, human eyes might perceive a subtly altered image of a cow in a green field as largely unchanged, but an AI model might see a large leather wallet lying in the grass.
An AI model trained on enough images of cows altered to look like wallets would start generating wallets instead of cows, even when a user asked it for a picture of a cow.
Artists who want to use Nightshade need a Mac with an Apple silicon chip (M1, M2, or M3) or a PC running Windows 10 or 11. The tool is available to download for both operating systems from the project's website.
While some artists have rushed to download Nightshade v1.0 and are already putting it to use, others online have objected, arguing that it amounts to a cyberattack on AI models and companies.
The Glaze/Nightshade team, for its part, denies pursuing destructive ends; it says its goal is to force AI model developers to pay artists to train on their data.
And the war has just begun!