Nightshade is a new, free software tool that allows artists to "poison" AI models that attempt to train on their work.
Nightshade was developed by computer scientists at the University of Chicago under the direction of Professor Ben Zhao.
It uses the popular open-source machine learning framework PyTorch to identify what is in a given image, then subtly alters the image at the pixel level so that other AI programs see something entirely different from what is actually there.
An AI model trained on many Nightshade-altered images would likely miscategorize objects for all users of that model, even in images that were never poisoned.
As the team behind Nightshade explains, the human eye might see a shaded image of a cow in a green field as largely unchanged, but an AI model might see a large leather wallet lying in the grass.
An AI model trained on "tampered" images of a cow made to resemble a wallet would begin generating wallets instead of cows, even when the user asked the model to produce a picture of a cow.
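To make the idea concrete, here is a toy sketch of how a small, bounded pixel perturbation can flip what a model "sees". This is an illustrative FGSM-style example, not Nightshade's actual algorithm; the linear scorer, weights, and pixel values below are all invented for the demonstration.

```python
# Toy illustration (NOT Nightshade's real method): a linear scorer
# stands in for an image classifier. score > 0 means "cow",
# score < 0 means "wallet".

def score(weights, pixels):
    """Classifier decision value: dot product of weights and pixels."""
    return sum(w * p for w, p in zip(weights, pixels))

def poison(weights, pixels, epsilon=0.1):
    """Nudge each pixel by at most epsilon in the direction that
    pushes the score toward the 'wallet' class (sign of the weight)."""
    return [p - epsilon * (1 if w > 0 else -1)
            for w, p in zip(weights, pixels)]

# A tiny 4-"pixel" image that the classifier confidently calls "cow".
weights = [0.8, -0.2, 0.5, 0.1]
image = [0.1, 0.2, 0.15, 0.05]

poisoned = poison(weights, image, epsilon=0.1)

print(score(weights, image))     # positive: reads as "cow"
print(score(weights, poisoned))  # negative: now reads as "wallet"
# No pixel moved by more than 0.1, so the image looks nearly identical:
print(max(abs(a - b) for a, b in zip(image, poisoned)))
```

The point of the sketch is the asymmetry Nightshade exploits: a change small enough to be invisible to a person can be large enough, in the directions a model is sensitive to, to change the label entirely.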
Artists who want to use Nightshade need a Mac with an Apple silicon chip (M1, M2, or M3) or a PC running Windows 10 or 11. The tool can be downloaded for both operating systems from here.
While some artists have rushed to download Nightshade v1.0 and are already using it, some netizens have complained about it, suggesting that it amounts to a cyberattack on AI models and the companies behind them.
The Glaze/Nightshade team, for its part, denies that it is pursuing destructive ends, saying its goal is to force AI model developers to pay artists for training on their data.
And the war has just begun!