Arr, the AI ha' angered the scurvy sea dogs! Artists may now defend against scallywag plagiarism with this mighty weapon!
2023-10-26
Arrr! Ye scurvy dogs at the University of Chicago have brewed a devilish concoction called Nightshade, a weapon to poison those fancy generative AI models! Avast, the landlubbers shall feel the sting of their own creation!
A team of researchers at the University of Chicago has developed a tool called Nightshade to help online artists combat AI companies. Nightshade inserts poisonous pixels into digital art, disrupting the way generative AIs interpret the images. These poisoned data samples can manipulate AI models into learning the wrong thing, such as seeing a dog as a cat or a car as a cow.

In testing, the team fed poisoned content to the AI model Stable Diffusion and prompted it to create images of dogs. After a certain number of samples, the AI generated misshapen dogs with six legs, and eventually the dogs became full-fledged cats. Nightshade's effect also bleeds into tangentially related concepts: poisoning the word "dog" likewise corrupts associated concepts such as puppy, husky, and wolf.
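Nightshade's actual perturbations are subtle, near-imperceptible pixel changes, and its internals are not described here. As a loose, toy illustration of the underlying idea, that mislabeled ("poisoned") training samples can drag a model's learned concept toward a different one, here is a minimal sketch using class centroids; all names, data, and thresholds below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "concepts": "dog" features cluster near (0, 0),
# "cat" features cluster near (5, 5).
dogs = rng.normal(loc=0.0, scale=0.5, size=(50, 2))
cats = rng.normal(loc=5.0, scale=0.5, size=(50, 2))

X_clean = np.vstack([dogs, cats])
y_clean = np.array([0] * 50 + [1] * 50)  # 0 = dog, 1 = cat

def concept_prototype(X, y, label):
    """Stand-in for what the model 'learns' about a concept:
    the mean of the training samples carrying that label."""
    return X[y == label].mean(axis=0)

# Clean training: the learned "dog" sits in dog territory, near (0, 0).
clean_dog = concept_prototype(X_clean, y_clean, 0)

# Poisoning: add cat-like samples mislabeled as "dog". Analogous to
# Nightshade images that look normal but teach the model the wrong concept.
poison = rng.normal(loc=5.0, scale=0.5, size=(200, 2))
X_poisoned = np.vstack([X_clean, poison])
y_poisoned = np.concatenate([y_clean, np.zeros(200, dtype=int)])

# After poisoning, the learned "dog" has been dragged toward cat territory:
# asked for a dog, this toy model now produces something cat-like.
poisoned_dog = concept_prototype(X_poisoned, y_poisoned, 0)
print(clean_dog)     # near (0, 0)
print(poisoned_dog)  # dragged toward (5, 5)
```

The sketch also hints at why cleanup is hard: each poison sample looks like an ordinary labeled point, so a defender must identify and remove them individually rather than filter them out in bulk.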
Removing the toxic pixels is challenging, as developers would have to find and delete each corrupted sample individually, a task made even harder by the fact that these AI models are trained on billions of data samples. Nightshade is still in its early stages and has been submitted for peer review. The team plans to release Nightshade for public use as an optional feature of its existing tool Glaze, and also hopes to make Nightshade open source.
Nightshade specifically targets static images, and the team has no current plans to develop similar tools for video or literature. Still, the positive reception suggests that tools like it could push AI developers to respect artists' rights more, and perhaps even pay out royalties. Overall, Nightshade offers online artists a practical weapon against AI companies, one that even these scurvy sea dogs can appreciate.