This 'poison pill' tool breaks AI systems that steal data, saving artists' work
2023-10-26
AI image generators have been trained on the creative styles of real artists without their permission. A new tool aims to safeguard artists' work by 'poisoning' the machines that try to mimic it.
A team from the University of Chicago has developed a new tool called Nightshade to protect images from being used by unauthorized AI programs. Nightshade makes subtle changes to an image's pixels that confuse AI image generators trained on it, causing them to malfunction: a model might interpret a protected photo of a dog as a cat, or a photo of a car as a cow. The purpose of Nightshade is not to break AI models for its own sake, but to discourage the use of unauthorized data and to promote training on licensed content. A rough sketch of the general idea appears at the end of this article.

Text-to-image AI generators like Midjourney and Stable Diffusion are often trained on text and images scraped from the internet and other sources. Artists Karla Ortiz and Kelly McKernan have filed a lawsuit against the companies behind Midjourney and Stable Diffusion for copyright infringement and right-of-publicity violations. They claim that their artwork was used to train the AI systems, allowing others to profit from their work.

To combat this issue, Ortiz and McKernan collaborated with the University of Chicago team on a related project called Glaze, which disrupts style mimicry without causing the AI model itself to malfunction.

While these tools give artists some protection, they cannot retroactively shield the billions of pieces of content already posted online. And if the tools are ever defeated, creators will lose that protection. So far, no major AI image generator has managed to circumvent Nightshade, and the team behind it plans to update the tool to evade any detection techniques that may arise in the future.
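To make the mechanism concrete, here is a toy Python sketch of pixel-level poisoning: it nudges each pixel of a source image a small, bounded amount toward a "target" concept image, keeping the change nearly invisible to a human viewer. This is not Nightshade's actual algorithm, which according to its developers relies on optimized, model-aware perturbations; the function name, file paths, and epsilon budget below are illustrative assumptions.

    # Toy illustration of pixel-level "poisoning": blend a faint trace
    # of a target concept into a source image while keeping the edit
    # nearly imperceptible. NOT Nightshade's real method, which uses
    # optimized, model-aware perturbations.
    import numpy as np
    from PIL import Image

    def poison_image(source_path: str, target_path: str, out_path: str,
                     epsilon: float = 8.0) -> None:
        """Shift each pixel of the source at most `epsilon` (out of 255)
        toward the corresponding pixel of the target image."""
        src = np.asarray(Image.open(source_path).convert("RGB"),
                         dtype=np.float32)
        tgt = Image.open(target_path).convert("RGB").resize(
            (src.shape[1], src.shape[0]))  # match source dimensions
        tgt = np.asarray(tgt, dtype=np.float32)

        # Per-pixel step toward the target, clipped to the epsilon
        # budget so the change stays hard to see.
        delta = np.clip(tgt - src, -epsilon, epsilon)
        poisoned = np.clip(src + delta, 0, 255).astype(np.uint8)

        Image.fromarray(poisoned).save(out_path)

    # Example (hypothetical files): nudge a dog photo toward "cat"
    # features, echoing the dog-as-cat example above.
    # poison_image("dog.jpg", "cat.jpg", "dog_poisoned.png")

The key design idea, as the researchers describe it, is that the perturbations are tuned to how models represent images internally, which is why changes too small for humans to notice can still corrupt what a model learns during training.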