How Do You Protect Your Art From AI?
Diving into something a bit techy today, but bear with me: it's all about the art world clashing with AI in a wild way. There's a new tool on the block called Nightshade, and it's throwing a wrench into the works of those image-generating AI models we've all been seeing lately.
I want to make it clear that I support AI art, but I also believe artists should have the ability to protect their art from being part of the AI training process if they choose not to participate. Nightshade appears to be a tool that does just that.
Picture this: artists can now tweak their masterpieces at the pixel level, invisible tweaks that don't mess with the art for us mere mortals but turn AI training sets into a hot mess. Upload your art with these tweaks, and if an AI tries to learn from it, boom! The model goes haywire—think dogs turning into cats, cars morphing into cows. Pure chaos. MIT Technology Review got the scoop on this, showing off how Nightshade is shaking up the scene.
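To give a rough feel for the "invisible tweaks" idea, here's a toy sketch in Python. This is NOT Nightshade's actual algorithm (the real tool computes carefully optimized, targeted perturbations); it just illustrates the basic premise that you can change every pixel of an image while keeping each change too small for a human to notice. The function name and the epsilon threshold are my own illustrative choices.

```python
import numpy as np

def perturb(image, epsilon=2, seed=0):
    """Add a small, bounded random change to every pixel.

    Each pixel moves by at most +/- epsilon on a 0-255 scale,
    which is well below what the eye can pick up. Nightshade
    uses optimized perturbations rather than random noise, but
    the "invisible to humans" constraint is the same idea.
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

# A fake 4x4 grayscale "artwork" of mid-gray pixels
art = np.full((4, 4), 128, dtype=np.uint8)
poisoned = perturb(art)

# Every pixel changed by at most epsilon, so it looks identical to us
max_change = int(np.abs(poisoned.astype(int) - art.astype(int)).max())
print(max_change)
```

The hard part, and what makes Nightshade clever, is choosing those tiny changes so they systematically mislead a model during training rather than just adding harmless noise.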
Why go through all this trouble? It's all about giving artists a fighting chance against big AI firms scooping up their creations without asking. On Feb 9th I wrote about art being stolen from artists for Midjourney AI training; Nightshade's like a secret weapon, making those AI models trip over their own feet. And it's not just about causing a bit of mischief. Ben Zhao and his crew from the University of Chicago are hoping to shift the power back to the artists, making sure their rights and works are respected.
But wait, there's more. They've also cooked up Glaze, a tool that lets artists disguise their unique style from these data-hungry AI companies. It's a bit like Nightshade, tweaking images in ways that throw off AI without changing how the art looks to us. They're planning to merge these two tools, giving artists the choice to fight back with data poisoning or keep their styles under wraps.
Nightshade's going open source. That means it will be free and anyone can get their hands on it, tweak it, and make their own versions. The more people jump on this, the stronger it gets. With billions of images out there, slipping a few poisoned apples into the batch could really stir up trouble for AI models.
This isn't just a small hiccup for AI. Artists worried about their work being swiped can now use Glaze to disguise their art, then hit it with Nightshade. When AI companies scrape the web for data, they end up with a bunch of these booby-trapped images that send their models off the rails.
The results? Utterly bonkers. Dogs with too many legs, handbags turning into toasters—you name it. And fixing this mess? Good luck. It's like trying to find a needle in a haystack, except the haystack is the size of a mountain.
Shameless plug 🤣 My own AI art for sale$$$
Junfeng Yang, from Columbia University, thinks Nightshade could be a game-changer, pushing AI companies to play nice with artists. Right now, artists can ask to be left out of these AI models, but it's a hassle and doesn't really change the power dynamic.
So, there you have it, folks. In the clash of art vs. AI, artists are arming themselves with some pretty clever tools. Nightshade isn't just about causing a stir; it's about taking a stand, and it's got the art world buzzing with possibilities.