Nightshade, a free tool to poison AI data scraping, is now available to try


Nightshade was first introduced in October 2023 as a tool designed to challenge large generative AI models. Created by researchers at the University of Chicago, Nightshade is an "offensive" tool that can protect artists' and creators' work by "poisoning" an image, making it unsuitable for AI training. The program is...

from TechSpot https://ift.tt/qzkQgon