Melissa Heikkilä / MIT Technology Review:

A look at Nightshade and Glaze, tools made by researchers at UChicago that help artists “poison” their work to confuse or break AI models that later train on it. Nightshade alters training data in ways that could cause serious damage to image-generating AI models.