
Nightshade and Glaze: the weapons for artists who want to fight artificial intelligence

The topic is more than hot, it is literally scorching: on one side stand the artists, on the other generative artificial intelligence (AI), heedless of any form of copyright. The former want to preserve their art, while the AI assimilates and rakes in all the (reproducible) knowledge it finds online.

The question is simple: what can artists do to defend their rights and creativity?

The answer comes from the world of technology.

Artificial intelligence and copyright: two weapons to protect artists

A team of researchers at the University of Chicago has developed two tools that promise to help artists thwart AI companies that collect their works without consent (and without compensation). They are Nightshade and Glaze, two tools that modify image pixels in a way that is invisible to the human eye but highly significant to machines: a sort of invisible paint, which a person does not see but which, for an artificial intelligence, is like an ink stain that makes the work impossible to assimilate.

In particular, Nightshade is a data poisoning tool: it alters training data in a way that compromises the functioning of AI models. Artists can use Nightshade to add subtle changes to their images before uploading them online, making their works literally unusable by AI-based learning models.
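To make the idea concrete, here is a minimal conceptual sketch of this kind of perturbation-based poisoning. It is not the actual Nightshade algorithm (which optimizes against the generative model's own encoder, with its own perceptual constraints); it only illustrates the general mechanism the article describes: nudge an image's feature representation toward an unrelated "target" concept while keeping the per-pixel change below a visibility budget. The feature extractor (a stock ResNet-18 stand-in), the file paths, and the `eps` budget are all assumptions of this illustration.

```python
# Conceptual sketch of data poisoning via an imperceptible perturbation.
# NOT the real Nightshade algorithm: a generic ResNet-18 stands in for
# the text-to-image model's encoder, and all parameters are illustrative.
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms.functional as TF
from torchvision.transforms import Normalize
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained feature extractor; drop the classifier head to get features.
extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
extractor.fc = torch.nn.Identity()
extractor.eval().to(device)

# Standard ImageNet normalization expected by the pretrained network.
norm = Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

def poison(image_path, target_path, eps=4 / 255, steps=200, lr=1e-2):
    """Return a copy of `image_path` whose features drift toward those of
    `target_path`, with the per-pixel change capped at `eps` (invisible)."""
    x = TF.to_tensor(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    t = TF.to_tensor(Image.open(target_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        target_feat = extractor(norm(t))  # features of the decoy concept

    delta = torch.zeros_like(x, requires_grad=True)  # the perturbation
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feat = extractor(norm((x + delta).clamp(0, 1)))
        loss = F.mse_loss(feat, target_feat)  # pull features toward decoy
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change below visibility
    return (x + delta).detach().clamp(0, 1).squeeze(0).cpu()
```

A model that later trains on images altered this way associates the artist's visual content with the wrong features, which is what "poisons" its learned concepts.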

Glaze, instead, is a tool that allows artists to mask their personal style to prevent it from being copied by AI. Glaze works similarly to Nightshade, but instead of altering the content of images, it alters how their style is perceived by machine learning models. In this way, artists can protect their identity and artistic expression.
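Under the same sketch, the conceptual difference between the two tools reduces to the choice of feature target: for a Glaze-like style cloak, the target can be the artist's own image re-rendered in a different style (by any off-the-shelf style-transfer model), so that models training on the result learn the wrong style. The file names below are hypothetical, and this usage reuses the illustrative `poison()` helper above, not the real tool.

```python
# Hypothetical usage for style masking: the feature target is the same
# artwork restyled, rather than an unrelated concept.
cloaked = poison("my_artwork.png", "my_artwork_restyled.png", eps=4 / 255)
TF.to_pil_image(cloaked).save("my_artwork_cloaked.png")
```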

The experiment and the project

The team of researchers tested the effectiveness of the two tools on several generative AI models, including the popular Stable Diffusion. They found that a few hundred images poisoned by Nightshade are enough to "confuse" the AI. They also found that Glaze makes it difficult for the AI to recognize and exploit original works.

The two technologies were presented at Usenix, a cybersecurity conference. The researchers made Nightshade available as open-source software, so anyone can modify it and create their own versions. The team also intends to integrate Nightshade into Glaze, and artists will be able to choose whether or not to use the data poisoning tool.

The goal is to create a deterrent against AI companies that violate copyright, and to give artists back control over their works.
