When poisoning is the AI-ntidote: How artists are fighting back against artificial intelligence

By Anesha George
Jun 21, 2024 07:06 PM IST

New tools are allowing artists to alter their work at the pixel level, to confuse bots and misdirect algorithms. How long can this work? What’s next?

Ideas of ownership only go back about 12,000 years, but ever since they emerged, they have had to be reinforced and guarded.

At the University of Chicago, researchers have been testing the impact of poisoned datasets on AI-led image generators. Their findings state that even 50 poisoned images can alter the results generated by an AI model. Mix in about 300 poisoned images, they say, and Cubism may blend with anime, or a purse may start to resemble a toaster.

It is perhaps fitting, then, that a new tech tool that helps artists protect their work from use by AI programs draws its name from an ancient boundary stone.

In Babylonia, during the Kassite dynasty (16th to 12th century BCE), landowners could mark the edges of their plots with engraved stones. Called kudurru, these slabs alerted passersby that they were about to trespass on private property; but they did more. They threatened to unleash a curse on those who chose to cross over anyway.

All these years later, a tool called Kudurru, launched in October by Spawning AI, is helping creators keep their work out of AI-training datasets. It does this by scanning popular datasets for the IP addresses of scrapers, and blocking those addresses so that they cannot access protected work. Beyond simply rejecting scrapers, it lets its client, the artist, choose a decoy image for the scraper to carry off instead. These decoys typically include memes and prank videos.
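Spawning hasn't published Kudurru's internals, but the block-and-serve-a-decoy idea can be sketched in a few lines of Python. Everything below — the IP addresses, file names and port — is an invented placeholder, not the real service.

```python
# A minimal sketch of the blocklist-plus-decoy idea behind Kudurru.
# All names here are illustrative placeholders, not Spawning's code.
from http.server import BaseHTTPRequestHandler, HTTPServer

KNOWN_SCRAPER_IPS = {"203.0.113.7", "198.51.100.42"}  # placeholder addresses
DECOY_IMAGE = "decoy_meme.jpg"  # what a blocked scraper carries off
REAL_IMAGE = "artwork.jpg"      # what everyone else sees

class GuardedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Check the requester's IP against the scraper blocklist.
        client_ip = self.client_address[0]
        path = DECOY_IMAGE if client_ip in KNOWN_SCRAPER_IPS else REAL_IMAGE
        try:
            with open(path, "rb") as f:
                body = f.read()
        except FileNotFoundError:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "image/jpeg")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), GuardedHandler).serve_forever()
```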

This is what it’s come to, in the evolving fight to protect private, sensitive and copyrighted material from roving bots.

The hope, for users of Kudurru and other paid and free services like it, is that the disruptions — the term for it is “poisoning of datasets” — will so skew the training of generative AI programs that the bots will eventually stop trying to get through, and will simply leave such firewall-protected content alone.

Attempts at such poisoning can be traced to 2020, when a team of computer scientists at the University of Chicago created the software tool Fawkes (its name inspired by the Guy Fawkes mask used by online hacktivists). The tool makes hidden changes to an individual's publicly available photographs, as a form of personal privacy protection.

The pixel-level changes are invisible to the human eye but act as “Trojan horses… to any facial recognition models”, the website states. In place of a clear face, such a model sees only a highly distorted version of the image.
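Fawkes’s actual optimisation targets facial-recognition feature extractors; as a rough illustration of the general trick — a pixel-bounded perturbation that drags an image’s machine-facing embedding toward a decoy — here is a sketch in PyTorch. An off-the-shelf ResNet stands in for the face-recognition model, and the budget and step counts are illustrative, not Fawkes’s real values.

```python
# Conceptual sketch of feature-space cloaking (not Fawkes's actual code).
# A pretrained ResNet stands in for the facial-recognition extractor that
# Fawkes targets; input normalisation is omitted for brevity.
import torch
import torchvision.models as models

extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
extractor.fc = torch.nn.Identity()  # use penultimate features as the embedding
extractor.eval()

def cloak(image, target, eps=0.03, steps=50, lr=0.01):
    """Nudge `image` (1x3xHxW, values in [0,1]) so its embedding moves
    toward `target`'s, keeping every pixel change within +/- eps."""
    delta = torch.zeros_like(image, requires_grad=True)
    with torch.no_grad():
        target_feat = extractor(target)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        feat = extractor((image + delta).clamp(0, 1))
        loss = torch.nn.functional.mse_loss(feat, target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change invisible to humans
    return (image + delta).detach().clamp(0, 1)

# Example: cloak a random "photo" toward a random "decoy identity".
photo, decoy = torch.rand(1, 3, 224, 224), torch.rand(1, 3, 224, 224)
cloaked = cloak(photo, decoy)
```

To a person, `cloaked` looks like `photo`; to the extractor, it looks like `decoy` — which is the whole point.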

The Chicago team didn’t stop there. By 2022, the scientists, led by Ben Zhao and Heather Zheng, had extended this method of image-cloaking in ways that also stymied generative models such as Midjourney and Stable Diffusion, to protect images that had been accessed without the artist’s consent (and therefore without credit or compensation).

They have been expanding their offerings ever since. Glaze, launched in March 2023, tackles a different aspect of the problem: mimicry of style. By introducing a set of minimal changes to a digital artwork, it cloaks or obscures brush strokes, facial details and colour palettes to make vital elements of a work’s unique style inaccessible to a computer program.

As a result, a realistic charcoal portrait may register to a bot as an abstract artwork instead.
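Glaze’s actual objective and feature extractor differ from anything shown here; as a loose analogue, though, the cloaking loop sketched above can be reused with the identity loss swapped for a style loss — Gram matrices of VGG features, a classic stand-in for palette and brushwork:

```python
# Style-loss variant for the cloak() loop above (conceptual, not Glaze's code).
import torch
import torchvision.models as models

vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].eval()

def gram(feat):
    # Channel-correlation (Gram) matrix: a standard summary of "style".
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_loss(image, decoy_style_image):
    # Distance between the artwork's style statistics and a decoy style;
    # minimising it drags brushwork and palette off-target for a bot.
    return torch.nn.functional.mse_loss(
        gram(vgg(image)), gram(vgg(decoy_style_image))
    )
```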

Nightshade, launched this year, goes on the offensive: it alters how an image reads to an AI model so that the image actively disrupts training, rather than merely cloaking its own details.

A shaded image of a cow lazing in a green field, for instance, will appear to the algorithm like a large leather purse on grass. “Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo,” the website states.
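As a conceptual sketch of how one such poison sample might be built — reusing the `cloak()` loop from the Fawkes example above, with all file names invented — the pixels stay close to the cow photo while the machine-facing features are pulled toward an anchor image of the target concept:

```python
# Conceptual sketch of a Nightshade-style poison pair (not the actual tool).
from PIL import Image
import torchvision.transforms.functional as TF

def load_image(path, size=224):
    # Hypothetical helper: read a file into a 1x3xHxW tensor in [0, 1].
    img = Image.open(path).convert("RGB").resize((size, size))
    return TF.to_tensor(img).unsqueeze(0)

cow = load_image("cow_in_field.jpg")         # what humans will see
handbag = load_image("leather_handbag.jpg")  # anchor for the poison concept

poisoned = cloak(cow, handbag)  # looks like a cow, "reads" like a handbag
# Scraped alongside its honest caption, the sample quietly teaches a
# model that "cow" maps to handbag-like features.
training_pair = (poisoned, "a cow lazing in a green field")
```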

Glaze and Nightshade can be used in conjunction for best results, and are currently free for artists, funded as they are by research grants and donations from organisations such as the US National Science Foundation, the US Defense Advanced Research Projects Agency (DARPA), Amazon Web Services (AWS), and the enterprise AI company C3 AI.

How long will it be before the bots catch on, and the visual noise required to fool them begins to affect the artwork in more visible ways? For now, cloaking works precisely because of fundamental weaknesses in how AI models are designed today, the Chicago researchers have said.

But already, Glaze is struggling to keep its changes unnoticeable across artistic styles. And Kudurru, which identifies about 50 scraper IP addresses an hour, can only block them temporarily, because the addresses keep changing.

The most realistic hope is that the damage done to training models will be discouragement enough that AI companies choose to use open-source imagery instead, or pay artists for their work. It is a dream worth fighting for.

As Zhao put it in an interview with NPR: “I would like to bring about a world where AI has limits… guardrails… ethical boundaries that are enforced by tools.” Because the laws, well, they’re going to take far longer to catch up.
