Channel: slashCAM Forum

AI • Re: AI discussion thread - Pro/Con?

AI poisoning tool Nightshade received 250,000 downloads in 5 days: ‘beyond anything we imagined’

Nightshade, a new, free downloadable tool created by computer science researchers at the University of Chicago designed to be used by artists to disrupt AI models scraping and training on their artworks without consent, has received 250,000 downloads in the first five days of its release.

“Nightshade hit 250K downloads in 5 days since release,” wrote the leader of the project, Ben Zhao, a professor of computer science, in an email to VentureBeat, later adding, “I expected it to be extremely high enthusiasm. But I still underestimated it…The response is simply beyond anything we imagined.”

It’s a strong start for the free tool, and it shows a robust appetite among some artists to protect their work from being used to train AI without consent. According to the Bureau of Labor Statistics, there are over 2.67 million artists in the U.S. alone, but Zhao told VentureBeat that Nightshade’s user base is likely even broader.

Nightshade seeks to “poison” generative AI image models by altering artworks posted to the web, or “shading” them on a pixel level, so that they appear to a machine learning algorithm to contain entirely different content — a purse instead of a cow, say. Once trained on a few “shaded” images scraped from the web, an AI model can begin to generate erroneous imagery in response to user prompts.
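The pixel-level “shading” idea can be illustrated with a toy sketch. To be clear, this is not Nightshade’s actual algorithm — the real tool optimizes the perturbation so that feature extractors misread the image’s content — it only demonstrates the general principle of a change that is bounded enough to be near-invisible to humans while still altering every pixel a scraper ingests:

```python
import numpy as np

def shade_image(pixels: np.ndarray, epsilon: float = 8.0, seed: int = 0) -> np.ndarray:
    """Toy illustration of a small, bounded per-pixel perturbation.

    NOT Nightshade's method: this adds random noise, whereas Nightshade
    computes an optimized perturbation targeting model feature spaces.
    `epsilon` bounds the change per channel on a 0-255 scale.
    """
    rng = np.random.default_rng(seed)
    # Perturbation drawn from [-epsilon, epsilon] for every channel value.
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    # Clip back into the valid 0-255 range before converting to uint8.
    shaded = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return shaded.astype(np.uint8)

# A human viewer sees essentially the same image; a model trained on
# many adversarially perturbed images can learn mismatched associations.
original = np.full((4, 4, 3), 128, dtype=np.uint8)  # flat mid-gray test image
shaded = shade_image(original)
```

A real attack replaces the random noise with an optimization loop that pushes the image’s features toward a different concept while keeping the perceptual change under a similar bound.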

On the Nightshade project page, Zhao and his colleagues — Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng — stated they developed and released the tool to “increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”
https://venturebeat.com/ai/ai-poisoning ... -imagined/
