Spawning, an organization launched in September to build tools that give artists ownership over how their work is used in AI training data, announced yesterday that Stability AI will honor artists’ requests to opt out of the training of Stable Diffusion 3, reportedly beginning in a few weeks. Artists can submit their Stable Diffusion opt-out requests at haveibeentrained.com.
According to Berlin-based Mat Dryhurst, who founded Spawning in September with his wife, musician Holly Herndon, the organization has been in talks for months with Stability AI and with LAION, the nonprofit behind the open-source dataset used to train Stable Diffusion, which Stability AI supports.
“They were immediately responsive,” Dryhurst told VentureBeat. “Both organizations have been transparent about data from the beginning, which I think may be why they get disproportionate scrutiny. We approach this issue as a problem to solve, and they have been supportive and full of ideas.”
Text-to-image AI has raised questions about who owns images
Since DALL-E 2 was released in April, creative industries have been buzzing with questions about who owns AI-generated art. Back in August, Bradford Newman, who leads the machine learning (ML) and artificial intelligence (AI) practice of global law firm Baker McKenzie from its Palo Alto office, said the answer to the question “Who owns DALL-E images?” is far from clear. And, he emphasized, legal fallout is inevitable.
When the open-source Stable Diffusion was released in August, there were even more questions about the model’s training. And just a couple of days ago, a new study (not yet peer-reviewed) raised fresh concerns: it identified cases where image-generating models, including Stable Diffusion, reproduced images — some of them copyrighted — from the public internet data on which they were trained.
Stable Diffusion art ownership issues
Emad Mostaque, founder and CEO of Stability AI, pointed out on Twitter that Spawning would also be offering opt-in requests — for artists who want their images included in the training data.
“Technically this is tags for LAION and coordinated around that,” he tweeted. “It’s actually quite difficult due to size (eg what if your image is on a news site?) Exploring other mechanisms for attribution etc, welcome constructive input.”
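Mostaque’s description — opt-outs implemented as tags coordinated with LAION — can be pictured as a filtering pass over the dataset’s image-URL metadata before training. The sketch below is purely illustrative: the field names, URLs, and the match-by-URL approach are assumptions for demonstration, not LAION’s or Spawning’s actual mechanism, and URL matching alone would miss copies hosted elsewhere (Mostaque’s “what if your image is on a news site?” problem).

```python
# Hypothetical sketch: filtering LAION-style metadata records against a set
# of opted-out image URLs before a training run. Field names ("url",
# "caption") and all URLs are invented for illustration.

OPT_OUT_URLS = {
    "https://example.com/artist/painting-01.jpg",
    "https://example.com/artist/painting-02.jpg",
}

def filter_records(records, opted_out):
    """Keep only records whose image URL is not on the opt-out list.

    Note: exact-URL matching cannot catch re-hosted copies of the same
    image, which is one reason enforcement at dataset scale is hard.
    """
    return [r for r in records if r.get("url") not in opted_out]

records = [
    {"url": "https://example.com/artist/painting-01.jpg", "caption": "oil painting"},
    {"url": "https://example.com/photo/cat.jpg", "caption": "a cat"},
]

kept = filter_records(records, OPT_OUT_URLS)
print(len(kept))  # 1 — the opted-out painting is dropped
```

An opt-in scheme, which Mostaque also floated, would simply invert the test: keep only records whose URLs appear on an allow list.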
Mostaque also seemed keen to emphasize that Stability AI is not taking this step for Stable Diffusion training because of anticipated legal or ethical reasons. “There is no legal reason for this we believe, but we think different model datasets will be interesting and would like to see output differentials,” he tweeted. “We think over time most folk will opt-in for richer experiences, just as we have seen them use artstation & others.”
Could Stable Diffusion set a precedent for AI art?
But whether or not Mostaque thinks the measures are legally necessary for Stable Diffusion (or whether artists think they go far enough), Dryhurst insists this is a “great opportunity to set a precedent for AI art moving forward.”
It makes sense for people to register their wishes once with Spawning, he explained — the organization says it is independent and does nothing else with the data — so that the information can then be served to various organizations and artists don’t have to play whack-a-mole.
“It is certainly a chaotic challenge, and we have no false confidence it will be possible to enforce things in a completist way, as technically anyone could scrape the web,” Dryhurst admitted. “We just feel that the majority of interactions are going to be with a few models from large organizations, and don’t see why those organizations wouldn’t honor requests we can provide them. In my opinion, it is doing them a favor so they can focus on the science.”