He Quit Stability AI Over Training On Copyrighted Material. Will It Change Anything?


Troubled by the public stance Stability AI took on its use of copyrighted material to train its generative AI product, the company’s head of audio, Ed Newton-Rex, announced he was leaving the company. The move reignited a broader conversation surrounding the concept of “fair use” and the rampant use of content in AI models without the creator’s permission.

A good policy whenever any new generative AI model comes out is to ask: what’s the training data?

If the company isn’t telling you, it’s a good bet that it’s trained on copyrighted work without permission.

— Ed Newton-Rex (@ednewtonrex) November 22, 2023

“I’ve resigned from my role leading the Audio team at Stability AI, because I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use,’” Newton-Rex wrote on Twitter.

The response was mixed, with some praising Newton-Rex’s stance.

“Thank you for taking a stand and doing the right thing by leaving Stability,” Reid Southern wrote. “If we had more people like you working in machine learning and generative AI, maybe we wouldn’t be in this mess.”

However, others questioned his definition of copyright infringement.

“So, anyone who reads a book, listens to music, watches a movie, etc., that then inspires them to create is infringing on copyright?” Twitter user John Harvey wrote.

The issue, Newton-Rex explained, arose when Stability AI responded to a U.S. Copyright Office request for comment on generative AI. The AI developer submitted a 22-page statement arguing that training generative AI models is a socially beneficial use of existing content that is protected by fair use and furthers the objectives of copyright law.

While Stability was the focus of Newton-Rex’s post, he told Decrypt that the issue is larger than any one company.

“My objection here isn’t really against Stability because Stability takes the same approach that many other generative AI companies in the space take,” Newton-Rex said. “It’s really a cross-industry position that I object to. In effect, I was resigning from a whole group of companies who take the same approach.”

Copyright refers to the legal right of creators to control the use of their work. Fair use, on the other hand, permits limited use of copyrighted material for education, reviews, or research. In a perfect world, these concepts would lead to a balance between a creator’s ability to use and profit from their content and the interests of the public.

In October, a group of authors, including John Grisham and Game of Thrones creator George R.R. Martin, joined a lawsuit against ChatGPT creator OpenAI, claiming their works had been fed into the training data of the company's popular AI models. That same month, a similar lawsuit against Midjourney, DeviantArt, and Stability AI hit a roadblock when a federal judge ruled that the plaintiffs had not provided enough evidence to support their copyright infringement claims.

Newton-Rex said Stability AI’s audio model was trained on music licensed from the digital music library platform AudioSparx.

“[AudioSparx] has a really good collection of music, and we did a revenue share, the philosophy being: if our model does well, then they do well,” he explained. “That’s the idea, and that’s one way I think this can work.”

“I don’t think generative AI and the creative industries have to be enemies,” Newton-Rex continued. “They can work together.”

But while he is optimistic that creatives and AI developers can align, he acknowledged that artists have a right to be concerned about how text, music, and other media are used to train AI models.

Newton-Rex reiterated that Stability AI’s Stable Diffusion was trained on publicly available information. However, he noted it is common in the industry to use datasets like LAION-5B, which can include copyrighted material scraped from the internet.

“I think the more general approach across the industry, in image generation and in other modalities as well, certainly in text generation, seems to have been to take a pretty permissive view of what you can train on, which I don’t agree with,” he said.

Newton-Rex said he did not have a particular plan when he left Stability AI, but he believes the conversation around copyright and fair use is vital.

“I think this is a really important conversation to have, and people are talking about this copyright question because I think we’ve got to, and we have got to decide on this.”
