A new class action lawsuit in San Francisco federal court has accused software giant Salesforce of building its XGen AI models on a pirated library of books and then scrubbing references to those sources once questions arose.
Filed on Wednesday by authors Molly Tanzer and Jennifer Gilmore, the suit alleges ongoing infringement under the Copyright Act, saying Salesforce “continues to do so by continuing to store, copy, use, and process the datasets containing copies of Plaintiffs’ … copyrighted books.”
The complaint says Salesforce, Inc. “pirated hundreds of thousands of copyrighted books to develop its XGen series of large language models,” relying on the “notorious RedPajama and The Pile datasets,” which include a books corpus known as Books3, a collection of over 196,000 books copied from the private tracker Bibliotik.
The filing says Salesforce initially listed “RedPajama-Books” among its training sources when it launched XGen in June 2023, with a company engineer linking GitHub users directly to both datasets.
By September, however, Salesforce allegedly deleted those references from its website and replaced them with vague descriptions of “natural language data” drawn from “publicly available sources.”
Hugging Face, the platform hosting Books3, removed the dataset the following month, citing copyright complaints, the lawsuit says.
The lawsuit alleges that Salesforce used The Pile to train its CodeGen models in 2022, then commercialized the technology through its Agentforce AI platform, including the XGen-Sales model released in October 2024.
Two months after XGen’s launch, the complaint says, Salesforce scrubbed its disclosures, deleting charts and references to “RedPajama-Books” and replacing them with vague language about a “mixture of publicly available data”; by December 2023, the company was claiming its models used a “legally compliant dataset,” with no mention of RedPajama.
Ishita Sharma, managing partner at Fathom Legal, told Decrypt that authors must “prove real financial harm, not just that their books were used for training,” noting how Judge Vince Chhabria recently dismissed similar claims against Meta, ruling that “simply claiming ‘our work was used’ isn’t enough.”
Recent rulings favored OpenAI and Anthropic in similar cases, with judges finding authors failed to prove market harm, though one criticized Anthropic for maintaining “a permanent library of pirated books.”
“Using public datasets like RedPajama or The Pile doesn’t automatically erase willful infringement,” Sharma said, adding, “If they knew or ignored that copyrighted works were included, courts could still find reckless disregard.”
“Unless the AI can reproduce parts of the original work, the model weights themselves aren’t considered copyright infringement,” she added.
The complaint cites statements from Salesforce CEO Marc Benioff, who told a Bloomberg interviewer in January 2024 that AI companies “ripped off” training data and that “all the training data has been stolen.”
The authors seek class certification for all U.S. copyright holders whose works were used since October 2022, demanding statutory damages, destruction of infringing copies, profit disgorgement, a willful infringement declaration, and attorneys’ fees.
The FSNN News Room is the voice of our in-house journalists, editors, and researchers. We deliver timely, unbiased reporting at the crossroads of finance, cryptocurrency, and global politics, providing clear, fact-driven analysis free from agendas.
