A federal judge in San Francisco ruled late on Monday that Anthropic's use of books without permission to train its artificial intelligence system was legal under U.S. copyright law.
U.S. District Judge William Alsup, siding with the technology companies on a pivotal question for the AI industry, said Anthropic made "fair use" of books written by authors Andrea Bartz, Charles Graeber and Kirk Wallace Johnson to train its large language model.
However, Alsup also said Anthropic's copying and storage of more than 7 million pirated books in a "central library" infringed the authors' copyrights and was not fair use. The judge ordered a trial in December to determine how much Anthropic owes for the infringement.
U.S. copyright law says that willful infringement can justify statutory damages of up to $150,000 per work.
An Anthropic spokesperson said the company was pleased the court recognized that its AI training was "transformative" and "consistent with copyright's purpose in enabling creativity and fostering scientific progress."
The authors filed the proposed class action against Anthropic last year, arguing that the company, which is backed by Amazon and Alphabet, used pirated versions of their books without permission or compensation to teach Claude to respond to human prompts.
The proposed class action is one of several lawsuits brought by authors, news outlets and other copyright owners against companies including OpenAI, Microsoft and Meta Platforms over their AI training.
The doctrine of fair use allows the use of copyrighted works without the copyright owner's permission in some circumstances.
Fair use is a key legal defense for the tech companies, and Alsup's decision is the first to address it in the context of artificial intelligence.
AI companies argue that their systems make fair use of copyrighted material to create new, transformative content, and that being forced to pay copyright holders for their work could hobble the growing AI industry.
Anthropic told the court that it made fair use of the books and that U.S. copyright law "not only allows, but encourages" its AI training because it promotes human creativity. The company said its system copied the books to "study Plaintiffs' writing, extract uncopyrightable information from it, and use what it learned to create revolutionary technology."
The copyright owners say that AI companies are unlawfully copying their work to generate competing content that threatens their livelihoods.
Alsup agreed with Anthropic on Monday that its training was "exceedingly transformative."
"Like any reader aspiring to be a writer, Anthropic's LLMs trained upon works not to race ahead and replicate or supplant them, but to turn a hard corner and create something different," Alsup said.
However, Alsup also said Anthropic violated the authors' rights by saving pirated copies of their books as part of a "central library of all the books in the world" that would not necessarily be used for AI training.
Anthropic and other prominent AI companies, including OpenAI and Meta, have been accused of downloading digital copies of millions of pirated books to train their systems.
Anthropic had told the court in a filing that the source of its books was irrelevant to fair use.
Alsup said on Monday: "This order doubts that any accused infringer could ever meet its burden of explaining why downloading source copies from pirate sites that it could have purchased or otherwise accessed lawfully was itself reasonably necessary to any subsequent fair use."