A growing number of lawsuits expected to take shape this year could significantly alter how generative artificial intelligence is applied across numerous industries, including news, photography, and literature.
The most recent, and perhaps the furthest-reaching, lawsuit was filed by The New York Times on December 27, 2023.
The NYT claims that articles written by its journalists were used to train AI models that now compete with traditional reporting. Among the allegations, the NYT claims the AI misappropriated stories held behind its paywall and offered them for free, and that it produced “hallucinations” in which inaccurate information was attributed to the paper.
“Copyright owners have been lining up to take whacks at generative AI like a giant piñata woven out of their works. 2024 is likely to be the year we find out whether there is money inside,” James Grimmelmann, professor of digital and information law at Cornell, told Axios.
Grimmelmann added that creators’ use of copyright law against AI could stifle the development of AI systems, an outcome he believes the law was never intended to produce.
Instead, the effort to apply copyright law to AI is more likely to create a system in which only well-funded companies will be able to use the programs, University of Miami law professor Andres Sawicki told Axios.
For instance, licensing agreements like the recent one between the Associated Press and OpenAI to license content could become effectively mandatory, Sawicki suggested. Such an arrangement would shut individuals, academic writers, and start-ups out of AI tools for lack of funding.
Adapting copyright law to AI is possible, according to Jerry Levine, general counsel for ContractPodAI. He said AI can be trained to summarize a document and provide a link to the full document.
Efforts to rein in generative AI through legislation are well underway, though the process is not without controversy. For one, Politico reported that key legislative staffers in the Congressional offices tasked with creating a regulatory framework are funded through the nonprofit Open Philanthropy, which has ties to companies developing AI, including Google, Microsoft, IBM, and OpenAI.
The outcomes of the lawsuits are unpredictable, but in the absence of AI-specific legislation, many of the courts’ rulings will revolve around fair use, as the Harvard Business Review pointed out. Investigations into AI’s potential violations of related regulations are underway through the U.S. Federal Trade Commission, the U.S. Copyright Office, and the Senate Judiciary Subcommittee.