The Center for Investigative Reporting Sues OpenAI and Microsoft Over Copyright Infringement

The Center for Investigative Reporting (CIR), the nonprofit organization behind Mother Jones and Reveal, has filed a lawsuit against OpenAI and Microsoft, alleging copyright violations related to the use of its content in artificial intelligence systems. The legal action, filed in the U.S. District Court for the Southern District of New York, marks another significant development in the ongoing debate over AI's impact on journalism and intellectual property rights.

CIR, the oldest nonprofit newsroom in the United States, claims that OpenAI and Microsoft have used its copyrighted material without permission or compensation to train and enhance their AI models. The lawsuit specifically targets the practice of using journalistic content to improve AI systems' capabilities, which CIR argues undermines the very industry that produces the source material.

Monika Bauerlein, CEO of the Center for Investigative Reporting, emphasized the importance of the organization's unique content, stating, "OpenAI and Microsoft started vacuuming up our stories to make their product more powerful, but they never asked for permission or offered compensation, unlike other organizations that license our material."

The lawsuit highlights several key issues:

  1. Copyright infringement: CIR alleges that OpenAI and Microsoft have violated the Copyright Act by using its content without authorization.
  2. Digital Millennium Copyright Act violations: The suit also claims breaches of the DMCA, which provides additional protections for copyrighted works in the digital age.
  3. Threat to journalism: CIR argues that AI-generated summaries of articles pose a significant threat to publishers and the broader journalism industry.
  4. Value of diverse perspectives: The lawsuit underscores the importance of CIR's focus on investigative reporting and social justice issues, suggesting that this unique content adds particular value to AI training datasets.
  5. Potential impact on public information: Bauerlein warned that if this practice continues unchecked, "the public's access to truthful information will be limited to AI-generated summaries of a disappearing news landscape."

This legal action follows similar lawsuits filed by other prominent news organizations, including The New York Times, Chicago Tribune, and The Intercept. These cases collectively represent a growing pushback from the media industry against tech companies' use of copyrighted content in AI development.

The dispute raises fundamental questions about the balance between technological innovation and the protection of intellectual property rights. As AI systems become increasingly sophisticated and widely used, the outcome of these legal challenges could have far-reaching implications for both the tech and media industries.

For nonprofit news organizations like CIR, which rely on their unique content to fulfill their mission and sustain their operations, the stakes are particularly high. The lawsuit seeks to establish clear boundaries for the use of copyrighted material in AI training and to ensure that content creators are fairly compensated for their work.

As this legal battle unfolds, it will likely contribute to the ongoing dialogue about the ethical and legal framework needed to govern the development and deployment of artificial intelligence technologies in an increasingly digital media landscape.

Chris McKay is the founder and chief editor of Maginative. His thought leadership in AI literacy and strategic AI adoption has been recognized by top academic institutions, media, and global brands.
