AI Threatens to Crush News Organizations. Lawmakers Signal Change Is Ahead

More than a decade ago, the normalization of tech companies carrying content created by news organizations without directly paying them — cannibalizing readership and ad revenue — precipitated the decline of the media industry. With the rise of generative artificial intelligence, those same firms threaten to further tilt the balance of power between Big Tech and news.

On Wednesday, lawmakers on the Senate Judiciary Committee pointed to their failure to pass legislation barring Big Tech from exploiting news content as they backed proposals that would require AI companies to strike licensing deals with news organizations.

Richard Blumenthal, Democrat of Connecticut and chair of the committee, joined several other senators in supporting calls for a licensing regime and for a framework clarifying that intellectual property laws don’t protect AI companies that use copyrighted material to build their chatbots.

“We need to learn from the mistakes of our failure to oversee social media and adopt standards,” he said.

The fight over the legality of AI firms ingesting content from news organizations without consent or compensation is split into two camps: those who believe the practice is protected under the “fair use” doctrine of intellectual property law, which allows creators to build upon copyrighted works, and those who argue that it constitutes copyright infringement. Courts are currently wrestling with the issue, but an answer is likely years away. In the meantime, AI companies continue to use copyrighted content as training material, endangering the financial viability of media in a landscape in which readers can bypass direct sources in favor of search results generated by AI tools.

During the hearing, which centered on oversight of AI in journalism, Roger Lynch, chief executive of Condé Nast, urged Congress to “clarify that the use of our content and other publications’ content for training and output of AI models is not fair use.” With that question settled, he said, the “free market will take care of the rest,” referring to how licensing deals could be struck.

Josh Hawley, Republican of Missouri, called the proposal “eminently sensible.” Going a step further, he asked, “Why shouldn’t we expand the regime outward to say anyone whose data is ingested and regurgitated by generative AI — whether in name, image or likeness — has the right to compensation?”

A lawsuit from The New York Times, filed last month, pulled back the curtain on negotiations over the price and terms of licensing its content. Before suing, the Times said it had been in talks with OpenAI and Microsoft for months about a deal, but those talks never produced an agreement. Against the backdrop of AI companies crawling the internet for high-quality written content, news organizations have been backed into a corner, having to decide whether to accept lowball offers to license their content or expend the time and money to sue. Some companies, like Axel Springer, took the money.

A major subject of the hearing was whether new legislation is necessary to account for what Lynch characterized as AI companies building their business model on “stolen goods.”

“I think it’s premature,” said Curtis LeGeyt, chief executive of the National Association of Broadcasters. “If we have clarity that the current laws apply to generative AI, the market will work.”

While he agreed that the law is on his side, Lynch added that a source of frustration is how long it will take for the courts to resolve the issue.

“A major concern is the amount of time to litigate, appeal, go back to the courts, appeal, and maybe make it to the Supreme Court to settle,” he said. “Between now and then, many media companies will go out of business.”

Jeff Jarvis, professor at the Craig Newmark Graduate School of Journalism, pushed back against the adoption of “protectionist legislation for a struggling industry.” On the issue of fair use, he advocated for a broader interpretation of the doctrine and said that journalists take advantage of it “every day” when they “ingest information and put it out in a different way.”

Under intellectual property law, facts aren’t copyrightable. This means journalists are free to report common details without infringing any copyright as long as they aren’t copying excerpts word for word. It’s among the reasons the Times may face an uphill battle in its suit against OpenAI, though evidence of ChatGPT reproducing its articles verbatim may get it over the hump.

And while AI companies have yet to argue in court that they can claim immunity under Section 230 of the Communications Decency Act, which has historically shielded tech firms from liability for third-party content, the law remains a battleground for copyright issues surrounding generative AI. Chamber of Progress, a tech industry coalition whose members include Amazon, Apple and Meta, argued in filings to the Copyright Office that Big Tech’s favorite legal shield should be interpreted to immunize firms from infringement claims.

Blumenthal stressed that AI firms shouldn’t be protected under Section 230 if they’re sued for content produced by AI tools.

“There’s a deeply offensive irony here, which is that all of you and your publications or your broadcast stations can be sued,” he said.

On top of copyright issues around generative AI tools, lawmakers have signaled concern about the creation of deepfakes and voice clones. On Wednesday, a bipartisan coalition of House lawmakers introduced a bill to prohibit the publication and distribution of unauthorized digital replicas. It’s intended to give individuals the exclusive right to approve the use of their image, voice and likeness by establishing intellectual property rights in federal law.

Touching on the increasing prevalence of such deceptive content, Hawley said, “This seems to me like a situation we have to address and quickly.”