
AI vs. Copyright: Insights from the July 2025 Senate Hearing

Published 07/21/2025 by Deena Rae
News & Trends – The World of Publishing
Top 3 Stories in Publishing & Literature
Senate Hearing Debates AI Training on Copyrighted Works
Judge Rules Class Action Suit Against Anthropic Can Proceed
McGraw Hill Files for Public Offering

On July 16, the U.S. Senate Judiciary Subcommittee—led by Senator Josh Hawley—held a hearing titled “Too Big to Prosecute? Examining the AI Industry’s Mass Ingestion of Copyrighted Works for AI Training.” Witnesses testified that major AI firms have trained models on both pirated and legally acquired books, sparking debate over the limits of fair use. While some courts have deemed certain training “transformative,” others—as in the litigation against Anthropic—have found that downloading pirated works constitutes infringement. Authors’ groups hailed the hearing as an opportunity to push Congress toward stronger IP protections in AI development.

On July 17, U.S. District Judge William Alsup certified a class action in Bartz v. Anthropic, allowing authors whose works were allegedly pirated for AI training to sue collectively. While the court recognized fair use for legally acquired texts, it found the use of pirate sites like LibGen and PiLiMi infringing. Alsup limited the class to “Pirated Books” from those sites and ordered Anthropic to disclose titles and ISBNs by August 1. If the authors prevail, the ruling could pave the way for a settlement reaching into the billions of dollars.

McGraw Hill, acquired by Platinum Equity in 2021, filed an SEC prospectus on July 14 aiming to raise roughly $530 million by offering shares at $19–$22 each. The IPO would value the company at around $4 billion. The prospectus highlights McGraw Hill’s digital transformation: digital sales rose to 65% of revenue in fiscal 2025 (82% outside K–12). Proceeds will repay debt, while the company doubles down on personalized learning tech and generative AI tools for educators. Investors are betting on McGraw Hill’s pivot from print to digital.

On July 16, 2025, a Senate hearing spotlighted AI’s mass ingestion of copyrighted works—a development authors call “the largest IP theft in history.” This session sets the stage for potential legislative safeguards. Here’s what you need to know and how authors, publishers, and policymakers can act to protect creative rights.

The Crux of the Hearing

Senator Josh Hawley chaired the subcommittee hearing “Too Big to Prosecute?”, opening with a stark charge: AI companies train on stolen material. Witnesses split between those cautioning that transformative use falls under fair use and those condemning the mass ingestion of pirated texts from sites like LibGen and PiLiMi. Legal scholar Edward Lee argued fair use could apply, while plaintiffs’ attorney Maxwell Pritt, who is litigating against Meta, pointed to court rulings limiting fair-use protection to legally obtained texts. The debate underscored that while AI promises innovation, it can also erode author incentives if left unchecked. The hearing signaled Congress’s appetite for new IP protections, suggesting that vague fair-use defenses may soon face statute-level clarification.

What Authors and Publishers Should Do

Update Contracts:
  • Insert explicit AI training clauses stating whether your work can be used in model training and under what terms.
  • Define royalty structures for any AI-generated derivatives.
Advocate for Legislation:
  • Support bills clarifying copyright in AI contexts.
  • Engage with organizations like the Authors Guild to lobby lawmakers.
Enhance Transparency:
  • Require publishers to disclose if AI tools are used in editing or marketing.
  • Demand clear labeling: “Generated with AI Assistance” on any relevant output.

These actions ensure authors retain agency over how their intellectual property is used—and compensated—across the AI pipeline.

The Road Ahead: Policy and Practice

Congress looks poised to draft legislation, possibly narrowing the fair-use defense for AI. Industry groups should collaborate on model policies—much like the 70+ authors’ petition—calling for responsible AI adoption. Publishers can lead by piloting AI-audit processes to track copyrighted source usage. Meanwhile, grassroots efforts—such as class action suits like Bartz v. Anthropic—apply pressure through the courts, potentially leading to hefty settlements.

By combining legislative advocacy, contractual clarity, and legal action, the publishing community can shape AI’s evolution. The goal isn’t to stall progress but to ensure innovation doesn’t trample creators’ rights.

📘 New to Publishing?

Start with the fundamentals. Our Education hub covers everything indie authors need to know.

📬 Get Weekly Publishing Updates

Don’t miss the latest trends. Join our newsletter for curated news and practical takeaways every Monday.

🛠️ Want Practical Advice?

Our Publishing Tips blog posts turn industry news into action steps for indie authors.
