AI Music Digest

Music publishers sue Anthropic for $3B over BitTorrent piracy, Deezer licenses AI detection tech to Sacem, Sweden bans AI hit from charts, and Musical AI raises $4.5M for attribution technology.

Summary

Music publishers have filed a $3 billion lawsuit against Anthropic, alleging the AI company used BitTorrent to pirate lyrics and sheet music for training Claude. Meanwhile, Deezer has begun licensing its AI detection technology to combat streaming fraud, Sweden has banned an AI-generated hit from its official charts, and a startup focused on AI attribution has secured new funding.

⚖️

Music Publishers Sue Anthropic for $3B Over BitTorrent Piracy

A coalition of music publishers including Universal Music Group, Concord, and ABKCO has filed a new lawsuit against Anthropic, alleging the AI company used BitTorrent to illegally download copyrighted lyrics and sheet music to train its Claude chatbot.

The lawsuit, filed in US District Court for the Northern District of California, names CEO Dario Amodei and co-founder Benjamin Mann as defendants. The filing claims Mann “personally used BitTorrent to download via torrenting from LibGen approximately five million copies of pirated books” in June 2021, and that Amodei “personally discussed and authorized this illegal torrenting.”

The complaint covers over 20,000 copyrighted songs—a massive expansion from the 499 works in the original 2023 lawsuit. Songs cited include works by The Rolling Stones, Elton John, Neil Diamond, Ariana Grande, and Vanessa Carlton. Because BitTorrent requires downloaders to simultaneously upload files to other users, publishers allege Anthropic also violated their exclusive distribution rights.

Why It Matters: This lawsuit follows Anthropic’s $1.5B settlement with book authors whose works were similarly pirated. Judge Alsup’s prior ruling established that while AI training may qualify as fair use, acquiring content through piracy does not—creating a roadmap for music publishers to pursue massive damages.

🤖

Deezer Licenses AI Detection Tool to Sacem, Plans Industry Rollout

Deezer has licensed its AI music detection technology to Sacem, France's royalty collection agency representing over 300,000 creators, in the first commercial deal of its kind. The streaming platform plans to expand licensing to other European collecting societies during Grammy Week in Los Angeles.

The deal addresses a rapidly growing problem: Deezer now receives approximately 60,000 fully AI-generated tracks daily—39% of total uploads—up from just 10% in January 2025. The company says it successfully identified and removed up to 85% of fraudulent AI-generated streams from its royalty pool in 2025, flagging over 13.4 million tracks.

Deezer’s detection system, trained on 94 million songs, identifies “subtle anomalies inaudible to human ears” and can detect AI-generated music from platforms like Suno and Udio with 99.8% accuracy. The company filed two patents for the technology in 2024.
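Deezer has not published how its detector works, but the general pattern it describes (a model scoring each upload, with high-scoring tracks excluded from the royalty pool) can be sketched in a few lines. Everything below is an illustrative assumption: the `ai_probability` field stands in for the output of a proprietary detection model, and the 0.99 threshold is an arbitrary example, not Deezer's.

```python
from dataclasses import dataclass

@dataclass
class Track:
    track_id: str
    ai_probability: float  # hypothetical detector score in [0.0, 1.0]

def flag_ai_tracks(tracks: list[Track], threshold: float = 0.99):
    """Split uploads into flagged (likely AI-generated) and clean sets.

    Tracks at or above the threshold are excluded from royalty
    payouts; everything else passes through unchanged.
    """
    flagged = [t for t in tracks if t.ai_probability >= threshold]
    clean = [t for t in tracks if t.ai_probability < threshold]
    return flagged, clean

uploads = [
    Track("human-song", 0.02),
    Track("suspect-upload", 0.997),
]
flagged, clean = flag_ai_tracks(uploads)
```

The interesting engineering lives entirely inside the model that produces `ai_probability`; the pool-filtering step itself is simple bookkeeping.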

Why It Matters: By licensing rather than hoarding its detection technology, Deezer is positioning AI fraud detection as shared industry infrastructure. If widely adopted, this could significantly reduce the royalty siphoning that currently diverts payments away from human artists.

📊

Sweden Bans AI Folk-Pop Hit from Official Charts

Sweden’s official chart body, Sverigetopplistan, has banned the song “Jag vet, du är inte min” (I Know, You’re Not Mine) by virtual artist Jacub from its rankings—despite the track amassing over 5 million Spotify streams and topping Sweden’s domestic charts.

The decision followed an investigation by Swedish journalist Emanuel Karlsten, who discovered that Jacub had no social media presence, media appearances, or tour dates. He traced the song to Stellar Music, a Danish publishing firm, finding that two credited rights holders work in Stellar’s AI department.

IFPI Sweden CEO Ludvig Werner declared: “If it is a song that is mainly AI-generated, it does not have the right to be on the top list.” The creators, calling themselves “Team Jacub,” pushed back, arguing AI was merely “a tool” within a “human-controlled creative process” and that they invested significant “time, care, emotions, and financial resources.”

Why It Matters: This ruling establishes an early precedent for how charts handle AI-assisted music. The key question—what percentage of AI involvement disqualifies a track—remains unanswered, creating uncertainty for artists using AI as a creative tool.

💰

Musical AI Raises $4.5M for Attribution Technology

Musical AI, a startup building technology to trace which training data influences specific AI model outputs, has raised $4.5 million in a funding round led by Heavybit.

The company's platform enables proper attribution and compensation when copyrighted works are used in AI training. Partners include Symphonic Distribution, Pro Sound Effects, SourceAudio, and SoundBreak AI. SoundBreak, founded by Better Than Ezra frontman Kevin Griffin, trained its models on licensed works via Musical AI's platform.

The company will use the funding to expand its team and refine its attribution technology, which addresses one of the industry’s thorniest problems: determining which training data contributed to any given AI output.
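Once per-work influence estimates exist, the payout side of attribution reduces to normalizing those scores into royalty shares. The sketch below assumes the influence scores are given; producing them is precisely the hard problem Musical AI is working on, and nothing here reflects its actual method.

```python
def attribution_shares(influence: dict[str, float]) -> dict[str, float]:
    """Convert non-negative per-work influence scores into royalty
    shares that sum to 1.0. Works with zero influence get no share;
    an empty or all-zero pool yields all-zero shares."""
    total = sum(influence.values())
    if total == 0:
        return {work: 0.0 for work in influence}
    return {work: score / total for work, score in influence.items()}

# Hypothetical scores: one work estimated three times as influential
# on a given output as the other.
shares = attribution_shares({"song_a": 3.0, "song_b": 1.0})
```

Here `song_a` would receive 75% of the attributable royalty pool and `song_b` 25%; real systems would also need to handle uncertainty in the scores and a long tail of tiny contributions.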

Why It Matters: As AI music generation scales, attribution technology could become essential infrastructure for ensuring creators are compensated when their works influence AI outputs. This funding signals investor confidence that transparent, traceable AI training is the industry’s future.

Trending Themes

  • Piracy allegations escalating AI copyright battles beyond fair use debates
  • Detection and attribution technologies emerging as industry infrastructure
  • Chart bodies grappling with AI eligibility rules as AI tracks gain traction