They struck for the right not to have their wages eroded by AI - to have the right to use (or not use) AI, as they saw fit, without risking their livelihoods.
Right now, many media companies are demanding a new copyright that would allow them to control AI training, and many creative workers have joined in this call. The media companies aren't arguing against infringing *uses* of AI models - they're arguing that the mere *creation* of such a model infringes copyright.
30/
They claim that making a transient copy of a work, analyzing that work, and publishing that analysis is a copyright infringement:
https://pluralistic.net/2023/02/09/ai-monkeys-paw/#bullied-schoolkids
Here's a good rule of thumb: any time your boss demands a new rule, you should be very skeptical about whether that rule will benefit *you*.
31/
-
It's clear that the media companies that have sued the AI giants aren't "anti-AI." They don't want to prevent AI from replacing creative workers - they just want to control how that happens.
When Disney and Universal sue Midjourney, it's not to prevent AI models from being trained on their catalogs and used to pauperize the workers whose work is in those catalogs.
32/
-
What these companies want is to be paid a license fee for access to their catalogs, and then they want the resulting models to be exclusive to them, and not available to competitors:
https://pluralistic.net/2026/03/03/its-a-trap-2/#inheres-at-the-moment-of-fixation
These companies are violently allergic to paying creative workers.
33/
-
Disney takes the position that when it buys a company like Lucasfilm, it secures the right to publish the works Lucasfilm commissioned, but not the obligation to pay the royalties that Lucasfilm owes when those works are sold:
https://pluralistic.net/2022/04/30/disney-still-must-pay/#pay-the-writer
As Teresa Nielsen Hayden quipped during the Napster Wars: "Just because you're on their side, it doesn't mean they're on your side."
34/
-
If these companies manage to get copyright law expanded to restrict scraping, analysis, and publication of factual information, they won't use those new powers to increase creators' pay - they'll use them the same way they've used *every* new copyright created in the past 40 years, to make themselves richer at the expense of artists:
https://pluralistic.net/2020/03/03/just-a-stick/#authorsbargain
35/
-
The Claude Code leak is full of fascinating information about a tool that - like Diebold's voting machines - is at the very center of the most important policy debates of our time. Here's just one example: Claude is almost certainly implicated in the US missile that murdered a building full of little girls in Iran last month:
AI got the blame for the Iran school bombing. The truth is far more worrying
LLMs-gone-rogue dominated coverage, but had nothing to do with the targeting. Instead, it was choices made by human beings, over many years, that gave us this atrocity
the Guardian (www.theguardian.com)
36/
-
Of course I see the irony. Anthropic has taken an extremely aggressive posture on copyright's "limitations and exceptions," arguing that it can train its models on *any* information it can find, and that it can knowingly download massive troves of infringing works for that purpose.
37/
-
It's darkly hilarious to see the company firehosing copyright complaints by the thousands in order to prevent the dissemination, dissection and discussion of the source-code that leaked due to the company's gross incompetence:
38/
-
But what's objectionable about Anthropic - and the AI sector - isn't *copyright*. The thing that makes these companies disgusting is their gleeful, fraudulent trumpeting about how their products will destroy the livelihoods of every kind of worker:
https://pluralistic.net/2025/03/18/asbestos-in-the-walls/#government-by-spicy-autocomplete
And it's their economic fraud, the inflation of a bubble that will destroy the economy when it bursts:
https://www.wheresyoured.at/the-subprime-ai-crisis-is-here/
39/
-
It's their enthusiastic deployment of AI tools for mass surveillance and mass killing. (Anthropic is no exception, despite what you may have heard:)
40/
-
If the media bosses get their way, and manage to make it even more illegal - and practically harder - to host, discuss, and publish facts about copyrighted works, then leaks like the Claude Code disclosures will never see the light of day. It's only because of decades of hard-fought battles to push back on this nonsense that we are able to identify and learn about the defects in Claude Code that are revealed by this source-code leak.
41/
-
I'm angry about the AI industry, but not because of *copyright*. I'm angry at them for the reasons Cat Valente articulated so well in her "Blood Money" essay:
These companies' stated goals are terrible:
> They took the books I wrote for children and used them to make it possible for children to not bother with reading ever again.
42/
-
> They took the books I wrote about love to create chatbots that isolate people and prevent them from finding human love in the real world, that make it difficult for them to even stand real love, which is not always agreeable, not always positive, not always focused on end-user engagement. They took the books I wrote about hope and glitter in the face of despair and oppression and used it to make a Despair-and-Oppression generator.
43/
-
These goals are *entirely compatible with copyright*. The *New York Times* is suing over AI - and they're licensing their writers' words to train an AI model:
The *NYT* wants more copyright. You know what the *NYT* *doesn't* want? More *labor* rights. The *NYT* are vicious union-busters:
44/
-
If we creative workers are going to pour our resources into a new policy to address the threats that our bosses - and the AI companies they are morally and temperamentally indistinguishable from - represent to our livelihoods, then let that new policy be a renewed sectoral bargaining right for *every* worker. It was sectoral bargaining (a collective, solidaristic right) and not copyright (an individual, commercial right) that saw off AI in the Hollywood writers' strike.
45/
-
Copyright positions the creative worker as a small business - an LLC with an MFA - bargaining B2B with another firm. To the extent that copyright helps us, it is largely incidental. Sure, we were able to file for a few thousand bucks per book that Anthropic downloaded from a pirate site to train its models on. But Anthropic doesn't have to use a shadow library to get those books - it can just pay our bosses to get them.
46/
-
It's *great* that Claude Code's source is online. It's *great* that we have the ability to pore over, analyze and criticize this code, which has become so consequential in so many ways. It's *great* that copyright is weak enough that this is possible (for now).
47/
-
Expanding copyright will gain little for creative workers, except for a new reason to be angry about how our audiences experience our work. Expanding *labor* rights will gain much, for *every* worker, including our audiences. It's an idea that our bosses - *and* AI hucksters - hate with every fiber of their beings.
eof/
-
Anthropic's developers made an extremely basic configuration error, and as a result, the source-code for Claude Code - the company's flagship coding assistant product - has leaked and is being eagerly analyzed by many parties:
https://news.ycombinator.com/item?id=47586778
--
If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
1/
@pluralistic
If Claude scraped Claude’s source code, would he have to take it down?
-
@pluralistic #AI is #clankers
all the way down.