Writing this up again so I can pin it: AI is literally a fascist project.
-
... gained some immediate notoriety, what fell by the wayside is that it's actually just the popular science *summary* of much deeper work, and based on a thorough analysis of as many forms of government across the globe and history as the researchers could manage.
The picture that emerges is this: natural resources beget tyrannies; lack of natural resources begets democracy.
This is, of course, a summary of a summary, and shouldn't be taken without comment. But this here is also a social...
... media thread, so I'll skip the fuller explanation, and just provide a brief summary.
No ruler exists without support, and support is essentially bought. This means that the question of who is in power largely relates to where they can raise money from, and how much they need to spend to raise more.
When there exist natural resources, the amount of people needed to extract them is relatively low. You clearly need to pay those people well, as well as the military. The rest of the...
-
Writing this up again so I can pin it: AI is literally a fascist project. Friends don't let friends use it.
Before I go into this, there are two types of responses to this that I have taken seriously so far.
One I'll call HashTagNotAllAI, which yields the obligatory "sure", but has the same smell. I'll leave it at that.
The other is that an anti-AI stance also throws some assistive technology under the bus, making such a stance intrinsically ableist. The easy thing to do is to refer...
@jens also, hallucinating assistive technology is a really bad thing, especially if it is deemed "good enough" by abled people, and deployed instead of actually reliable assistive technology, because it is cheaper.
For example, the availability of image description software is used to justify no longer describing images. That is a step up from "helpfully" running image description software on your own site and not verifying the result (at least it is obvious then that no real description exists), but still a lot worse than actually providing good descriptions that put the image into the context of the site and highlight important points.
-
... population is of lesser importance.
When you do not have natural resources, the only sensible source of income is taxation, for which you need a large population earning well, so that the percentage you skim off the top is enough to pay for essential support.
Lack of natural resources tends to produce service economies, which means the population also needs to be healthy, well fed, able to travel, and well educated.
When your population is well educated, it tends to want a say in how...
-
@jens thanks for the excellent write-up. The last time I tried to make this argument with @davidgerard he blocked me. I'm guessing I didn't make my position clear enough to not be confused with a genAi apologist (me, LOL)
-
... things are done, so spending on individual people or groups of people is significantly less effective than spending on the population at large.
The result is that democracies and service oriented economies go hand in hand, and support each other rather than work in opposition.
Marx would not have used the words "service economy", but would have said "labour". Both are synonyms for "people".
Now cryptocurrencies and AI have one thing in common, other than using insane amounts of resources.
-
They're supported by the same investors. But actually, that's the same as using insane amounts of resources.
I'll explain.
The thing is this: natural resources in themselves do not matter. Yes, history is clear about where the patterns lie. But "air" is also a natural resource, and so far, there isn't much monetization of that. (Man, was Spaceballs prescient: https://spaceballs.fandom.com/wiki/Perri-Air).
What makes a natural resource monetizable is scarcity. Cryptocurrencies are explicitly systems of artificial...
-
... scarcity, in which - by whichever proof scheme - those who participate early in the system benefit from those who come later (aka pyramid schemes). The proof algorithm guarantees scarcity; the whole point of blockchain vs. any other distributed system is that there is a chokehold on resource creation somewhere.
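As a toy sketch of that chokehold - this is not any specific coin's algorithm, just the generic proof-of-work idea - the loop below only admits a new "block" after a tunable amount of hashing work. The `difficulty` knob throttles how fast new units can come into existence, regardless of demand:

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Find a nonce whose SHA-256 digest has `difficulty` leading zero hex digits.

    The expected number of attempts grows as 16**difficulty. That growth is
    the chokehold: blocks cannot be created faster than the hashing allows.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Low difficulty so the demo finishes instantly; real systems tune this
# so the whole network needs minutes per block.
nonce = mine(b"example block", difficulty=3)
digest = hashlib.sha256(b"example block" + str(nonce).encode()).hexdigest()
assert digest.startswith("000")
```

The point of the sketch is only that scarcity here is a parameter someone chose, not a property of the world.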
AI is doing much the same thing, but it doesn't advertise this artificial scarcity as part of the solution. Instead, it simply guarantees that those who already own the most...
-
I'm still on the fence about it. It is fascinating technology, and it doesn't inherently have to be used to replace people; I've always said that strong AI (now AGI) is a pointless goal because we have plenty of people; we should use AI for things humans are bad at. However, capitalism is of course looking to use it to replace people.
But apart from that, the cost, and the origin of the training data, I see other risks in its use: that we become too dependent on it, that we outsource our actual thinking to it and become dumber as a result. I know the same has been claimed about previous technologies, like books, but man, I can just feel myself getting dumber when I use it incorrectly at work. There are better ways to use it, like as a tool to access info and learn more effectively, but we already know that many people will use it to outsource their thinking, and may be pressured explicitly or implicitly by their employer to do so. And if you do that, you're allowing yourself to be replaced by the AI.
-
... compute resources have the edge. And that is not you or me.
In short, AI is a system which a) aims to replace human labour and b) shifts the means of production into the hands of the few.
This would be "fine" if nobody used it. What matters for this to succeed is that everyone depends on it. At that point, "means of production" becomes the digital equivalent of a "natural resource".
Marx matters, folk.
You can still argue that this makes AI a weapon of capitalism or tyranny, but...
-
... not outright fascism.
Technically, that's kind of true. But it's also missing an important part of the picture. As the infamous Chad C. Mulligan wrote, "COINCIDENCE: You weren't paying attention to the other half of what was going on."
First, note how Hitler's extermination camps were inspired by Henry Ford's assembly line. Capitalism and fascism have always had a close relationship, and it's not really possible to separate the two. It's no coincidence that the Jews of the time were also...
-
... associated with the Bolsheviks, in order to justify applying the means devised for one supposed threat to the other.
But more importantly, Peter Thiel is a literal fascist, a strong promoter of and heavy investor in AI. The ties are there, right here, right now, and who benefits from an AI takeover - and it's not just Thiel, but all of his Epstein ilk - is abundantly clear.
It's also well documented. This isn't some vague conspiracy shit. They're saying the quiet part out loud.
-
In short, *as a system* rather than a technology, AI is without any doubt a deeply fascist project. It is a weapon aimed straight at the world population at large.
Caveats that the tech itself can be seen as neutral, and definitely has good applications, remain unaffected by this.
The survival of our democracies - or sufficiently democratic systems around the world - is the thing that concerns me, though. (Also the environment, but arguably less so overall.)
-
@mcv Please read the entire thread. I am going into this.
-
@jens I'm strongly in the "yes, but..." camp here. You're right about the current hype cycle, funding, and how it is used to affect people around the world.
I probably end up pedantic because of my technical perspective on it. I think there are even good uses for LLMs (text-related work), but nothing like the chatbots, agents, and general code generators of today...
For the general population, AI means those things today, and in that I agree.
Is this reasonable, in your view, or no?
@nielsa I think you need to read the entire thread

-
@GyrosGeier I actually find it difficult to write good image descriptions. The ones I write zero in on the point I want to make, but often omit details. In a way, that's a writing faux pas. In creative writing you learn "show, don't tell", and I do the opposite.
This isn't a counter-argument (nor an argument). All I want to do is acknowledge how hard it is to do well without assistance of this kind.
-
There's an aside here that I sometimes find worth pointing out: "replacing people" doesn't necessarily mean firing people.
It may simply mean lowering their "worth" in salary negotiations, because the threat of replacement by AI can be used against them.
Sometimes chains of logic are as simple as "A because B", and sometimes there are several intermediary steps.
You can go a step further: even if YOUR job is not threatened by AI takeover, if the average salary drops (locally), you're also affected.
-
@condret Your mental model is not my mental model.
In my mental model, hypercapitalists - billionaire oligarchs - have no more need for extra capital. They'll pursue it, but it has absolutely lost meaning other than as a number. This is also what the very few insider views we get suggest: those people care only that their number is bigger than the other person's, not about money as such.
So any model that reduces this to a capitalist need to extract more capital is, IMHO, wrong. 1/n
-
@condret What the involvement of e.g. Thiel, Musk, Zuck and Bezos in politics instead demonstrates is that those people care about power.
You don't need to amass capital to have power. That's where the game is currently at, sure. But real power is enslavement.
Slaves either do not buy products, or they buy products you tell them to buy, with the money you give them, carefully adjusted so that they will never have enough to break out of enslavement.
This is the game.
And what better... 2/n
-
@condret ... way to play it than to make your future slaves dependent on something you control entirely? Make them dependent not only for their livelihood, but for their information - their education?
I don't think mere capitalist logic applies here at all.
3/n
-
@jens that is a good description though: the details aren't important, but the point is. If you can't show because the recipient is vision impaired, then you need to tell.
My point is that while AI has its uses in assistive technologies, it is also inherently limited, so it's not a good direction to take assistive technology research in.