"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
@tante So right…
-
On April 26, Fortune magazine quoted an Nvidia VP saying that AI costs far more than human workers.
Ed Zitron has reported on the economic reality of the subscriptions currently offered by big tech versus their actual cost, measured in token usage: token costs run somewhere between five and twelve times the subscription income, depending on the tier.
And this figure doesn’t take into account the debt these companies have to service as well.
That’s some kind of productivity increase
@GhostOnTheHalfShell @tante That is completely correct. But we all know that computational power gets cheaper every year. The question is: will it become profitable soon enough, or will the AI companies go bankrupt before that happens?
-
@tante oh hey I wrote this! Thanks for sharing!
-
@felix_eckhardt @GhostOnTheHalfShell @tante Or if they'll burn up the countryside and destroy the aquifers with the required datacenters.
-
@StumpyTheMutt @GhostOnTheHalfShell @tante It's sad that it doesn't sound far-fetched.
-
@tante We might not need any human coders once no seniors are available anymore. It was similar with other professions.
-
The systems keep getting more expensive. They may be more powerful, but they aren’t getting cheaper.
Given that the lifetime of the computational units is between one and three years before they burn out or become obsolete, the window of time in which they must recoup their cost and turn a profit is tiny. This isn’t like laying fiber-optic cable, which has a 30-year lifetime.
-
@GhostOnTheHalfShell @tante That might be correct if we look at the last two or three years. For the rest of computing history, computational power got cheaper and cheaper. We are currently in a special situation, which will not last forever.
-
This is fantastic and unfortunately true. I am Sara, tunnelling under Mordor with a USB stick. I have attempted to document the cron job and institutionalize the periodic nudge it needs to run payroll... but then I get yelled at for not vibecoding enough new featureslop. There are no juniors for me to explain the cron job to.
Perhaps the AI may one day absorb the wiki page about the cron job. Hopefully someone else thinks to ask the AI why payroll didn't run.
There are no more spoons; we sold them all to pay for the AI.
-
@felix_eckhardt @tante This is like saying there is no difference between a tailor-made suit and $2 underpants made by slave labor in Bangladesh.
-
Even Moore’s law has flattened out because we’re reaching the physical limits of electronics. The only way compute power has really expanded is by stacking compute cells on top of each other.
Faster speeds dramatically increase power draw. I think the relationship is a low-order exponential, so more compute at faster speeds demands more and more power delivery and generates more and more heat. This is not a good combination.
-
There will be a place for juniors, I believe. Jevons paradox.
-
@felix_eckhardt @tante Yup, no one becomes a cook anymore now that we have microwave meals in the freezer section. Why bother?
-
“Cheaper” only because the industry is currently subsidizing subscriptions to the tune of 5 to 12 times the revenue they gain from the subscriptions themselves.
And we haven’t even started to talk about the debt service these companies have taken on.
-
@GhostOnTheHalfShell @tante Hm, I am not convinced:
GPU computational performance per dollar
An interactive visualization from Our World in Data.
Our World in Data (ourworldindata.org)
-
I think the thing to point out is that the purchase cost of the systems and their operational costs are very different things, and this page gives you an idea of what the power draw is. Primarily the cost is in increased energy demand and waste heat.
This is what I meant: Moore’s law has only been kept alive by stacking components on the silicon. For GPUs this is intrinsic and ideal. You can’t contravene thermodynamics, though.
12 best GPUs for AI and machine learning in 2026 | Blog — Northflank
Compare the 12 best GPUs for AI in 2026: B200, H200, H100, RTX 4090 & more. Specs, performance & costs. Deploy with Northflank's cloud platform.
Northflank — Deploy any project in seconds, in our cloud or yours. (northflank.com)
-
@tante heh - my wife, who (encouraged by our friend who made the same switch five or six years ago) decided at the end of 2021 to quit her supply chain management career and become a developer (so we could both have remote jobs and travel) - ignored the funeral and tried, and tried, and tried until 2025, when she said fuck it and became a QA engineer … just in time to train the machines
-
Also, I can't help nerding out a bit, but on the assumption that Nvidia is talking about technical reality and not some marketing ploy, raw compute power does not necessarily equate to better performance. Compute power is more subtle than gigaflops: the gains may come not from more gigaflops but from the surrounding architecture, and that can be good or less good.
NVIDIA CUDA Cores: How They Work and Why They Matter (2026)
Learn how CUDA cores power AI training through parallel processors. Compare CUDA vs Tensor cores, performance factors, and get started with cloud GPUs.
(www.thundercompute.com)