"There are no more juniors.
-
@tante we might not need any human coders once no seniors are available anymore. It was similar with other professions.
@felix_eckhardt @tante this is the same as saying there is no difference between a tailor-made suit and $2 underpants made by slave labor in Bangladesh.
-
@GhostOnTheHalfShell @tante if we look at the last two or three years, that might be correct. For the rest of history, computational power got cheaper and cheaper. We are currently in a special situation, which will not last forever.
Even Moore’s law has flattened out because we’re reaching the physical limits of electronics. The only way compute power has really kept expanding is by stacking compute cells on top of each other.
As speeds increase, power draw rises dramatically. I think the relationship is roughly exponential, so more compute at higher speeds demands more and more power delivery and generates more and more heat. This is not a good combination.
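The power-versus-speed relationship described above can be sketched with the standard CMOS dynamic-power formula, P = C·V²·f. The numbers below are hypothetical illustrations, not figures from any post here; under typical voltage-frequency scaling, supply voltage rises with clock frequency, so power grows superlinearly with speed.

```python
# Sketch of CMOS dynamic power scaling: P = C * V^2 * f.
# Illustrative numbers only; because voltage must rise roughly in
# step with frequency, power grows roughly with the cube of speed.

def dynamic_power(cap_farads, volts, freq_hz):
    """Dynamic switching power of a CMOS circuit, in watts."""
    return cap_farads * volts**2 * freq_hz

base = dynamic_power(1e-9, 1.0, 1e9)    # 1.0 GHz at 1.0 V
fast = dynamic_power(1e-9, 1.5, 1.5e9)  # 1.5 GHz needs ~1.5 V

print(fast / base)  # ~3.4x the power for 1.5x the speed
```

This is why "just clock it higher" stopped being a free lunch: a 50% speed bump can cost several times the power and heat.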
-
"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
There will be a place for juniors, I believe. Jevons paradox
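The Jevons-paradox argument can be made concrete with a toy calculation (entirely made-up numbers, purely to show the mechanism): when efficiency makes something cheaper, demand can grow so much that total spending rises rather than falls.

```python
# Toy Jevons-paradox illustration with hypothetical numbers:
# demand follows a constant-elasticity curve q = a * price^(-e).
# When elasticity e > 1, cutting the price RAISES total spend.

def total_spend(price, a=100.0, elasticity=1.5):
    quantity = a * price ** (-elasticity)  # demand at this price
    return price * quantity

before = total_spend(price=10.0)  # expensive hand-written code
after = total_spend(price=1.0)    # code is now 10x cheaper

print(before, after)  # total spend rises despite the price drop
```

Under these assumptions, making code 10x cheaper to produce increases total demand for it more than 10x, which is the case for juniors still having work to do.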
-
@felix_eckhardt @tante Yup, no one becomes a cook anymore now that we have microwave meals in the freezer section. Why bother?
-
"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
“Cheaper” only because the industry is currently subsidizing subscriptions, spending 5 to 12 times the revenue the subscriptions themselves bring in.
And we haven’t even started to talk about the debt service these companies have taken on.
-
@GhostOnTheHalfShell @tante Hm, I am not convinced:
GPU computational performance per dollar
An interactive visualization from Our World in Data.
Our World in Data (ourworldindata.org)
-
I think the thing to point out is that the cost of the systems and their operational costs are very different things, and this page gives you an idea of what the power draw is. Primarily the cost is in increased energy demand and waste heat.
This is what I meant: Moore’s law has only been kept alive by stacking components on the silicon. For GPUs this stacking is intrinsic to the design. You can’t contravene thermodynamics, though.
12 best GPUs for AI and machine learning in 2026 | Blog — Northflank
Compare the 12 best GPUs for AI in 2026: B200, H200, H100, RTX 4090 & more. Specs, performance & costs. Deploy with Northflank's cloud platform.
Northflank — Deploy any project in seconds, in our cloud or yours. (northflank.com)
-
"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
@tante heh - my wife, who (encouraged by our friend who made the same switch ~5/6 years ago) decided at the end of 2021 to quit her supply chain management career and become a developer (so we both could have remote jobs and travel) - ignored the funeral and tried, and tried, and tried until 2025, when she said fuckit and became a QA engineer … just in time to train the machines
-
Also, I can’t help nerding out a bit, but running on the assumption that Nvidia is talking about technical reality and not some marketing ploy, raw compute power does not necessarily equate to better performance. Compute power is more subtle than gigaflops: the gains may come not from more gigaflops but from the surrounding architecture, and that can be good or less good.
NVIDIA CUDA Cores: How They Work and Why They Matter (2026)
Learn how CUDA cores power AI training through parallel processors. Compare CUDA vs Tensor cores, performance factors, and get started with cloud GPUs.
(www.thundercompute.com)
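The point that raw gigaflops alone don't determine performance can be sketched with the classic roofline model (a standard analysis, not something from the linked article; the specs below are hypothetical, not real GPU numbers): attainable throughput is capped by the lower of peak compute and memory bandwidth times the kernel's arithmetic intensity.

```python
# Roofline-model sketch: attainable FLOP/s is the minimum of the
# chip's peak compute and (memory bandwidth * arithmetic intensity).
# All numbers are made up for illustration, not real GPU specs.

def attainable_flops(peak_flops, mem_bw_bytes, intensity_flops_per_byte):
    """Upper bound on sustained FLOP/s for a kernel on this chip."""
    return min(peak_flops, mem_bw_bytes * intensity_flops_per_byte)

PEAK = 100e12  # 100 TFLOP/s peak compute (hypothetical)
BW = 2e12      # 2 TB/s memory bandwidth (hypothetical)

# A bandwidth-bound kernel doing ~4 FLOPs per byte moved:
low = attainable_flops(PEAK, BW, 4.0)     # capped at 8 TFLOP/s

# A compute-bound kernel doing ~100 FLOPs per byte moved:
high = attainable_flops(PEAK, BW, 100.0)  # reaches the 100 TFLOP/s peak

print(low / PEAK)  # only 8% of the "raw gigaflops" is usable here
```

This is why the surrounding architecture (memory bandwidth, caches, interconnect) can matter more than the headline FLOP number for many workloads.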
-
@tante oh hey I wrote this! Thanks for sharing!
@stevendotjs @tante Thanks for writing this and congratulations on your craftsmanship with words! Hits the perfect tone and while English is not my first language, reading this was pure bliss.
Made it out of programming (it’s just a hobby now) and glad about that.
-
"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
@tante Gawd this is good.
-
@stevendotjs it's fantastic! Thanks for writing it
-
@Linkshaender @tante aw thank you! Happy for you that you made it out! If you wanna tell me about it, I’m all ears.
-
"There are no more juniors. There was a funeral for their passing in 2024. Nobody came. The machine does what they do now, but cheaper. Of course, juniors weren't valuable for what they produced, they were valuable for who they would become: the senior engineer who knows where the bodies are buried. We optimized for output, and abolished apprenticeship. A few years from now, we'll wonder where all the seniors are. We shot them. Nobody will remember."
Programming Still Sucks. — Writing
Sorry Peter. — I'm at a birthday party, and while most people here also work in tech, there's always a Guy with a Real Job. You know, a physical job, building some or other thing people need. And this Guy always asks some variant of the same question: aren't you worried AI is taking your job? I glance around and see a few faces turning around toward us, rolling their eyes ever so slightly before returning to their previous conversation. Yes, this question again.
(www.stvn.sh)
-
@tante oh hey I wrote this! Thanks for sharing!
@stevendotjs i applaud you, kind sir
- I hilariously joined a web dev program in 2022, and when I completed it in 2024, I could see it was pointless trying to get a job as a Jr. Your story resonates (also bc I’m a career changer, so old enough to have other experiences that are similar) - it’s both hilarious and stark.
Well done

-
@stevendotjs kudos. so very on point.
Having ridden out the implosion of the dot-com crash, I’m staring at all this buildout and wondering what this time smells like. Capital spend crushed a few big names. Example: Lucent went to a nickel on the dollar.
-
@stevendotjs i feel for my cohort who tried to move forward by going for their masters.. I hope it works for them. I hope there are enough companies who see that the mess AI creates cannot be trusted.. but humans tend to take the path of least resistance..
In truth, I was never a coder, so let’s hope I can find other ways to feed myself and my family.. it’s rough out here for everyone.