There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.
-
@headius The fundamentals of software engineering become more relevant and important when application and feature code becomes cheap to sketch.
Cheap to sketch, often illegibly and inconsistently.
Static analysis, performance monitoring, continuous integration, and ops visibility have to be up to the load of doubtful application code.
Not to leave aside the importance of specific and legible tests of system transitions and invariants.
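To make "specific and legible tests of system transitions and invariants" concrete, here is a minimal, hypothetical sketch in Python: an order state machine whose allowed transitions are data, plus tests that name the invariant they protect (all names are illustrative, not from the thread):

```python
# Hypothetical order lifecycle; the allowed-transition table IS the invariant.
ALLOWED = {
    "new": {"paid", "cancelled"},
    "paid": {"shipped", "refunded"},
    "shipped": {"delivered"},
}

def transition(state: str, event: str) -> str:
    """Advance the order state, rejecting any transition not in ALLOWED."""
    if event not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {event}")
    return event

# Specific, legible tests: each one names the transition or invariant it checks.
def test_paid_order_can_ship():
    assert transition("paid", "shipped") == "shipped"

def test_delivered_is_terminal():
    try:
        transition("delivered", "refunded")
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Tests like these stay readable to a human reviewer even when the surrounding application code was machine-sketched.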
@jmeowmeow You're a few steps behind what I'm seeing. I can't even have conversations about how to build a system or implement an idea with some people if they aren't asking Claude what to do. They gave up the small decisions, then the medium ones, and now they can't even think about problems without an LLM prompting *them*. It's like a prion disease carving away bits of their brains until nothing is left.
-
@jmeowmeow You're a few steps behind what I'm seeing. I can't even have conversations about how to build a system or implement an idea with some people if they aren't asking Claude what to do. They gave up the small decisions, then the medium ones, and now they can't even think about problems without an LLM prompting *them*. It's like a prion disease carving away bits of their brains until nothing is left.
@headius That's painful and outside my present experience. I wonder what a software job interview looks like these days?
-
@brandonscript @timbray I'd like to see a setup where I still do the work of coding and the AI is solely a pairing partner. Delegating the entire process to an LLM including the actual construction removes all of my agency from the process, and delegating small decisions to LLM eventually leads to delegating all of them. If you don't exercise your critical thinking abilities you will lose them. Too many developers are choosing that path.
@headius I did that by doing ping-pong with the LLM. The LLM writes the failing test case. I make it pass, making many decisions along the way, some invalidating the test. Then we switch roles: I write the next failing test case, and it brings us back to green.
This works because beforehand I've planned things out in great detail, with the LLM's help, and made many of the big decisions up front.
It works, but the LLM must populate its context, so the first few turns are slow. Then it gets better.
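The ping-pong loop described above can be sketched as alternating turns over one file; a minimal, hypothetical Python example (the `slugify` feature and all names are illustrative, not from the thread):

```python
# Ping-pong TDD sketch (hypothetical example).
# Turn 1 (LLM, RED): the LLM writes a failing test for a feature
# that does not exist yet.
def test_slugify_replaces_spaces_with_hyphens():
    assert slugify("hello world") == "hello-world"

# Turn 2 (human, GREEN): the human implements just enough to pass,
# making the small design decisions (lowercasing, collapsing runs
# of whitespace) along the way.
def slugify(text: str) -> str:
    return "-".join(text.lower().split())

# Turn 3 (human, RED): the human writes the next failing test;
# on its turn, the LLM brings the suite back to green.
def test_slugify_strips_punctuation():  # intentionally failing for now
    assert slugify("Hello, world!") == "hello-world"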
-
@brandonscript @timbray I'd like to see a setup where I still do the work of coding and the AI is solely a pairing partner. Delegating the entire process to an LLM including the actual construction removes all of my agency from the process, and delegating small decisions to LLM eventually leads to delegating all of them. If you don't exercise your critical thinking abilities you will lose them. Too many developers are choosing that path.
-
@headius That's painful and outside my present experience. I wonder what a software job interview looks like these days?
@jmeowmeow @headius at $workplace our interviewing process is still the same (a bit of live coding on a couple of simple problems, a bit of sketching out an architecture, some questions and chat).
My colleagues (I haven't conducted an interview in a while) report that candidates do seem "dumber" on average than they once were, but of course that's anecdotes and feelings, not hard data.
(I use LLMs and don't feel I'm getting less capable, but maybe I am.)
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

@headius AI dementia. That's a good expression for it.
-
@headius I did that by doing ping-pong with the LLM. The LLM writes the failing test case. I make it pass, making many decisions along the way, some invalidating the test. Then we switch roles: I write the next failing test case, and it brings us back to green.
This works because beforehand I've planned things out in great detail, with the LLM's help, and made many of the big decisions up front.
It works, but the LLM must populate its context, so the first few turns are slow. Then it gets better.
-
@jmeowmeow You're a few steps behind what I'm seeing. I can't even have conversations about how to build a system or implement an idea with some people if they aren't asking Claude what to do. They gave up the small decisions, then the medium ones, and now they can't even think about problems without an LLM prompting *them*. It's like a prion disease carving away bits of their brains until nothing is left.
@headius @jmeowmeow Yeah, I'm starting to feel it's harder to discuss work away from the workstations, like by the coffee machines or at lunch. Even when colleagues are competent, they no longer carry the state of the software in their heads.
-
@pointlessone True, which is why I vary. Sometimes I start on RED; sometimes (when I'm low on energy) I let the LLM start on RED. The LLM kickstarts my brain, then I "fix" its ... mistakes is too strong a word.
Anyway, I've done that 3-4 times. It's not like it's my regular workflow yet, but it's something I've generally been happy with.
-
@headius The fundamentals of software engineering become more relevant and important when application and feature code becomes cheap to sketch.
Cheap to sketch, often illegibly and inconsistently.
Static analysis, performance monitoring, continuous integration, and ops visibility have to be up to the load of doubtful application code.
Not to leave aside the importance of specific and legible tests of system transitions and invariants.
@jmeowmeow@hachyderm.io @headius@mastodon.social
Cheap to sketch, often illegibly and inconsistently.
This! I have to continuously redirect LLMs to follow conventions and place code and logic in sane places so the project structure remains solid. Otherwise, everything would be randomly scattered throughout the project.
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

@headius or developers who can't seem to have a technical discussion on anything without feeding your input to Claude first to figure out what they should think about it.
-
@jmeowmeow You're a few steps behind what I'm seeing. I can't even have conversations about how to build a system or implement an idea with some people if they aren't asking Claude what to do. They gave up the small decisions, then the medium ones, and now they can't even think about problems without an LLM prompting *them*. It's like a prion disease carving away bits of their brains until nothing is left.
@headius @jmeowmeow
I immediately get annoyed when I see someone start a reply to a question with "Claude said...". I don't want to know what string of tokens the LLM free-associated with the question; I want to know what the human I asked thinks.
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of modern software development. They have become so calcified and unable to embrace new skills, relying on orthodoxy and conservative software development, that they can't even innovate or learn new, more efficient and challenging techniques. Feels like losing friends to dementia.

-
@brandonscript @timbray I'd like to see how your workflow is structured.
-
@jmeowmeow You're a few steps behind what I'm seeing. I can't even have conversations about how to build a system or implement an idea with some people if they aren't asking Claude what to do. They gave up the small decisions, then the medium ones, and now they can't even think about problems without an LLM prompting *them*. It's like a prion disease carving away bits of their brains until nothing is left.
When I see descriptions like this, one thing that comes to mind is that COVID can cause brain damage. As well as LLMs diminishing the incentive to think, they might become more tempting to rely on as cognitive skills deteriorate from repeated infections.
In other words, there might be more than one thing happening.
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

@headius so how are programs debugged?
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

@headius at work, some coworkers who used to be quite good at programming ended up prompting a lot to figure out how to code simple currency exchange logic.
All the necessary data for that task was already in the program. No API calls to get the values of each currency, nor anything like that, were even necessary.
-
There's a disturbing trend of developers I've known for years now suddenly being unable to discuss even basic aspects of software development. They have become so dependent on LLMs, they can't even describe how to design a system anymore. Feels like losing friends to dementia.

@headius damn that's scary. I'm quite a hater of the whole LLM thing and the deskilling is certainly one of my concerns but I did not believe that some of it would be so immediate, so fast. WTAF
-
@brandonscript @timbray I'd like to see a setup where I still do the work of coding and the AI is solely a pairing partner. Delegating the entire process to an LLM including the actual construction removes all of my agency from the process, and delegating small decisions to LLM eventually leads to delegating all of them. If you don't exercise your critical thinking abilities you will lose them. Too many developers are choosing that path.
@headius I had some limited success using an agent in Ask mode with a new to me codebase. It can go read a bunch of files faster than me and can give relatively OK answers to questions like where can I find auth code or what does this big undocumented class do.