The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit.
-
@raymaccarthy @tante @map AI can't create. You create, AI just implements it. I know it is hard to digest but this is everyday work for a lot of us now.
-
@raymaccarthy @tante @map Ok, feel free to think whatever you want.
-
@343max @tante
I've designed & written SW for decades and done physical AI courses as well as studying it.
What's your qualification for your amazing claims Max?
Expert systems were AI in the 1980s and relied on good design and curation of the knowledge of experts. They were too expensive to build, and fragile.
I forecast the idea of the LLM 20+ years ago. Chatbots then had their data encoded in the program (Eliza, ALICE etc). I suggested a statistical engine using the Internet as data. A toy.
-
@raymaccarthy You are absolutely right, I really shouldn't trust my own day-to-day experience and the experience of all the people that I trust over your 20-year-old predictions. We are all wrong, our eyes betrayed us, please help us see! Oh please!
-
@raymaccarthy @343max @tante AI has, in theory, the same potential as a calculator. It can make tasks easier, but it does imply skill degradation in a certain field. Solving nth-order differential equations back in Blaise Pascal's time was frustrating. So since Schickard, automating such tasks has helped humanity spend its time on the bigger picture instead of grinding through repetitive tasks. New technologies have always shifted human skills to a new domain.
-
The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit.
But you do pay a hefty price: in lowered quality (security issues, less maintainable code), in skill decay in the people "guiding" the stochastic parrots, etc.
It's not "can 'AIs' create software" but "are we willing to accept worse software running more and more of our lives?"
@tante The best engineers I know just became more ambitious, and so should all of us.
I'll keep repeating this: there are tonnes of proprietary binary blobs in all of our tech. You can shout from the rooftops about how much you love your /e/OS phone, but if your phone's modem relies on a proprietary driver, it's pretty much worthless as "resistance against big tech". European digital sovereignty is equally worthless.
LLMs are good at staring at hexdumps, humans aren't. Use their advantage to build actually open tech.
> But you do pay a hefty price: In lowering quality (security issues, less maintainable), in skill decay in the people "guiding" the stochastic parrots, etc.
Skill issue, idk what more to say. I don't find it any different to managing juniors and reviewing their PRs. Bad code is bad code.
-
Here's the thing: I believe that you deserve to have access to high quality products and services. You deserve to use products and services that are safe, secure, well-designed and not destroying the ecological, informational or social environment.
+1
> "destroying the ecological, informational or social environment"
As good as this generated code may be, it remains unacceptable because of that. And that should be the ultimate reason (as the code quality may rise, but at the cost of more destruction)
-
5 years retired from IT but I still remember CRUD
-
@tante Hot take: there should be no "software writing" involved in CRUD to begin with. Just some declarative stuff for your specific application and fully generic code.
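The "declarative stuff plus fully generic code" idea above can be sketched in a few lines. This is a minimal, hypothetical illustration (the names `Resource` and `CrudStore` are my own, not from any post in the thread): the application is described purely as data, and a single generic engine supplies create/read/update/delete for any such description.

```python
from dataclasses import dataclass
from typing import Any

@dataclass(frozen=True)
class Resource:
    """The declarative part: nothing but a name and a field list."""
    name: str
    fields: tuple[str, ...]

class CrudStore:
    """The fully generic part: works for any Resource, no per-app code."""
    def __init__(self, resource: Resource) -> None:
        self.resource = resource
        self._rows: dict[int, dict[str, Any]] = {}
        self._next_id = 1

    def create(self, **values: Any) -> int:
        # Reject fields the declaration doesn't know about.
        unknown = set(values) - set(self.resource.fields)
        if unknown:
            raise ValueError(f"unknown fields: {unknown}")
        rid = self._next_id
        self._next_id += 1
        self._rows[rid] = values
        return rid

    def read(self, rid: int) -> dict[str, Any]:
        return self._rows[rid]

    def update(self, rid: int, **values: Any) -> None:
        self._rows[rid].update(values)

    def delete(self, rid: int) -> None:
        del self._rows[rid]

# The only application-specific line is the declaration itself:
books = CrudStore(Resource("book", ("title", "author")))
rid = books.create(title="Permutation City", author="Greg Egan")
books.update(rid, title="Diaspora")
```

In-memory storage stands in for a real database here; tools like PostgREST apply the same principle by generating an API directly from a schema.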
-
@tante Between development speed and software quality, the industry chose speed long before LLMs came to be. It would no doubt be better for all of us if we all did not let LLMs write software. But whether not using them is viable for a specific company or a specific individual in a competitive environment remains an open question.
-
@gklka @raymaccarthy @tante @map I agree with GK on this. Not all AI is the same, and it's definitely not black and white. With the right expertise and detailed specs, you can achieve great results while keeping the code maintainable and retaining ownership. I really dislike the mindset that everything has to be either absolutely good or 100% bad.