@hanshuebner @dalias I'm apprehensive about trying a full agent-based workflow because I'm afraid I'll be so dazzled by what it can do (unreliably) via brute force that I'll let my guard down in terms of evaluating it critically.
-
@hanshuebner @matt Yes it can hurt to try them. They are cognitohazards and are designed to make you think they're doing things they're not. This works on a lot of people, even people who think themselves very intelligent and thereby immune.
This is how we end up with folks praising them while putting out clearly worse writing, code, etc. that nobody wants.
-
It's clear that AI assisted coding is dividing developers (welcome to the culture wars!). I've seen a few blog posts now that talk about how some people just "love the craft", "delight in making something just right, like knitting", etc, as opposed to people who just "want to make it work". As if that explains the divide.
How about this: some people resent the notion of being a babysitter to a stochastic token machine, hastening their own cognitive decline. Some people resent paying rent to a handful of US companies, all coming directly out of the TESCREAL human extinction cult, to be able to write software. Some people resent the "worse is better" steady decline of software quality over the past two decades, now supercharged. Some people resent that the hegemonic computing ecosystem is entirely shaped by the logic of venture capital. Some people hate that the digital commons is walled off and sold back to us. Oh, and I guess some people also don't like the thought of making coding several orders of magnitude more energy intensive during a climate emergency.
But sure, no, it's really because we mourn the loss of our hobby.
@plexus Let's not forget that code output by LLMs is probably not subject to copyright (and reflects training on copyrighted works), making it a legal landmine for any project or product that includes output from LLMs.
-
@hanshuebner @matt You realize you sound exactly like a drug dealer, right?
-
I have, and it failed to complete the task AND "lied" to me at the same time.
-
@schaueho @hanshuebner @grishka
I will never begrudge a person's decision to boycott LLM usage.
But I do grow weary of folk on Mastodon earnestly insisting that "the flaws in LLMs will somehow all be laid bare, and handcrafted, artisanal code is somehow inherently superior"
Y'all cheering for John Henry without understanding that this is a job that's actually very well suited for a machine.
@dusk
OT, but cool John Henry video: https://www.youtube.com/watch?v=kt9NSMZR0dM
-
@BoydStephenSmithJr @grishka If you have the expectation that it should complete the task flawlessly and point out that it "lied", it seems that you have achieved your goal of showing that it did not work for you.
I've had many successes, and none of the things that I created magically collapsed or failed to work except under narrow circumstances. I had to spend time creating and improving them, but I would not have started them if I'd needed to write the code myself.
-
@tiotasram @matt @hanshuebner @dalias Honestly, I don't think even the AI labs are ignoring these issues. At the very least, Anthropic has been fairly up front about these concerns, reporting them in their research and survey work.
Their most recent one is pretty comprehensive: https://www.anthropic.com/features/81k-interviews
-
@plexus@toot.cat I mean, it can be both.
-
@plexus and i also feel we should be standing in solidarity with other affected professions to form a unified front against all generative "ai"
stand together with artists, writers, journalists, translators, etc etc against this morally corrupt technology
@lumi@snug.moe @plexus@toot.cat Absolutely. Hold the line. On all fronts.
-
I love to listen to virgins talk about sex.
It's very fun.
@n_dimension@infosec.exchange @plexus@toot.cat "You probably haven't even tasted shit before."
-
The point of "algorithmically driven social media" isn't to find what you want to see, it's to make you question what they don't want you to think and to affirm what they DO want you to think.
"All the objections to AI are aesthetic"
- boost the one guy in the comments talking about K&R C as being for a "more civilized age".
"Using AI in your tool chain is Russian Roulette"
- deboost speaker to everyone but zealots who will proceed to drown him in adhom.
-
@plexus speaking of adhom and sloppy arguments, Hi, Hans! Glad to see your LinkedIn Suicide vest is properly fitted.
-
@hanshuebner @plexus I think we all agree that this shit sucks, and many of us are familiar with the history of modern computing. I disagree that workers of the software industry can't spark change. We are probably the most privileged of the working class. So I would even argue it's our duty to do something with this privilege...
-
@matt @dalias "Our understanding" is often incomplete, so code ends up reflecting the process of understanding the task rather than the task itself. Code often suffers because the person working on it learned faster than they could, or would, refactor. The resulting reality is that code, by and large, is messy.
Not everyone works the same way, but it is certainly true that not everyone is a genius. Thus, bad human code prevails.
@hanshuebner @matt @dalias You seem to be arguing against LLM coding here. Because if you develop the understanding by working on the code, how do you discern that the output of an LLM actually solves the problem as intended?
@scheme I certainly did not try to argue that writing code is the only or the best way to understand the requirements. It is just one way.
When you don't write the code, you of course need to validate your requirements differently, for example by trying out the code, by formulating tests, or by realizing that you have difficulty creating a good prompt.
-
@jeffmcneill @plexus @ttntm I would add the caveat that, for some, just like the fabric weavers of old who just wanted to put food on the table using a skill they had (aka luddites) it WAS an existential crisis. They starved, their children got chewed up by the machines, it’s only their grandchildren that started to prosper from the new productivity that increased the size of pie.
-
@grishka Right on, and then consider that with the traditional mode of writing software, the cost of creating something that is good is very high.
I'd argue that with faster (machine assisted) software creation, it is easier to meet the needs of users because the cost of change is drastically reduced. I'm experiencing that with the systems I'm currently writing that way.
The whole argument that software written by humans is better holds no merit for me.
@hanshuebner @grishka May I suggest that software written by a skilled human with or without an llm tool is better than similarly complex software written by an unskilled human.
A skilled human with an llm tool will do just fine. A skilled human working with conventional tools will do fine. An unskilled human without an llm tool will generally fail. An unskilled human with an llm tool will generally be worse than the other three options.
This biases human code to be statistically better.
-
@tiotasram @matt @hanshuebner @dalias So you're saying you'd prefer slanted research as long as it favors your point of view that "AI is bad"? That wall of text you wrote basically oversimplifies everything to a negative bias.
Even Anthropic acknowledged that this is from Claude users, but you discount the weight of the opinions of people simply because of that?
And the sample size is more than sufficiently large to be considered rigorous.
