No, opposing LLMs isn't "purity culture."
-
I wouldn't be saying all this if it was just Doctorow, I'm even fine disagreeing with people I deeply respect. But he's not the only one saying shit like this, and I think it's worth calling out the broader rhetorical point.
Addendum: since this has now rather dramatically escaped containment, I want to quickly note that if you reply to this thread in a completely embarrassing way, I reserve the right to be at least a bit rude in my responses.
-
No, opposing LLMs isn't "purity culture." I've seen this now from quite a few different people, and I disagree vehemently. It is good, actually, to have moral principles and hold to them, even when people with more money than you find said principles annoying.
@xgranade You have to admit, though, that it's pretty impressive that "no thanks" is purity culture, and not "we need to keep sacrificing transistors and coal to manifest the libertarian god, and everybody who disagrees won't and shouldn't survive."
-
@codinghorror Anyway, this isn't the first time you've replied to me to make the argument that LLMs are just another kind of tool. I suspect we won't see eye-to-eye on that, especially as my work has been abused to make LLM products.
I hope we can agree though, that my objection *even though you disagree with it* is principled and neither knee jerk nor purity culture.
@xgranade LLMs told me something critical about my health that no healthcare professional -- and I have a whole team working on me, because I'm bonkers -- ever did. If you want to ask, ask, I can provide very detailed citations and proof.
-
@codinghorror Sure, but we're not talking about "which tool is best for driving a nail that I own into a wall that I own," we're talking about "is it ethical to use a technology built on fascist ideology and stolen work, that carries unconscionable environmental costs, and that's used to disrupt labor movements to perform a task that that technology is fundamentally unsuited to?"
It's quite fair to have a very firm "no" by way of answer to the second question.
@xgranade fair; I want to be alive, see earlier response.
-
@xgranade LLMs told me something critical about my health that no healthcare professional -- and I have a whole team working on me, because I'm bonkers -- ever did. If you want to ask, ask, I can provide very detailed citations and proof.
@codinghorror I'm not a doctor (well, not that *kind* of doctor, anyway), so I'll absolutely admit that I'm not the right person to evaluate those citations. I'll say that from a pretty damned nontrivial degree of expertise with machine learning, I would find it extremely surprising if *on average* text recombination without any underlying semantic model yielded useful advice more commonly than outright dangerous advice.
-
@codinghorror I'm not a doctor (well, not that *kind* of doctor, anyway), so I'll absolutely admit that I'm not the right person to evaluate those citations. I'll say that from a pretty damned nontrivial degree of expertise with machine learning, I would find it extremely surprising if *on average* text recombination without any underlying semantic model yielded useful advice more commonly than outright dangerous advice.
@codinghorror Like, nothing about LLMs and the theory behind them prevents anyone from getting lucky — and I'm glad that you got lucky instead of the much more common and probable case. But that doesn't mean that they're anything other than outright terrifyingly dangerous in a medical context more generally.
-
@codinghorror I'm not a doctor (well, not that *kind* of doctor, anyway), so I'll absolutely admit that I'm not the right person to evaluate those citations. I'll say that from a pretty damned nontrivial degree of expertise with machine learning, I would find it extremely surprising if *on average* text recombination without any underlying semantic model yielded useful advice more commonly than outright dangerous advice.
@xgranade email me if you want to know. I have a rare set of DNA in some cases, as it turns out.
-
@xgranade "You don't want to use the lie machine powered by mulching puppies? What are you, some kind of purist?"
Do you eat chicken? Do you know how the chicken industry mulches all the rooster chicks?
Not to defend LLM use, but I am starting to get tired of the PETA-esque rhetoric. Do they really mulch animals? No. Do they have negative impacts in other ways? Yes.
Is it that hard to focus on real impacts?
-
@codinghorror Like, nothing about LLMs and the theory behind them prevents anyone from getting lucky — and I'm glad that you got lucky instead of the much more common and probable case. But that doesn't mean that they're anything other than outright terrifyingly dangerous in a medical context more generally.
@xgranade people should absolutely be taught all the pros and cons, but I really dislike absolutism and zealotry.. it's not useful, it's not practical, it accomplishes nothing (except in the very narrow cases of civil rights or human dignity). If I wanted more ones and zeroes, I'd own more computers..
-
@xgranade people should absolutely be taught all the pros and cons, but I really dislike absolutism and zealotry.. it's not useful, it's not practical, it accomplishes nothing (except in the very narrow cases of civil rights or human dignity). If I wanted more ones and zeroes, I'd own more computers..
@xgranade and as I've said before, if you want to be angry, be angry at cryptocurrency which is gambling, grifters, and human trafficking to the bone. It's horrendous.
-
@xgranade and as I've said before, if you want to be angry, be angry at cryptocurrency which is gambling, grifters, and human trafficking to the bone. It's horrendous.
@codinghorror @xgranade The push for LLM inevitability is all the same people as cryptocurrency. That should tell you something about LLMs. It certainly tells me something.
-
@xgranade and as I've said before, if you want to be angry, be angry at cryptocurrency which is gambling, grifters, and human trafficking to the bone. It's horrendous.
@codinghorror @xgranade we can be angry at multiple things
-
@codinghorror @xgranade we can be angry at multiple things
@codinghorror @xgranade your persistent sea lioning, for example
-
@xgranade people should absolutely be taught all the pros and cons, but I really dislike absolutism and zealotry.. it's not useful, it's not practical, it accomplishes nothing (except in the very narrow cases of civil rights or human dignity). If I wanted more ones and zeroes, I'd own more computers..
@codinghorror @xgranade All of the "zealotry", including sea lioning, is from the people who want to force us to give their precious slop machines a fair chance. Wanting to be left alone by that shit, not to have people submitting PRs and bug reports with fraudulent provenance to our projects, wanting not to have our time wasted reading slop nobody actually wrote, wanting not to have our servers hammered by gigabits per second of scraper hits, etc. isn't called "zealotry". It's called boundaries. Something tech bro culture refuses to understand.
-
@codinghorror @xgranade All of the "zealotry", including sea lioning, is from the people who want to force us to give their precious slop machines a fair chance. Wanting to be left alone by that shit, not to have people submitting PRs and bug reports with fraudulent provenance to our projects, wanting not to have our time wasted reading slop nobody actually wrote, wanting not to have our servers hammered by gigabits per second of scraper hits, etc. isn't called "zealotry". It's called boundaries. Something tech bro culture refuses to understand.
@codinghorror @xgranade Among the folks who came out of that culture, you're one of the few I largely respect and consider decent. But you really need to realize sometime that the whole culture was rotten to the core in matters of consent and boundaries. And overall folks here on the fedi are done with that shit. We're not having it.
-
No, opposing LLMs isn't "purity culture." I've seen this now from quite a few different people, and I disagree vehemently. It is good, actually, to have moral principles and hold to them, even when people with more money than you find said principles annoying.
For all the folks complaining about “purity culture”:
What alternative do you propose?
Impurity culture?
We know what that looks like, now in millions-of-documents kind of detail…
