the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
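For what it's worth, the sigmoid-vs-exponential confusion is easy to demonstrate numerically: well before its inflection point, a logistic curve tracks a pure exponential almost exactly; the two only diverge as the ceiling approaches. A minimal sketch (all parameter values here are arbitrary):

```python
import math

def logistic(t, L=1.0, k=1.0, t0=10.0):
    """Logistic (sigmoid) curve: exponential-looking at first, then saturates at L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

def exponential(t, L=1.0, k=1.0, t0=10.0):
    """The pure exponential that matches the logistic's early-phase growth."""
    return L * math.exp(k * (t - t0))

# Well below the inflection point t0 the two are nearly identical;
# past t0 the sigmoid flattens while the exponential keeps climbing.
for t in [0, 4, 8, 12, 16]:
    print(f"t={t:2d}  sigmoid={logistic(t):.6f}  exponential={exponential(t):.2f}")
```

If you only ever observe the curve at t well below t0, no amount of curve-fitting will distinguish the two to any practical precision; the difference is in the mechanism, not the early data.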
-
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
@glyph yes, but to destroy is so much easier than to create. I worry that some moron might put spicy autocorrect in charge of a hydro dam or one of those shooty tooties the US has all over Europe. It wouldn't take much for Musk to (accidentally on purpose) Heinlein us all with Starlink.
-
@glyph yes, but to destroy is so much easier than to create. I worry that some moron might put spicy autocorrect in charge of a hydro dam or one of those shooty tooties the US has all over Europe. It wouldn't take much for Musk to (accidentally on purpose) Heinlein us all with Starlink.
@pkraus there are lots of very scary things happening right now, it's just that "swarms of killer robots with minds beyond our comprehension" are not among them
-
casual thinkpieces and lazy attempts at scicomm are what has set me off but the actual thing I'm mad about is that we are ruled by people with a child's understanding of the world and the economy and that's actually really bad
@glyph reading this thread was a great cap to my evening, thanks
-
@glyph my assertion was that the singularity, as described by ray kurzweil, accurately describes the invention of writing, and i don't see why it would be more interesting if the self-improving intelligent mechanism were made of etched silicon instead of CHNOPS nanomachines. it is harder for etched silicon to self-reproduce, anyway. the CHNOPS nanomachines just do that.
i think human advancement *has* followed an exponential-*looking* curve since that point, albeit with a low base.
-
@glyph reading this thread was a great cap to my evening, thanks
@darkuncle very kind of you to say so, thanks
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
@glyph If you study population ecology, you learn there are two outcomes of exponential growth. Sigmoid is the pretty one. Spike-and-crash is the common one.
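The spike-and-crash outcome is easy to reproduce with a standard population-ecology model. A minimal sketch using the Ricker map (parameters are arbitrary, with the growth rate deliberately set high enough to put the dynamics in the overshoot regime):

```python
import math

def ricker_step(n, r=3.0, K=1000.0):
    """One generation of the Ricker population model.

    At low intrinsic growth rates r the population eases into the
    carrying capacity K (the pretty sigmoid); at high r it repeatedly
    overshoots K and crashes.
    """
    return n * math.exp(r * (1.0 - n / K))

# boom-bust dynamics at a high intrinsic growth rate
pop, trajectory = 10.0, []
for _ in range(20):
    trajectory.append(pop)
    pop = ricker_step(pop)

peak, trough = max(trajectory), min(trajectory[3:])
print(f"peak ~ {peak:.0f}, post-boom trough ~ {trough:.0f}")
```

Same equation, same carrying capacity; only the growth rate decides whether you get the pretty curve or the wreckage.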
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
@glyph
People also forget that the definition of singularity was simply a point beyond which we have no hope of making any accurate predictions.
Reaching the singularity didn't necessarily mean that we would suddenly get AGI or extropian uploading or any of the myriad other things other science fiction authors layered on it or ascribed to it.
That original definition might still apply to a sigmoid, but obviously it's much less certain.
-
doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
@glyph that and also they're all slop machines that generate shit in the first place even when begged not to screw up
-
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
@glyph I think the closest worry I can see is more a logistical collapse due to semiautomation causing massive planning issues
A real life equivalent to “ah why are my servers all falling over…. Oh disk space” but for some planning processes all optimizing on some weird axis.
Not a singularity so much as just a bunch of pain from us shifting more and more into automated decision making and having fewer eyeballs on intermediate results. Still… humans will be in the loop in so many spots!
-
casual thinkpieces and lazy attempts at scicomm are what has set me off but the actual thing I'm mad about is that we are ruled by people with a child's understanding of the world and the economy and that's actually really bad
@glyph really good to read a sane alternative to what is usually said in the media about AI
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
@glyph yeah it's the rapture for people who find computers easier to believe in than old men
-
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
@glyph The only scenario I’ve found interesting is the idea that a sufficiently advanced AI doesn’t need to replace the people, just be so amazingly perceptive that it can convince, blackmail, or threaten anyone it can communicate with into doing anything it wanted.
It’s a great idea… when I read it in 2000AD comics. But only good enough to be my third favourite series after Judge Dredd and Rogue Trooper, not something that keeps me up at night.
-
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
@glyph I've seen enough movies to know that the whole thing will come crashing down due to a very tiny inconsequential unnoticed design flaw. You know, like an expired SSL certificate.
-
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
@glyph This is a great thread but it IS scary to consider that there absolutely would be police standing guard over it until it can be fixed, people saying “If we don't repair the transforming killing machine, China will,” an op-ed in the NYT headed “My Don’t-Want-To-Be-Killed-By-a-Smirking-Robert-Patrick Friends Are Crazy,” principals signing deals with Google to have murderbots stalk classrooms (guardrails: only kill kids named John Connor), &c
-
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
@glyph skynet was so intelligent, they built terminators so efficiently they run on bare 6502s; they don't even need nvidia GPUs.
LLMs are not even close.
-
doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
@glyph Another counterpoint: every single zombie apocalypse scenario, where the collapse of human infrastructure and supply chains is so absolute that even if the zombies disappeared overnight, recovery would still take years, if not decades.
-
if, in order to achieve your out-of-control doomsday robot scenario, a trillion dollars worth of human effort must be expended annually, and if any of it stops for even a moment then the whole thing implodes and grinds to a halt, _you can stop worrying_ that it is "the machines" which dominate us
@glyph above all, if people believe the singularity is scary, why the fuck do they invest a trillion $/yr to try to reach it? At that point, we should try to convince them that a bigger CERN could really provoke a black hole on earth.
Won’t work either, but at least we’ll have something useful at the end for a fraction of the price!
-
like if anyone had halfway-plausible "grey goo" nanotech that could do anything that looked like computation, that might be worrying. a locally viable self-reproducing platform that can make another one of itself from a pile of dirt, even if it's like, special dirt, that might scare me a little bit. but an overlord hive-mind that requires an uninterrupted global high-purity helium supply chain just to make ONE more of itself is supposed to be a threat?
@glyph Goddammit, this is twice in a row I'm forced to root for, of all things, the government of Iran.
Edit: For context, a lot of the world's helium trade goes through, you guessed it, the Strait of Hormuz.
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
@glyph Believing LLM chatbots will achieve singularity is like someone believing teleportation and manufacture-anything-machines are right around the corner because they once saw a magician perform a magic trick.