CIRCLE WITH A DOT

the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion

Uncategorized · 67 posts · 35 posters
This topic has been deleted. Only users with topic management privileges can see it.
  • mcc@mastodon.social, following their own earlier post ("@glyph however i *am* scared of the economic system that would bother to do that"):

    mcc@mastodon.social (#29):

    @glyph this is how you turn a post-scarcity society into … … whatever it is we're doing now
  • glyph@mastodon.social:

    we are not even remotely close to a single LLM meaningfully constructing even a portion of the pipeline to train another LLM. you can sort of argue around the edges that maybe under certain synthetic conditions this is borderline possible now, but on the "singularity" progress bar, that is 0.5%

    darkuncle@infosec.exchange (#30):

    @glyph it's like saying that the Parker solar probe being the fastest thing ever produced by humanity, at nearly 500,000 mph, is a material step towards relativistic travel.
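The scale gap behind the analogy in #30 is easy to check with back-of-the-envelope arithmetic; a minimal Python sketch, taking the approximate 500,000 mph figure quoted above at face value:

```python
# Rough scale check for the analogy above (figures approximate).
C_M_PER_S = 299_792_458               # speed of light, m/s
PROBE_MPH = 500_000                   # Parker Solar Probe peak speed, per the post
MPH_TO_M_PER_S = 0.44704              # exact conversion factor

probe_m_per_s = PROBE_MPH * MPH_TO_M_PER_S
fraction_of_c = probe_m_per_s / C_M_PER_S

print(f"{fraction_of_c:.5f}")  # ~0.00075, i.e. under 0.1% of light speed
```

Even the fastest human artifact sits around three orders of magnitude short of relativistic speeds, which is the point of the comparison.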
  • theorangetheme@en.osm.town:

    @glyph Ah yes, the Singularity: a thing that its religious adherents can't define but which will almost certainly be ushered in by chatbots that tell you to put glue on pizza.

    Put me on Artemis III, I'm done here.

    scott@sfba.social (#31):

    @theorangetheme @glyph You know Artemis III is just a quick out-and-back, right? You may prefer to be scheduled for one of the longer distance missions. 😜
  • glyph@mastodon.social:

    RE: https://mastodon.social/@glyph/115076275195904439

    I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy

    pkraus@berlin.social (#32):

    @glyph yes, but to destroy is so much easier than to create. I worry that some moron might put spicy autocorrect in charge of a hydro dam or one of those shooty tooties the US has all over Europe. It wouldn't take much for Musk to (accidentally on purpose) Heinlein us all with Starlink.
  • pkraus@berlin.social (#32, quoted above):

    glyph@mastodon.social (#33):

    @pkraus there are lots of very scary things happening right now, it's just that "swarms of killer robots with minds beyond our comprehension" are not among them
  • glyph@mastodon.social:

    casual thinkpieces and lazy attempts at scicomm are what has set me off but the actual thing I'm mad about is that we are ruled by people with a child's understanding of the world and the economy and that's actually really bad

    darkuncle@infosec.exchange (#34):

    @glyph reading this thread was a great cap to my evening, thanks
  • mcc@mastodon.social:

    @glyph my assertion was that the singularity, as described by ray kurzweil, accurately describes the invention of writing, and i don't see why it would be more interesting if the self-improving intelligent mechanism were made of etched silicon instead of CHNOPS nanomachines. it is harder for etched silicon to self-reproduce, anyway. the CHNOPS nanomachines just do that.

    i think human advancement *has* followed an exponential-*looking* curve since that point, albeit with a low base.

    darkuncle@infosec.exchange (#35):

    @mcc @glyph agree all along but also highly recommend “The Exponential Age” as a good read. Part of the problem with exponential growth is our tendency to assume it will continue.
  • darkuncle@infosec.exchange (#34, quoted above):

    glyph@mastodon.social (#36):

    @darkuncle very kind of you to say so, thanks
  • glyph@mastodon.social (the thread-title post, quoted above):

    zenkat@sfba.social (#37):

    @glyph If you study population ecology, you learn there are two outcomes of exponential growth. Sigmoid is the pretty one. Spike-and-crash is the common one.
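The sigmoid-vs-exponential confusion that the thread title and #37 describe can be sketched numerically: while a logistic (sigmoid) curve is far below its carrying capacity, it is nearly indistinguishable from a pure exponential, and only later does it saturate. A minimal Python sketch with illustrative constants (growth rate and carrying capacity chosen arbitrarily):

```python
import math

def exponential(t, r=0.5):
    """Pure exponential growth: x(t) = e^(r*t), starting from x(0) = 1."""
    return math.exp(r * t)

def logistic(t, r=0.5, K=1e6):
    """Logistic (sigmoid) growth with carrying capacity K, x(0) = 1."""
    return K / (1 + (K - 1) * math.exp(-r * t))

# Early on, the two curves are nearly identical: the relative gap stays tiny.
for t in (0, 5, 10):
    e, s = exponential(t), logistic(t)
    print(t, e, s, abs(e - s) / e)

# Later, the exponential explodes while the sigmoid saturates near K.
print(exponential(40))  # ~4.9e8
print(logistic(40))     # just under K = 1e6
```

The practical upshot matches the thread's point: observing exponential-looking growth tells you nothing about whether you are on an exponential or merely on the early stretch of a sigmoid.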
  • glyph@mastodon.social (the thread-title post, quoted above):

    brouhaha@mastodon.social (#38):

    @glyph
    People also forget that the definition of the singularity was simply a point beyond which we have no hope of making any accurate predictions.
    Reaching the singularity didn't necessarily mean that we would suddenly get AGI or extropian uploading or any of the myriad other things science fiction authors layered on it or ascribed to it.
    That original definition might still apply to a sigmoid, but obviously it's much less certain.
  • glyph@mastodon.social:

    doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day

    f4grx@chaos.social (#39):

    @glyph that, and also they're all slop machines that generate shit in the first place even when begged not to screw up
  • glyph@mastodon.social ("I've written about this before…", quoted above):

    raphael@mastodon.sdf.org (#40):

    @glyph I think the closest worry I can see is more a logistical collapse due to semiautomation causing massive planning issues.

    A real-life equivalent to “ah why are my servers all falling over…. Oh disk space” but for some planning processes all optimizing on some weird axis.

    Not a singularity so much as just a bunch of pain from us shifting more and more into automated decision making and having fewer eyeballs on intermediate results. Still… humans will be in the loop in so many spots!
  • brouhaha@mastodon.social (#38, quoted above):

    sea1am@mastodon.social (#41):

    @brouhaha @glyph

    I thought the term Singularity was in some way a reference to the romantic lives of tech CEOs.

    You learn something new every day.
  • glyph@mastodon.social ("casual thinkpieces…", quoted above):

    clarkiestar@mas.to (#42):

    @glyph really good to read a sane alternative to what is usually said in the media about AI
  • glyph@mastodon.social (the thread-title post, quoted above):

    ireneista@adhd.irenes.space (#43):

    @glyph yeah it's the rapture for people who find computers easier to believe in than old men
  • glyph@mastodon.social ("I've written about this before…", quoted above):

    semanticist@mastodon.social (#44):

    @glyph The only scenario I’ve found interesting is the idea that a sufficiently advanced AI doesn’t need to replace the people, just be so amazingly perceptive that it can convince, blackmail, or threaten anyone it can communicate with into doing anything it wanted.

    It’s a great idea… when I read it in 2000AD comics. But only good enough to be my third favourite series after Judge Dredd and Rogue Trooper, not something that keeps me up at night.
  • glyph@mastodon.social:

    seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?

    dabeaz@mastodon.social (#45):

    @glyph I've seen enough movies to know that the whole thing will come crashing down due to a very tiny inconsequential unnoticed design flaw. You know, like an expired SSL certificate.
  • glyph@mastodon.social ("seriously just imagine the plot…", quoted above):

    nosword@localization.cafe (#46):

    @glyph This is a great thread but it IS scary to consider that there absolutely would be police standing guard over it until it can be fixed, people saying “If we don't repair the transforming killing machine, China will,” an op-ed in the NYT headed “My Don’t-Want-To-Be-Killed-By-a-Smirking-Robert-Patrick Friends Are Crazy,” principals signing deals with Google to have murderbots stalk classrooms (guardrails: only kill kids named John Connor), &c
  • glyph@mastodon.social ("seriously just imagine the plot…", quoted above):

    f4grx@chaos.social (#47):

    @glyph skynet was so intelligent, they built terminators so efficiently they run on bare 6502s; they don't even need NVIDIA GPUs.

    LLMs are not even close.
  • glyph@mastodon.social ("doomers might look at my rant…", quoted above):

    glennseto@mastodon.social (#48):

    @glyph Another counterpoint: every single zombie apocalypse scenario, where the collapse of human infrastructure and supply chains is so absolute that even the zombies disappearing overnight would still leave years, if not decades, of recovery.