the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion

Uncategorized · 67 Posts · 35 Posters
#19 · petrillic@hachyderm.io, replying to glyph@mastodon.social:

> put ME on CNN and MSNBC, you cowards.

@glyph i would pay for that.
#20 · miss_rodent@girlcock.club, replying to glyph@mastodon.social:

> doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day

@glyph Ants are self-sustaining, self-reproducing, more intelligent than any AI humans have managed to make, and capable of directly altering the physical world.
If 'self-sustaining' were really the break-point, humans lost well before we existed as a species.
#21 · glyph@mastodon.social:

like if anyone had halfway-plausible "grey goo" nanotech that could do anything that looked like computation, that might be worrying. a locally viable self-reproducing platform that can make another one of itself from a pile of dirt, even if it's like, special dirt, that might scare me a little bit. but an overlord hive-mind that requires an uninterrupted global high-purity helium supply chain just to make ONE more of itself is supposed to be a threat?
#22 · dpnash@c.im, replying to glyph@mastodon.social:

> RE: https://mastodon.social/@glyph/115076275195904439
>
> I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy

@glyph Or any other subject, really.

Even in STEM.

Like the introductory biology class I took, with its toy population models that went sigmoid very quickly, simply because biologists understand that populations of living things hit barriers to growth. Or the control systems engineering class I took, where we figured out how to tell which parts of the system behavior would be good over the long term, which (to oversimplify only slightly) meant *no positive exponentials* anywhere in the math.
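The sigmoid point above is easy to see numerically. A minimal sketch (the growth rate `r`, starting value `x0`, and carrying capacity `K` are illustrative parameters, not anything from the thread): logistic growth with the same early rate tracks the exponential almost exactly at first, then saturates at the cap.

```python
import math

def exponential(t, x0=1.0, r=0.5):
    # unbounded exponential growth: x(t) = x0 * e^(r*t)
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.5, K=1000.0):
    # logistic (sigmoid) growth: same early rate, but capped at carrying capacity K
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

# Early on the two curves are nearly indistinguishable; later they diverge wildly.
for t in (0, 5, 10, 30):
    print(f"t={t:>2}  exponential={exponential(t):12.1f}  logistic={logistic(t):8.1f}")
```

Sampling only the early points gives no way to tell the two curves apart, which is exactly the confusion the thread title is about.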
#23 · glyph@mastodon.social:

seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
#24 · glyph@mastodon.social:

casual thinkpieces and lazy attempts at scicomm are what has set me off but the actual thing I'm mad about is that we are ruled by people with a child's understanding of the world and the economy and that's actually really bad
#25 · miss_rodent@girlcock.club, replying to glyph@mastodon.social:

@glyph Meanwhile, the actual potential (but mitigatable) doom - the methane generators poisoning the air and worsening the severity and frequency of climate disasters, the technofascists spending obscene amounts of money undermining governments and trying to radicalize large parts of the population while burning resources at a rate only a Captain Planet villain could find reasonable, etc. - goes largely unremarked upon -_-;
#26 · miss_rodent@girlcock.club:

@glyph so much energy, and so many articles, are going into the scifi-fanfiction doom that it's seemingly crowding out the actual, tangible, presently-addressable, imminent problems that have fuck-all to do with chatbot pseudogods, and everything to do with the people building them.
#27 · r343l@freeradical.zone, replying to glyph@mastodon.social:

> in order to be a singularity candidate, an AI would need to achieve vertical integration from silicon fabrication through logistics and integration, into operating systems and applications, with tight whole-system feedback from the robotics to the shipping to the power generation and back

@glyph With literally not even one minuscule step in the process dependent on a human doing things at human speed via flesh and blood movement. Because if any of those processes could be made to work without a human doing something physical, why wouldn’t the people with money have done that already??
#28 · mcc@mastodon.social, replying to glyph@mastodon.social:

@glyph however i *am* scared of the economic system that would bother to do that
#29 · mcc@mastodon.social:

@glyph this is how you turn a post-scarcity society into … … whatever it is we're doing now
#30 · darkuncle@infosec.exchange, replying to glyph@mastodon.social:

> we are not even remotely close to a single LLM meaningfully constructing even a portion of the pipeline to train another LLM. you can sort of argue around the edges that maybe under certain synthetic conditions this is borderline possible now, but on the "singularity" progress bar, that is 0.5%

@glyph it’s like saying that the Parker Solar Probe, the fastest thing ever produced by humanity at nearly 500,000 mph, is a material step towards relativistic travel.
#31 · scott@sfba.social, replying to theorangetheme@en.osm.town:

> @glyph Ah yes, the Singularity: a thing that its religious adherents can't define but which will almost certainly be ushered in by chatbots that tell you to put glue on pizza.
>
> Put me on Artemis III, I'm done here.

@theorangetheme @glyph You know Artemis III is just a quick out-and-back, right? You may prefer to be scheduled for one of the longer-distance missions. 😜
#32 · pkraus@berlin.social, replying to glyph@mastodon.social:

@glyph yes, but to destroy is so much easier than to create. I worry that some moron might put spicy autocorrect in charge of a hydro dam or one of those shooty tooties the US has all over Europe. It wouldn't take much for Musk to (accidentally on purpose) Heinlein us all with Starlink.
#33 · glyph@mastodon.social, replying to pkraus@berlin.social:

@pkraus there are lots of very scary things happening right now, it's just that "swarms of killer robots with minds beyond our comprehension" are not among them
#34 · darkuncle@infosec.exchange, replying to glyph@mastodon.social:

@glyph reading this thread was a great cap to my evening, thanks
#35 · darkuncle@infosec.exchange, replying to mcc@mastodon.social:

> @glyph my assertion was that the singularity, as described by ray kurzweil, accurately describes the invention of writing, and i don't see why it would be more interesting if the self-improving intelligent mechanism were made of etched silicon instead of CHNOPS nanomachines. it is harder for etched silicon to self-reproduce, anyway. the CHNOPS nanomachines just do that.
>
> i think human advancement *has* followed an exponential-*looking* curve since that point, albeit with a low base.

@mcc @glyph agree all along, but also highly recommend “The Exponential Age” as a good read. Part of the problem with exponential growth is our tendency to assume it will continue.
#36 · glyph@mastodon.social, replying to darkuncle@infosec.exchange:

@darkuncle very kind of you to say so, thanks
#37 · zenkat@sfba.social, replying to glyph@mastodon.social:

@glyph If you study population ecology, you learn there are two outcomes of exponential growth. Sigmoid is the pretty one. Spike-and-crash is the common one.
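The sigmoid-versus-spike-and-crash contrast can be sketched with a toy discrete-time model (the update rule and the parameters `r`, `n0`, and `K` are illustrative assumptions, not anything from the thread): a modest growth rate settles smoothly onto the carrying capacity, while an aggressive one overshoots it and then lurches between boom and bust.

```python
def step(n, r, K=1000.0):
    # discrete logistic update: growth slows as n approaches capacity K,
    # but a large r can carry the population past K in a single step
    return n + r * n * (1 - n / K)

def trajectory(r, n0=10.0, steps=40):
    ns = [n0]
    for _ in range(steps):
        ns.append(step(ns[-1], r))
    return ns

smooth = trajectory(r=0.5)  # sigmoid: settles monotonically just under K
spiky = trajectory(r=2.5)   # overshoots K, then oscillates between boom and bust
print(f"max smooth: {max(smooth):.0f}   max spiky: {max(spiky):.0f}")
```

The same equation produces both behaviors; only the growth rate differs, which is why extrapolating from the early (exponential-looking) segment tells you nothing about which outcome you get.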
#38 · brouhaha@mastodon.social, replying to glyph@mastodon.social:

@glyph
People also forget that the original definition of the singularity was simply a point beyond which we have no hope of making any accurate predictions.
Reaching the singularity didn't necessarily mean that we would suddenly get AGI or extropian uploading or any of the myriad other things science fiction authors layered on it or ascribed to it.
That original definition might still apply to a sigmoid, but obviously it's much less certain.