the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion

Uncategorized · 67 Posts · 35 Posters
  • snoopj@hachyderm.io #3, in reply to glyph@mastodon.social:

    @glyph 1001 ways in which ray kurzweil is a very unserious man and society has been irreparably damaged by people acting otherwise
  • glyph@mastodon.social #4, continuing from an earlier post ("resources run out. processes hit bottlenecks. optimizations reach physical limits. perpetual motion machines are impossible for reasons that are pretty well understood"):

    it is so mind-meltingly frustrating to see people think that we are close to a "singularity" with current AI technology. here's a hint about when you could worry about a disruption so big that it might, even momentarily, *appear* to be a singularity:

    a single corporation turning a profit even once
  • glyph@mastodon.social #5:

    in order to be a singularity candidate, an AI would need to achieve vertical integration from silicon fabrication through logistics and integration, into operating systems and applications, with tight whole-system feedback from the robotics to the shipping to the power generation and back
  • xgranade@wandering.shop #6:

    @glyph There are two distinct claims, each independently ridiculous: that The Singularity is a real thing, and that LLMs put us closer to The Singularity.
  • mcc@mastodon.social #7:

    @glyph my assertion was that the singularity, as described by ray kurzweil, accurately describes the invention of writing, and i don't see why it would be more interesting if the self-improving intelligent mechanism were made of etched silicon instead of CHNOPS nanomachines. it is harder for etched silicon to self-reproduce, anyway. the CHNOPS nanomachines just do that.

    i think human advancement *has* followed an exponential-*looking* curve since that point, albeit with a low base.
  • glyph@mastodon.social #8:

    we are not even remotely close to a single LLM meaningfully constructing even a portion of the pipeline to train another LLM. you can sort of argue around the edges that maybe under certain synthetic conditions this is borderline possible now, but on the "singularity" progress bar, that is 0.5%
  • glyph@mastodon.social #9:

    if, in order to achieve your out-of-control doomsday robot scenario, a trillion dollars' worth of human effort must be expended annually, and if any of it stops for even a moment then the whole thing implodes and grinds to a halt, _you can stop worrying_ that it is "the machines" which dominate us
  • glyph@mastodon.social #10:

    doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
  • theorangetheme@en.osm.town #11:

    @glyph Ah yes, the Singularity: a thing that its religious adherents can't define but which will almost certainly be ushered in by chatbots that tell you to put glue on pizza.

    Put me on Artemis III, I'm done here.
  • suetanvil@freeradical.zone #12:

    @glyph tbf, it made for some *great* science fiction in the 90s.
  • glyph@mastodon.social #13:

    RE: https://mastodon.social/@glyph/115076275195904439

    I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late-night freshman dorm room conversation about philosophy
  • glyph@mastodon.social #14:

    put ME on CNN and MSNBC, you cowards.
  • glyph@mastodon.social #15:

    @suetanvil WHY CAN OUR GENERATION'S SUPPOSEDLY GREATEST MINDS NOT DISTINGUISH BETWEEN REALITY AND FANTASY
  • glyph@mastodon.social #16:

    @suetanvil it's ruining my ability to appreciate the fantasy!!!
  • reedmideke@mastodon.social #17:

    @glyph Maybe the real singularity was the <s>friends we made along the way</s> black hole we dumped all our cash into
  • xgranade@wandering.shop #18:

    @glyph That would be such an incredible improvement over their current coverage *and* I would pay to see that. Both things can be true.
  • petrillic@hachyderm.io #19:

    @glyph i would pay for that.
  • miss_rodent@girlcock.club #20:

    @glyph Ants are self-sustaining, self-reproducing, more intelligent than any AI humans have managed to make, and capable of directly altering the physical world.
    If 'self-sustaining' were really the break-point, humans lost well before we existed as a species.
  • glyph@mastodon.social #21:

    like if anyone had halfway-plausible "grey goo" nanotech that could do anything that looked like computation, that might be worrying. a locally viable self-reproducing platform that can make another one of itself from a pile of dirt, even if it's like, special dirt, that might scare me a little bit. but an overlord hive-mind that requires an uninterrupted global high-purity helium supply chain just to make ONE more of itself is supposed to be a threat?
  • dpnash@c.im #22:

    @glyph Or any other subject, really.

    Even in STEM.

    Like the introductory biology class I took, with its toy population models that went sigmoid very quickly, simply because biologists understand that populations of living things hit barriers to growth. Or the control systems engineering class I took, where we figured out how to tell which parts of the system behavior would be good over the long term, which (to oversimplify only slightly) meant *no positive exponentials* anywhere in the math.
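[Editor's note: the sigmoid-vs-exponential confusion the thread keeps returning to is easy to demonstrate numerically. Early in a logistic (sigmoid) growth curve, the trajectory is nearly indistinguishable from a pure exponential; it only diverges as the carrying capacity bites. A minimal sketch — the parameter values `x0`, `r`, and `K` are illustrative assumptions, not anything from the thread:]

```python
import math

def exponential(t, x0=1.0, r=0.1):
    """Unbounded exponential growth: x(t) = x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.1, K=1000.0):
    """Logistic growth with carrying capacity K (same x0 and r)."""
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))

# Early on the two curves agree closely; later they differ by orders
# of magnitude, which is exactly the extrapolation mistake at issue.
for t in (0, 20, 50, 100, 150):
    e, s = exponential(t), logistic(t)
    print(f"t={t:3d}  exp={e:12.1f}  logistic={s:8.1f}  ratio={e / s:10.2f}")
```

With these assumed parameters, the two curves agree to within about 1% at t=20, but by t=100 the exponential has overshot the saturating sigmoid by more than an order of magnitude.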