CIRCLE WITH A DOT
the AI alignment problem is entirely a smokescreen designed to distract from the capital class alignment problem

Uncategorized · 37 Posts · 20 Posters · 6 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • deshipu@fosstodon.org

    @travisfw @mcc @glyph are you saying bayesians are not statisticians?

  • davidgerard@circumstances.run #24

    @deshipu @travisfw @mcc @glyph there's people who apply Bayes' theorem and then there's *Bayesians*
  • stilescrisis@mastodon.gamedev.place

    @mcc @glyph I don't think alignment has anything to do with determinism. People are non-deterministic but a person can absolutely be ethically aligned (or not).

  • mcc@mastodon.social #25

    @stilescrisis @glyph I think a certain sort of predictability is a prerequisite for alignment. Necessary but not sufficient. Humans are not deterministic but their behavior can be consistent, because they can act with intent. They can have beliefs and moral codes. They can understand their own incentives and the consequences of their actions. You can do things that cause them to understand the consequences of their actions better.
  • stilescrisis@mastodon.gamedev.place #26

    @mcc @glyph Right, which is why they are called "model weights" and not "model coin flips." Models are non-deterministic at the token level but pretty darn consistent at the macro level, which is why ChatGPT articles are so easy to spot. "It's not X, it's Y"; numbered lists; boldface, etc.
  • mcc@mastodon.social #27

    @stilescrisis @glyph "Models are non-deterministic at the token level but pretty darn consistent at the macro level"

    At recreating the structural properties of language, yeah, because that's what the algorithm's for. But the product is not sold as a "structural properties of text simulator". It is sold as an engine for producing meaning. And when it comes to meaning the tokens matter very much, very very much
  • mcc@mastodon.social #28

    @flipper @davidgerard @deshipu @travisfw @glyph i (a frequentist) once dated a Bayesian for a while. Nothing was learned from this experience which applies to other situations
  • glyph@mastodon.social

    @3psboyd @mcc I feel a *little* bad for the lesswrongers generally because this is really judging the community by its worst and most extreme elements, and here we are on fedi (not a group whose most extreme and unpleasant members I would like to represent me) but that faction is certainly … unduly powerful in society right now

  • jaystephens@mastodon.social #29

    @glyph @3psboyd @mcc
    This. I know some decent ones.
    But the decent ones tend to follow the Bentham-Utilitarianism-on-acid (aka longtermist) nutters, wherever they lead, IME.
  • glyph@mastodon.social #30

    @jaystephens @3psboyd @mcc if they were at least real Benthamites they’d get out the felicific calculus and do the damn arithmetic and not just slosh around a bunch of half-assed Fermi estimates with orders of magnitude instead of numbers
  • glyph@mastodon.social #31

    @jaystephens @3psboyd @mcc consider this my “born in the dark” Bane speech
  • dpnash@c.im #32

    @glyph @jaystephens @3psboyd @mcc

    I know what “felicific calculus” refers to, but every time I see that phrase, I’m annoyed that it refers to generic happiness and not to the number of cats people have (or that they would like to have).
  • glyph@mastodon.social

    @xgranade I don't think there's an exaggeration here, just some uncharitable phrasing

  • flaviusb@mastodon.social #33

    @glyph @xgranade They would tend to say spacegod instead of god, intelligence instead of feelings, and spacehell instead of hell, because to them that makes it Science and Fact rather than religion or fantasy.
  • davidgerard@circumstances.run #34

    @flipper @deshipu @travisfw @mcc @glyph "Bayesian" is a contraction of "Bay Area sex pest"
  • jaystephens@mastodon.social #35

    @glyph @3psboyd @mcc
    Mate, what's a bit of child labour in Africa compared to the happiness of the quadrillions of humans who'll flourish once we're spread across the galaxy? Any malnutrition and lost limbs in the here and now is a rounding error.
  • xgranade@wandering.shop

    @glyph

    ML ethics: here's why including ZIP codes in the data used by a classifier is bad

    AI ethics: what if some cryptogod hundreds of millennia in the future gets their feelings hurt by mean posts and decides to invent hell?

  • 0x4d6165@transfem.social #36

    @glyph@mastodon.social @xgranade@wandering.shop Eliezer Yudkowsky and his consequences have been a disaster for the human race
  • xgranade@wandering.shop

    @glyph (I hate how little I had to exaggerate to make that joke.)

  • erik@mastodon.infrageeks.social #37

    @xgranade @glyph No exaggeration spotted here