Had a lot of fun with my stats students today.

Uncategorized · 112 Posts · 62 Posters · 18 Views
  • futurebird@sauropods.win wrote:

    "Why don't you just load a library to find the mean and SD?"

    Because I'M OLD. I like to write my own function. I do it for integration sometimes... kids these days.

    ohmu@social.seattle.wa.us (#4) replied:

    @futurebird
    When I was a kid, we solved integrals in the snow and rain uphill in both directions.

    • ai6yr@m.ai6yr.org (#5):

      @ohmu @futurebird LOL 42 and 73 are my picks for "random" numbers out of the LLMs, for now.

      • ldpm@wandering.shop (#6):

        @futurebird I know how to find the SD and I will use the php-stats library every day of the week and twice on Sunday. I would much rather be able to depend on well supported community code. (At least until it is all replaced by ai slop)

        • perigee@rage.love (#7):

          @futurebird Early in my physical chemistry research fellowship I had to implement a Levenberg-Marquardt least-squares fit for an 18-to-28-parameter optical curve (using double-precision complex numbers). I did the first pass in FORTRAN, then needed my postdoc's help to recast the algorithm as matrix algebra for a Matlab implementation. It was fascinating.
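For readers curious what such a fit looks like in miniature, here is a toy sketch of the Levenberg-Marquardt damped least-squares update, fitting a hypothetical 2-parameter model y = a·exp(b·x) in plain Python. The model, data, and starting values are all illustrative assumptions; the fit described above involved 18 to 28 parameters and complex arithmetic.

```python
import math

def levenberg_marquardt_exp(xs, ys, a=1.0, b=0.1, lam=1e-3, iters=100):
    """Toy Levenberg-Marquardt fit of y = a * exp(b * x).
    A 2-parameter sketch of the damped least-squares idea only."""
    def sse(a, b):
        return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))

    for _ in range(iters):
        # residuals r_i = y_i - a*exp(b*x_i) and their Jacobian
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        J = [(-math.exp(b * x), -a * x * math.exp(b * x)) for x in xs]
        # damped normal equations: (J^T J + lam*I) delta = -J^T r
        g11 = sum(j1 * j1 for j1, _ in J) + lam
        g22 = sum(j2 * j2 for _, j2 in J) + lam
        g12 = sum(j1 * j2 for j1, j2 in J)
        b1 = -sum(j1 * ri for (j1, _), ri in zip(J, r))
        b2 = -sum(j2 * ri for (_, j2), ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (b1 * g22 - b2 * g12) / det
        db = (g11 * b2 - g12 * b1) / det
        if sse(a + da, b + db) < sse(a, b):
            a, b, lam = a + da, b + db, lam / 3   # accept step, relax damping
        else:
            lam *= 3                              # reject step, damp harder
    return a, b

# Noise-free synthetic data generated from a=2.0, b=0.5 (illustrative values)
xs = [i / 10 for i in range(20)]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a_fit, b_fit = levenberg_marquardt_exp(xs, ys)
print(a_fit, b_fit)
```

The accept/reject loop is what distinguishes this from plain Gauss-Newton: a failed step just increases the damping and tries again.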

          • futurebird@sauropods.win wrote:

            Had a lot of fun with my stats students today. I gave them two data sets. One from a random number generator, the other was one I made up that was not random, but designed to look random. They were able to figure out which one was fake.

            Then we had ChatGPT make the same kind of data set (random numbers 1-6 set of 100) and it had the same problems as my fake set but in a different way.

            We talked about the study about AI generated passwords.

            futurebird@sauropods.win (#8) replied:

            There is something very creepy about the way LLMs will cheerfully give lists of "random" numbers. But they aren't random in frequency, and as my students pointed out, "it's probably from some webpage about how to generate random numbers."

            But even then, why is the frequency so unnaturally regular? Is that an artifact from mixing lists of real random numbers together?


              • darkling@mstdn.social (#9):

              @futurebird I think I've got a printed book of random numbers upstairs somewhere.


                • phpete@mastodon.coffee (#10):

                @futurebird
                Related: the subconscious power of human brains is amazing.


                  • burnitdown@beige.party (#11):

                  @futurebird

                  FUN FACT: random ain't random. especially in computers.

                  if you ask for "random" output from a computer, there is no guarantee that what comes out isn't actually from the contents of RAM.

                    • darkling@mstdn.social (#12):

                    @flipper @futurebird I definitely have some of those. Several, in fact, at various levels of precision and different sets of functions.


                      • life_is@no-pony.farm (#13):
                      @burnitdown@beige.party You need to put some NDO into the ram. @futurebird@sauropods.win

                        • okohll@hachyderm.io (#14):

                        @futurebird Haven't tried it, but maybe it's also all mixed up with non-random numbers in the training content: e.g. the next number after '20' is likely 0, 1, or 2, the start of a 21st-century year so far. Or Benford's law https://en.wikipedia.org/wiki/Benford%27s_law
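Benford's law is easy to see in a sequence that spans several orders of magnitude. Powers of 2 are a classic example whose leading digits follow the law; a quick stdlib check:

```python
import math
from collections import Counter

def first_digit(n: int) -> int:
    """Leading decimal digit of a positive integer."""
    return int(str(n)[0])

def benford_expected(d: int) -> float:
    """Benford's law: P(first digit = d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

# Tally leading digits of 2^1 .. 2^1000.
digits = Counter(first_digit(2 ** k) for k in range(1, 1001))
observed = {d: digits[d] / 1000 for d in range(1, 10)}

for d in range(1, 10):
    print(f"{d}: observed {observed[d]:.3f}  Benford {benford_expected(d):.3f}")
```

The observed share of leading 1s comes out very close to log10(2) ≈ 0.301, far from the uniform 1/9 most people would guess.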

                          • burnitdown@beige.party (#15):

                          @Life_is @futurebird that's still the contents of RAM, whatever an NDO is.


                            • geepawhill@mastodon.social (#16):

                            @futurebird As you so often do, you sent me off on a tangent. My favorite PRNG is in Knuth, and it's called Algorithm A there. It is entirely additive, so very fast, and has a period of 2^54.

                            I spent *years* tryna figure out why nobody ever used it or even mentioned it.

                            Finally discovered that it has another name, and that it is quite frequently used today. 🙂

                            I have, of course, completely forgotten its other name, which somebody here on fedi actually told me.
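A purely additive generator in the style geepawhill describes can be sketched briefly. The lags (24, 55), the modulus 2^31, and the seeding below are illustrative choices for one well-known additive lagged scheme, not necessarily the exact constants of Knuth's Algorithm A:

```python
import random
from collections import deque

def additive_prng(seed: int, lags=(24, 55), mod=2**31):
    """Additive lagged generator: X[n] = (X[n-24] + X[n-55]) mod 2**31.
    Only additions, so each step is very cheap.
    Seeding here is illustrative (drawn from Python's own PRNG)."""
    short, long_ = lags
    rng = random.Random(seed)
    state = deque((rng.randrange(mod) for _ in range(long_)), maxlen=long_)
    # At least one seed value must be odd, or low bits degenerate.
    if all(x % 2 == 0 for x in state):
        state[0] |= 1
    while True:
        x = (state[-short] + state[-long_]) % mod
        state.append(x)   # maxlen drops the oldest lagged value
        yield x

gen = additive_prng(seed=42)
sample = [next(gen) for _ in range(5)]
print(sample)
```

The whole state is just the last 55 outputs, which is what makes the scheme so fast compared with multiplicative generators.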


                              • dpiponi@mathstodon.xyz (#17):

                              @burnitdown @futurebird These days if you really want random numbers you can have them. E.g. RDRAND on Intel chips is seeded by analogue circuitry, not by some state updated in RAM. And even if you don't use RDRAND directly, its output is still used as a source of entropy for other generators.
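In Python, that hardware-backed entropy is reachable through the standard library: os.urandom and the secrets module read from the operating system's cryptographic RNG, which modern kernels typically seed from hardware entropy sources (RDRAND among them, where available). A minimal sketch:

```python
import os
import secrets

# os.urandom and the secrets module read from the operating system's
# cryptographic RNG, which is typically seeded from hardware entropy
# sources where available (such as RDRAND on Intel chips).
token = secrets.token_hex(16)    # 32 hex characters of OS-backed randomness
roll = secrets.randbelow(6) + 1  # an unbiased die roll, 1..6
raw = os.urandom(8)              # 8 raw bytes straight from the OS pool

print(token, roll, raw.hex())
```

For anything security-sensitive, secrets is the right tool; the random module's Mersenne Twister is deterministic and only suitable for simulation.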


                                • mcc@mastodon.social (#18):

                                @futurebird i mean the LLM itself is just a statistical distribution… the path through the distribution is i assume randomized, but the distribution itself is gonna be the same every time.
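mcc's point (fixed distribution, randomized sampling path) can be sketched with temperature sampling over a frozen set of weights. The "tokens" and logits below are made-up stand-ins, not a real model's:

```python
import math
import random

# A fixed "next-token" distribution (made-up weights over made-up tokens):
# the distribution never changes between calls, only the sampled path does.
logits = {"7": 2.0, "42": 3.5, "73": 3.0, "13": 1.0}

def sample_token(temperature: float = 1.0, rng=random) -> str:
    """Softmax over the fixed logits, then one random draw along the path."""
    scaled = {t: l / temperature for t, l in logits.items()}
    m = max(scaled.values())                       # subtract max for stability
    weights = {t: math.exp(l - m) for t, l in scaled.items()}
    total = sum(weights.values())
    probs = {t: w / total for t, w in weights.items()}
    return rng.choices(list(probs), weights=list(probs.values()))[0]

rng = random.Random(0)
draws = [sample_token(rng=rng) for _ in range(1000)]
print({t: draws.count(t) for t in logits})
```

Run it twice with different seeds and the individual draws differ, but the relative frequencies stay pinned to the same softmax of the same logits.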


                                  • futurebird@sauropods.win (#19):

                                  The LLM is like a little box of computer horrors that we peer into from time to time.

                                  I'm sorry but the whole interface is just so silly.

                                  You ask for random numbers with sentences and it pretends to give them to you? What are we doooooing?


                                    • grumpasaurus@infosec.exchange (#20):

                                    @futurebird it really puts into perspective what my interaction with real people is like


                                      • dpiponi@mathstodon.xyz (#21):

                                      @futurebird It's very weird.

                                      In principle, if you take an LLM, you should be able to get it to generate random numbers in a way that reflects the numbers that appear in the corpus it was trained on. If you have the raw model you can probably do that.

                                      But if you ask ChatGPT (or at least if I do) it starts talking about how numbers taken from around us typically follow Benford's law so their first digits have a logarithmic distribution. When it then spits out some random numbers it's no longer sampling random numbers from the entire corpus but a sample that's probably heavily biased towards numbers that appear in articles about Benford's law. I.e. what people have previously said about these numbers, rather than the actual numbers.


                                        • perigee@rage.love (#22):

                                        @futurebird as others here have said or implied, I think LLMs are trained not to be random. Like as a structural part of the statistical models they're based on, so the input corpus will inform the "random" output.

                                        Speaking as a long-time (not mathematically rigorous enough) amateur cryptographer: most humans don't understand (not talking about you or your students, to be clear) that truly random output can contain sequences and patterns, or parts of them. So when an uninformed human evaluates "randomness", they won't accept sequences containing patterns as random, even when those patterns are accidental coincidences.

                                        Related: there's an old cryptography parable. If a low-ranking person in a security organization draws "random" numbers by hand, for example for a one-time pad, the results won't really be random if that volunteer looks into the hat or drum they pick from: they will subconsciously bias toward the letter and number frequencies they know from experience and expectation, which might help an attacker decrypt the pad. Maybe.

                                        Since the LLM is supposed to emulate human output it makes sense it might mess with "randomness".
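The observation that genuine randomness contains runs people tend to reject is easy to simulate: most sequences of 100 fair die rolls contain at least one run of three identical faces, exactly the kind of streak a human faking "random" data avoids. A quick sketch:

```python
import random

def longest_run(seq) -> int:
    """Length of the longest run of consecutive equal values."""
    best = cur = 1
    for prev, nxt in zip(seq, seq[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

# Simulate 10,000 sequences of 100 fair die rolls and record each
# sequence's longest run of a repeated face.
rng = random.Random(1)
runs = [longest_run([rng.randint(1, 6) for _ in range(100)])
        for _ in range(10_000)]
share = sum(r >= 3 for r in runs) / len(runs)
print(f"P(longest run >= 3) ≈ {share:.2f}")
```

The estimated probability comes out well above one half, so a 100-roll sequence with no triple at all is itself weak evidence of fakery.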

                                          • futurebird@sauropods.win (#23):

                                          @Bumblefish

                                          Which one is random?
                                          (data sets are 100 numbers 1 to 6)

                                          listA=[2,3,5,1,2,2,4,2,4,5,2,3,3,4,5,6,4,2,6,2,2,1,3,4,5,5,6,3,3,6,1,4,2,1,4,5,2,2,3,3,3,5,6,3,2,4,5,5,1,1,1,6,1,4,3,5,5,3,1,1,1,6,1,4,6,6,3,6,6,2,4,4,4,5,1,5,6,2,6,1,1,2,4,2,2,3,4,4,5,6,1,3,3,3,5,4,6,5,1,6]

                                          listB=[4,2,5,6,3,5,3,1,3,4,2,3,4,3,4,5,5,1,3,3,2,1,1,6,1,3,2,2,2,6,1,5,6,3,6,3,2,3,2,4,6,1,1,6,3,2,4,1,6,1,3,1,5,6,2,3,3,5,1,6,4,5,2,5,1,1,5,3,6,2,3,3,6,5,2,3,3,1,6,3,2,3,2,1,6,6,4,4,6,2,4,5,4,5,3,4,6,5,3,2]
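A frequency check of the kind the class might run: Pearson's chi-square statistic for each list against a uniform die (expected count 100/6 ≈ 16.7 per face). This sketch tests only the face frequencies, not the order of the rolls, so it is at best one piece of evidence:

```python
from collections import Counter

def chi_square_uniform(rolls, faces=6):
    """Pearson chi-square statistic against a fair die:
    sum over faces of (observed - expected)^2 / expected."""
    expected = len(rolls) / faces
    counts = Counter(rolls)
    return sum((counts.get(f, 0) - expected) ** 2 / expected
               for f in range(1, faces + 1))

listA = [2,3,5,1,2,2,4,2,4,5,2,3,3,4,5,6,4,2,6,2,2,1,3,4,5,5,6,3,3,6,1,4,2,1,4,5,2,2,3,3,3,5,6,3,2,4,5,5,1,1,1,6,1,4,3,5,5,3,1,1,1,6,1,4,6,6,3,6,6,2,4,4,4,5,1,5,6,2,6,1,1,2,4,2,2,3,4,4,5,6,1,3,3,3,5,4,6,5,1,6]
listB = [4,2,5,6,3,5,3,1,3,4,2,3,4,3,4,5,5,1,3,3,2,1,1,6,1,3,2,2,2,6,1,5,6,3,6,3,2,3,2,4,6,1,1,6,3,2,4,1,6,1,3,1,5,6,2,3,3,5,1,6,4,5,2,5,1,1,5,3,6,2,3,3,6,5,2,3,3,1,6,3,2,3,2,1,6,6,4,4,6,2,4,5,4,5,3,4,6,5,3,2]

print("listA:", round(chi_square_uniform(listA), 2))
print("listB:", round(chi_square_uniform(listB), 2))
# With 5 degrees of freedom, a statistic above ~11.07 would be suspicious
# at the 5% level; a statistic suspiciously close to 0 can also be a tell,
# since real samples wobble around the expected counts.
```

Run tests (counting streaks) and serial tests (counting digit pairs) would probe the ordering that this frequency check deliberately ignores.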
