"I just found out that it's been hallucinating numbers this entire time."

Uncategorized
techagenticai
157 Posts 132 Posters 6 Views
  • cptsuperlative@toot.cat (#34)

    @Natasha_Jay

    No one should be surprised.

    It is mathematically impossible to stop an LLM from “hallucinating” because “hallucinating” is what LLMs are doing 100% of the time.

    It’s only human beings who distinguish (ideally) between correct and incorrect synthetic text.

    And yet, it’s like forbidden candy to a child. Even well-educated, thoughtful people so desperately want to believe that this tech “works”.
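
A minimal sketch of the point being made above, using a hypothetical toy sampler rather than any real model's implementation: every token an LLM emits comes out of the same next-token sampling step, so "hallucination" is not a separate failure mode that could simply be switched off.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_next_token(logits, temperature=0.8):
        # Softmax over (toy) logits, then sample: the one mechanism a
        # decoder-only model has for producing output, true or not.
        scaled = logits / temperature
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Toy "vocabulary" of candidate revenue figures; the scores encode
    # plausibility in context, not agreement with any ledger.
    vocab = ["$1.2M", "$1.4M", "$2.1M", "$9.7M"]
    logits = np.array([2.0, 1.9, 1.5, 0.2])
    print(vocab[sample_next_token(logits)])  # correctness is never checked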

    • maccruiskeen@social.linux.pizza wrote:

      @ferrix @Natasha_Jay @wendynather "Making up plausible shit" is a human consultant's job! AI is going to replace us!

    • ferrix@mastodon.online (#35)

      @maccruiskeen @Natasha_Jay @wendynather MUPSaaS

      • nantucketlit@mastodon.social (#36)

        @Natasha_Jay "To an artificial mind, all reality is virtual."

        • 0x0ddc0ffee@infosec.exchange wrote:

          @Natasha_Jay I suspect that one of the few applications where a heavily hallucinating LLM can outperform a human would be in replacing board members, C-suite executives, and their direct reports. I propose a longitudinal study with a control group of high-level executives using real data and an experimental group using hallucinated, or maybe even totally random, data filtered into plausible ranges, using executive compensation deltas as the metric.

        • npars01@mstdn.social (#37)

          @0x0ddc0ffee @Natasha_Jay

          Remember who funds AI.
          https://www.al-monitor.com/originals/2024/05/saudi-prince-alwaleed-bin-talal-invests-elon-musks-24b-ai-startup

          https://www.washingtonpost.com/technology/2025/05/13/trump-tech-execs-riyadh/

          Disaster Capitalists -- they want another bubble bursting that sends another generation of wealth upwards to the 1%.

          Global financial crashes & mass dissatisfaction provide the makings for fascist movements & profitable imperialist wars of extraction.

          https://www.taxresearch.org.uk/Blog/2025/09/03/austerity-is-the-midwife-of-fascism/

          (www.euractiv.com)

          What will it take to beat the far right?
          The far right feeds on economic despair and exclusion. Only policies that deliver security, dignity and belonging can starve it.
          (www.ips-journal.eu)

          • jkrotkov@mastodon.social (#38)

            @Natasha_Jay

            Hallucinating numbers, you say?

            It must be learning from the current American administration.

            • countholdem@mastodon.social (#39)

              @Natasha_Jay #BoycottAI

              • only_exception@mstdn.plus (#40)

                @Natasha_Jay someone is getting fired

                • stefanol@mstdn.ca (#41)

                  @Natasha_Jay 😬

                  • drahardja@sfba.social wrote:

                    @Natasha_Jay This part in the original post is fantastic:

                    “The worst part I raised concerns about needing validation in November and got told I was slowing down innovation.”

                    hxxps://www.reddit.com/r/analytics/comments/1r4dsq2/we_just_found_out_our_ai_has_been_making_up/

                  • indigoviolet@tech.lgbt (#42)

                    @drahardja @Natasha_Jay I believe it’s supposed to be h*tt*ps

                    • iveyline@mastodon.social (#43)

                      @Natasha_Jay It seems to me that AI tools have been released prematurely to generate revenue from the massive investments AI tech companies are making. This type of hallucination could cause serious damage to any business, organisation or person using it. Who is accountable in the end? Given the history of big tech's social media platforms and the harm they are causing, you can bet they'll accept no responsibility.

                      • jmcclure@sciences.social (#44)

                        @Natasha_Jay

                        "... just inventing plausible sounding [answers]"

                        This shit is so tiring - that is literally all any AI is even *meant* to do. They are not even designed to give correct answers to questions, but just examples of what a plausible answer could sound like.

                        (edit: sorry, I know I'm likely preaching to the choir here, but it's just so fucking tiring seeing people surprised by this crap.)

                        • ai6yr@m.ai6yr.org (#45)

                          @Natasha_Jay Bwahahahahahaha

                          • erikjonker@mastodon.social (#46)

                            @Natasha_Jay .. and you really want to blame the technology for this... If there was no process for checking the facts that were used, it is simply bad implementation; everyone knows you have to check for or do something against hallucinations with GenAI.
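
A minimal sketch of the kind of check being described, assuming a hypothetical workflow in which figures cited by the model can be compared against the source table before anyone acts on them; the data, the model output, and the validate_figures helper are all invented for illustration.

    import re

    import pandas as pd

    # Hypothetical ground truth the report was supposed to summarise.
    source = pd.DataFrame({"region": ["EMEA", "APAC"],
                           "revenue": [1_200_000, 950_000]})

    # Hypothetical model output: one figure is real, one is invented.
    model_output = "EMEA revenue was 1200000 and APAC revenue was 1400000."

    def validate_figures(text, truth):
        # Flag any number in the text that does not appear in the source data.
        cited = {int(n) for n in re.findall(r"\d+", text)}
        known = set(truth["revenue"])
        return [f"unverified figure: {n}" for n in sorted(cited - known)]

    print(validate_figures(model_output, source))  # ['unverified figure: 1400000']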

                            • johnjburnsiii@kzoo.to (#47)

                              @Natasha_Jay

                              Guess who is going to take the blame for this...

                              #RollingDownHill
                              🤯

                            • bredroll@mas.to wrote:

                              @ZenHeathen @Natasha_Jay the thing is, there is no change-over “period”: once you use this daily, your org begins to lose its institutional memory

                              • jwo@mastodonczech.cz (#48)

                                @Bredroll @ZenHeathen @Natasha_Jay And there will be further, probably unnoticed, changeover periods when the provider more or less silently changes their services (or even when the org changes provider, …), e.g., when the model being used is swapped.

                                • dianea@lgbtqia.space (#49)

                                  @Natasha_Jay

                                  So I asked the higher ups to double check this $11M investment they made to run the show in our department. Everything promised was made worse with zero improvements. We lost a lot of experience and money. Almost lost me too, but I'm a fixer and they are going to pay me a lot of overtime...

                                  • bangskij@climatejustice.social (#50)

                                    @Natasha_Jay so much karma

                                    • jmcrookston@mastodon.social (#51)

                                      @Natasha_Jay

                                      I can't wait to deploy AI on the battlefield! And for policing. 😬

                                      • lxskllr@mastodon.world wrote:

                                        @Natasha_Jay

                                        What would be amusing is them having greater success using bullshit data than whoever was previously correlating stuff :^D

                                      • greatbigtable@mastodon.social (#52)

                                        @lxskllr more likely it will have results similar to those of the media industry, which changed its entire business based on fake data provided to it by Facebook.

                                        • greatbigtable@mastodon.social (#53)

                                          @ErikJonker then where's the productivity gain promised with AI if you still have to do the work to get the numbers you trust? Why take on the additional cost at that point?

                                          If you had an employee who was constantly lying to you, you'd fire them.
