@jacqueline 100%

Uncategorized · 35 Posts · 21 Posters
This topic has been deleted. Only users with topic management privileges can see it.
angelascholder@mastodon.energy:

    @ErikJonker @jacqueline Well, with the ways I've seen these sites reacting to people: even just praising people's writing and thoughts about articles they uploaded or fed in, where it later came out that the AI somehow couldn't read the article and just hallucinated superlatives.
    Basically, an AI working like that is only geared to play on people's egos.
    In the end, that will result in the AI mirroring the ego of the 'user' (user, or abused, is an interesting discussion).
    And, as 2)

    angelascholder@mastodon.energy (#26):

    @ErikJonker @jacqueline 2) people often are very easily influenced; they will become like their chatbot just as much as the chatbot reflects them.

    The worst outcome of that is that the people basically become zombies of their chatbot.
    Obviously we are all so strong that this will never happen to us...

    erikjonker@mastodon.social:

      @AngelaScholder @jacqueline ...playing and experimenting is a good way to learn about (new) technology; it is also very human, the way we develop and find out what works and what does not.

      jacqueline@chaos.social (#27):

      @ErikJonker @AngelaScholder hi erik. any thoughts on the article linked here? https://chaos.social/@jacqueline/116089817252419868

      erikjonker@mastodon.social (#28):

      @jacqueline @AngelaScholder A terrible and completely wrong way of using this technology, by both the companies and the people that use it... BigTech is not responsible in how it employs this technology. But that is not the same as saying the technology in itself is evil.

      jacqueline@chaos.social:

      i feel like i don't have the words to properly describe how it feels to see people whose opinions i respected and valued slowly fall into ai psychosis. it's so slow and so subtle at first. "i'm just experimenting! i'm not an ai booster!"

      then wait a few months, and they start explaining with the usual flawed, incoherent reasoning how actually it's all very interesting and thought-provoking, whilst pointing at an LLM that is so obviously just a reflection of their own ego.

      greg@icosahedron.website (#29):

      @jacqueline I believe it is way more common than we know. Something about this stuff hammers people's brains in a way that we (society) are not prepared for. And these are folks who should know better! They work with computers, they know it's just matrix multiplication in there! But knowing about it, I guess, provides no immunization against being taken in by its sycophantic mirroring language.

      greg@icosahedron.website (#30):

      @jacqueline r/MyBoyfriendIsAI is a cautionary tale, but a friend of mine mentioned seeing a girl on the bus asking ChatGPT to give her a pep talk before going on a date. Just everyday random people, getting taken in by the illusion the computer gives of being more than a computer. I truly think we're not aware of how many people are being swayed by all this. It's gotta be more insidious than just the worst cases we see in the news.

      gjmwoods@mastodon.social (#31):

              @greg r/MyBoyfriendIsAI is scary

      dynom@toot.community:

                @jacqueline Well you did mention you couldn't describe it properly, so that was a risk I considered while replying.

                Can you elaborate on the "unusual flawed, incoherent reasoning" ?

      mawhrin@circumstances.run (#32):

                @dynom @jacqueline usual.

      kelpana@mastodon.ie (#33):

      @jacqueline It's shocking how convinced people are by simplistic demos and snake-oil salespeople. It's like an evangelical religion. The Eliza Effect has been known for almost sixty years, yet people are lapping up LLM/GPT use without realising how wrong, imperfect and downright dangerous these tools can be.

      khaleer@mastodon.gamedev.place (#34):

                    @dynom @jacqueline give me a salad recipe

      sbourne@mastodon.social (#35):

                      @jacqueline Oh, interesting - it's a mirror and they're all transfixed like Narcissus!

relay@relay.infosec.exchange shared this topic.