so the take-away from this is that all of this agentic stuff backend is just begging the LLM to please, please not fuck up?

Uncategorized · 20 Posts · 14 Posters

This topic has been deleted.
• eniko@mastodon.gamedev.place

RE: https://neuromatch.social/@jonny/116324676116121930

so the take-away from this is that all of this agentic stuff backend is just begging the LLM to please, please not fuck up? am i getting this right?

• andrewt@mathstodon.xyz · #6

@eniko yeah, this is how all AI "engineering" works under the hood. every time the model does something stupid, they write an extra bit of system prompt to burn a few tokens explaining not to do that thing. same exact energy as shadiversity writing "correct anatomy, perfect lighting, masterpiece x1000000" at the end of every prompt as if the model knows how to do those things but simply chose not to
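(For anyone who hasn't seen it, the "write an extra bit of system prompt" pattern above really is roughly this shape. A minimal sketch; the base prompt, the rule list, and the helper name are all made up for illustration:)

```python
# Hypothetical sketch of "prompt patching": every time the model
# misbehaves in production, another "please don't" rule is appended
# to the system prompt, and every future request pays for those
# extra tokens.

BASE_PROMPT = "You are a helpful coding agent."

# Grows over time, one line per failure mode someone has seen.
dont_rules = [
    "Do not delete files outside the working directory.",
    "Do not invent APIs that do not exist.",
    "Do not claim a test passed without running it.",
]

def build_system_prompt(base: str, rules: list[str]) -> str:
    """Assemble the system prompt: base instructions plus one
    'don't do that' bullet per known failure mode."""
    lines = [base] + [f"- {r}" for r in rules]
    return "\n".join(lines)

prompt = build_system_prompt(BASE_PROMPT, dont_rules)
print(prompt)
```

There is no enforcement mechanism here, which is the point: the "fix" is just more text the model may or may not follow.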

• eniko@mastodon.gamedev.place

this isn't engineering this is a religious cult

• fen@zoner.work · #7

@eniko@mastodon.gamedev.place there's a reason why Most Engineers don't recognize Software Engineering as an actual engineering discipline, and the current crop isn't doing anything to convince them that should change.

• cxberger@mastodon.boiler.social · #8

@eniko gotta preach the TESCREAL gospel or the Basilisk will get ya

• jimfl@hachyderm.io · #9

@fen @eniko A while back, I was renting an ADU in the backyard of an aeronautical engineer. We got to talking about our respective careers (like you do) and when I told him what I did, he said, “oh, you’re a software guy.”

Pretty accurate.

• jedimb@mastodon.gamedev.place · #10

@eniko A vast mass of bad code held together by duct tape (and prayers).

• alice@mk.nyaa.place

@eniko@mastodon.gamedev.place you're absolutely right! yup

• bovaz@misskey.social · #11

@alice@mk.nyaa.place @eniko@mastodon.gamedev.place because right now brain goes brap, this post just made me realize that "you're absolutely right" can be shortened to YAR.

Pirates are onto something good.
• foxyoreos@gulp.cafe · #12

@eniko YES!!!

I have been yelling about this in different places for 4 years. Before they really took off I used to do hobby red-teaming around GPT-3 and 4, and there was this moment of realizing that none of the security for this stuff fucking works, at all, and yet no one wanted to admit that, it was just layers of programmers going "what if there's a second LLM though", and stuffing their heads in the sand, and NOTHING HAS CHANGED.

• foxyoreos@gulp.cafe · #13

@eniko the only difference now is that the media has bought into it and now we have "agentic" browsers and all this shit and literally all of it is fundamentally impossible to secure - and the entire tech space (not entire but like, you know) has gone, "but what if we pretend it's fine tho".

4 years ago I was arguing with programmers about this and they were like "well by the time it gets access to your emails, this will be fixed."

It was not fixed.

• foxyoreos@gulp.cafe · #14

@eniko we have replaced solid software security with the equivalent of the Google SEO wars. We have expanded phishing attacks so that now they work on your computer itself, not just on you.

We are so unbelievably fucked, none of this can be used in a sensitive environment.

And the overwhelming consensus from researchers is that this is impossible to solve, all you can do is beg for the computer not to fuck everything up in increasingly desperate ways.

• foxyoreos@gulp.cafe · #15

@eniko (this is also why "a chat prompt" is the wrong way to measure energy costs for these things, because the only way to get them to do anything halfway useful is to burn tokens like a forest fire. Not sure about the output? Run an entirely separate LLM to check! Run it 3 times and average the results! Run it in a loop until the test passes! Oops, the format was wrong, run it again and see what happens.)
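(Concretely, the "run it in a loop until it works" pattern above looks roughly like this. A sketch only; `call_llm` is a made-up stub standing in for a real API client, here hard-coded to fail twice before succeeding:)

```python
import json

def call_llm(prompt: str, attempt: int) -> str:
    # Stub model: returns malformed output on the first two attempts
    # and valid JSON on the third, to simulate flaky model behavior.
    if attempt < 2:
        return "Sure! Here is the JSON you asked for: {oops"
    return '{"status": "ok"}'

def query_until_valid(prompt: str, max_attempts: int = 5):
    """Retry until the output parses, tracking how many model calls
    (and therefore how many tokens) one 'single query' really costs."""
    for attempt in range(max_attempts):
        raw = call_llm(prompt, attempt)
        try:
            return json.loads(raw), attempt + 1
        except json.JSONDecodeError:
            continue  # burn more tokens and try again
    raise RuntimeError("model never produced valid output")

result, calls_made = query_until_valid("Return status as JSON.")
print(result, calls_made)  # one "query", three model calls
```

One user-visible "query" quietly became three billed model calls; add a verifier model or a test-passing loop and the multiplier grows from there.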

• rootwyrm@weird.autos · #16

@eniko @catsalad this isn't even a religious cult, it's a fucking *cargo* cult.

• silvermoon82@wandering.shop · #17

@eniko
Also, "please don't do crimes".

• foxyoreos@gulp.cafe · #18

@eniko and I actually think that this is why "agents" are when this slop took off for programmers. Because it lets that happen in the background where you don't have to see your shame.

But like.. this is also why even assuming they're true, the "single query is like running your microwave for a second" shit is so disingenuous. No one is doing a single query.

They are leaving the microwave running 24x7, which turns out is actually quite bad for the environment!

• foxyoreos@gulp.cafe · #19

@eniko measuring LLM energy usage per query is like measuring gas efficiency "per rotation of your car's wheel"

• malwareminigun@infosec.exchange · #20

@eniko maybe this is a hot take but I think having employees can be a lot like this which is why it doesn’t raise red flags for managers
