

It's so fucking weird how saying anything about AI that isn't prostate-tickling a fucking billionaire gets weird cherry-picking arguments that defend aspects of AI that weren't even being discussed.

Uncategorized · 22 Posts · 16 Posters
fundamental@social.treehouse.systems:

@cR0w I get the sense that many people are using the tech not to accomplish different tasks, but to mask insecurities: that they have a gap in their knowledge, relationships, abilities, etc. So at a certain level, heavy use can lead people to view negativity towards LLMs as an attack on them personally. There's also going to be the crowd that's forced into using it and needs to build up a self-justification for how it's OK. It's messy...

merospit@infosec.exchange (#13):

@fundamental @cR0w I had a team member who is typically quite polite suddenly resort to personal attacks when I pointed out mistakes in the LLM artifacts they were confident were highly useful.
rootwyrm@weird.autos:

@Dio9sys @cR0w Don't worry, they'll give the "critics" 30 seconds of airtime. Right after the "critics" of RFK Jr.'s plan to 'reparent' all Black children and send all the undesirables to 'wellness farms' they're never allowed to leave.

dio9sys@haunted.computer (#14):

@rootwyrm @cR0w It is so frustrating. Until the past couple of years, NPR was genuinely great. Then they got insecure about all the allegations of liberal bias, and now it's like they have to meet a daily quota of shitty people to interview. A few weeks ago they straight up had a segment on the concept of "toxic empathy" that the far right has been espousing, but instead of tearing the concept to shreds they gave softball interviews to conservative authors and influencers about it.
fundamental@social.treehouse.systems (#15):

@merospit @cR0w It's a tough spot. Reviewing deliverables is essential, but the more degrees removed someone gets from the work, the more the discussion tends to shift from the specifics of the content to the feelings associated with completing a task.
theorangetheme@en.osm.town:

@cR0w I am *this* 🫰 close to getting shitcanned when the next manager passive-aggressively asks me if I used the slop extruder for some code and, if I say "no", asks me how I could. If you want to force me to use some garbage software I don't need to do my job, don't insult my intelligence by pussyfooting around it. Put on your grownup pants and just give me an order so I can be insubordinate properly.

fritzadalis@infosec.exchange (#16):

@theorangetheme @cR0w My company just gave me a 'use AI or else' goal.

I'm gonna make it so fukken expensive.
xan@xantronix.social (#17):

@theorangetheme @cR0w co-signed, outing myself, etc

(link: www.linkedin.com)
epic_null@infosec.exchange (#18):

@FritzAdalis @theorangetheme @cR0w Good news: that gets easier after this week. They are going to start charging per token.
tyzbit@toot.now (#19):

@FritzAdalis @theorangetheme @cR0w When I'm inevitably given this imperative, I will just ask one question: "Will the agent be responsible for the mistakes it makes?" Obviously the answer will be no, to which I'll say, "Well, I'm not going to be responsible for its mistakes either; I'll just give it pointless tasks to do."
pa@hachyderm.io (#20):

@FritzAdalis @theorangetheme @cR0w People (infra) are starting to talk about the number of lines of code generated... and taking pride in the big numbers.
Last weekend I tried to vibecode a little sh-tty POC for something simple. After generating about a dozen versions of the same program, using half a dozen different 120B-weight models, *all* of it was utter garbage. Most models gave 1 to 3 Python files. One went all-in and produced a full development tree with 15 files.
Not a single class from the whole steaming pile was worth keeping.
I'm not proud of myself... I probably could have powered a third-world hospital for a year with the electricity consumed, and I threw it all away. 😢
theorangetheme@en.osm.town (#21):

@FritzAdalis @cR0w If you're going to comply, it might as well be malicious! Godspeed. 🫡
sancla@infosec.exchange (#22):

@FritzAdalis @theorangetheme @cR0w Just adopt Claude; it'll probably delete the production database in a month or two. Just keep away from the blast zone (seriously, cover your ass).