The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit.

Uncategorized · 42 Posts · 29 Posters · 56 Views
map@xoxo.zone wrote:

    @tante Thinking a lot about this. To me it boils down to code ownership. Which is yet another kind of responsibility/liability that is offloaded to machines that by definition can't be.

#7 tante@tldr.nettime.org:

@map exactly. In a way, accepting responsibility for the code one puts in front of people is accepting the connected care duties towards these people.
#8 mxey@hachyderm.io:

@map @tante even some pro-LLM people see that: https://lucumr.pocoo.org/2026/2/13/the-final-bottleneck/
nausipoule@mamot.fr wrote:

    @tante LLM means the tyranny of shit. A very mediocre dystopia indeed.

#9 ftranschel@norden.social:

@Nausipoule @tante I'd argue that instead (or, if you'd like, additionally) it is the terminal form of stochastic terrorism:

You will be randomly denied services, participation and dignity. Now isn't that quite a future.
tante@tldr.nettime.org wrote:

    The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit. But you do pay a hefty price: in lower quality (security issues, less maintainable code) and in skill decay in the people "guiding" the stochastic parrots.

    It's not "can 'AIs' create software" but "are we willing to accept worse software running more and more of our lives?"

#10 gklka@mastodon.social:

@tante @map That's just part of the truth. You can make wonderful, creative, unique software using AI. The thing is, you have to specify what you want to achieve. If you don't give these goals to the AI, it will come up with some mediocre, generic solution.
#11 elrohir@mastodon.gal:

@tante And I cannot even begin to emphasize how much *it will cost about the same despite being of lower quality*.

Once credit entities realize that GPUs become obsolete very fast, and that five years down the line the early mover needs to buy just as much new processing hardware as a latecomer, they will stop subsidizing today's AI as a gamble to capture tomorrow's market.

And then your 45-minutes-saving boilerplate machine will cost $5 per run.
pikesley@mastodon.me.uk wrote:

    @tante yeah but what if Some Guy's bonus depends on making it all shittier?

#12 kuzmandi@mastodon.social:

@pikesley @tante The problem exists as long as these guys believe they can buy themselves a better world.
tante@tldr.nettime.org wrote:

    Here's the thing: I believe that you deserve access to high-quality products and services. You deserve to use products and services that are safe, secure, well designed, and not destroying the ecological, informational or social environment.

#13 nojhan@social.antigene.org:

@tante I mean… people accepted that for transport, agriculture, entertainment, even education and healthcare. Why stop here?
#14 nielsa@mas.to:

@tante Good take! But also, "can you create software" is not really an accurate framing of what the hard part of software was.

Most people could "create software" by looking up a Hello World example. That wouldn't help them solve any real problems, though.

LLMs produce software that *looks more like* it solves problems... but security, integrity and legality were always implied parts of the problem.

It takes a weird, subtle reframing of the goal to make LLMs look at all useful.
#15 343max@mastodon.social:

@tante By now I'm pretty convinced LLMs can make it easier to produce high-quality code than writing high-quality code manually. Particularly because the AI is willing to do all the tedious, boring tasks that most developers are often too lazy for. Yes, it also makes it much easier to produce shittier code as well. (1/2)
#16 343max@mastodon.social (continued):

Right now we are seeing way more of the latter, because most people haven't learned yet how to produce good AI code, and because the bad code sticks out while the good code blends in. But I'm convinced your underlying assumption "AI code = shitty" isn't correct. (2/2) @tante
#17 cjk@chaos.social:

@343max @tante I think that's kind of the wrong question. Skill degradation and the moral implications (crawling of copyrighted material, climate, etc.) don't go away just because the generated code is good.

But I'm pretty sure you are aware 🙂
#18 vasilis@social.vasilis.nl:

@tante Then there's the thing that we never had that software. Business has always accepted low-quality products and services. So while I do agree with you, I'm afraid the people who run the software companies simply don't care.
#19 343max@mastodon.social:

@cjk @tante Honestly, I'm not sure about the skill degradation. There is a very high chance this is the same "new technology will make the youth stupid" panic that we have seen for centuries with every new technology. Also, I would really like to see a deep analysis of how much AI is hurting the environment. I don't trust Sam Altman's numbers, but I also don't buy the "every prompt is burning down a small forest" hyperbole.
#20 countholdem@mastodon.social:

@tante Yes - if I had had Root Cause Analysis training (#BehavioralScience) in earlier education, I and more like me would have made more of a difference.
#21 alper@rls.social:

@tante No, truly, Germany has managed to give us great software over the past decades without LLMs.
#22 stefanfrede@mastodon.social:

@tante In my opinion, the problem is that most decision-makers don't understand the risks involved. When those in charge lack the necessary knowledge of software development, the empty promises of AI seem like an easy solution 🤷
#23 raymaccarthy@mastodon.ie:

@tante Is that running or ruining?
#24 alessandro@mstdn.ca:

@tante Entropy is somebody else's problem - they'll be comfortable on their yacht. We can't expect sociopaths to care about others, else they wouldn't be sociopaths.
#25 raymaccarthy@mastodon.ie:

@343max @tante Then, Max, you have no understanding of LLM/gen AI - or maybe of specifying requirements, designing the system (modules, APIs, etc.), and then writing, testing and debugging the code. For any sizable project you need a team and management. There is also documentation.

Actually writing the code is the easiest bit, and the only bit the current LLM/gen AI does - and does badly, as it relies on code scraped from elsewhere and statistical shuffling of fragments.

It can't work. It's a technological dead end.
#26 hopeless@mas.to:

@tante Well, well done for admitting that, demonstrably, the dog can play the piano. Now we are just talking about how well it plays.

FWIW, these LLMs have no need to be consistent with what happened in a previous context. The same LLM, in a new context, will usefully critique, find and fix flaws in what it itself did in the previous context.

The "slop" aspect of LLM output seems to come from blindly shipping whatever one context produced, when it could instead iterate as, e.g., a QA manager.
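The fresh-context critique loop described in this post can be sketched as plain control flow. This is only an illustration, not any vendor's API: `iterate_with_fresh_critic`, `toy_generate` and `toy_critique` are hypothetical names, and the two toy callables stand in for real LLM calls.

```python
# Sketch of the "fresh context as QA manager" loop described above.
# `generate` and `critique` stand in for LLM API calls; here they are
# plain callables so the control flow is the point, not any vendor SDK.

def iterate_with_fresh_critic(task, generate, critique, max_rounds=3):
    """Produce a draft, then repeatedly hand it to a *new* critic context.

    Each critique call sees only the task and the current draft, never the
    conversation that produced it, mimicking a fresh LLM context.
    """
    draft = generate(task)
    for _ in range(max_rounds):
        issues = critique(task, draft)   # fresh context: no prior history
        if not issues:
            break                        # critic is satisfied
        draft = generate(f"{task}\nFix these issues:\n{issues}")
    return draft

# Toy stand-ins: the "generator" fixes on request, the "critic" flags
# a flaw until the draft has been revised once.
def toy_generate(prompt):
    return "fixed draft" if "Fix these issues" in prompt else "first draft"

def toy_critique(task, draft):
    return "" if draft == "fixed draft" else "- missing error handling"

result = iterate_with_fresh_critic("write a parser", toy_generate, toy_critique)
```

The design point is that the critic function receives only the task and the draft, never the generation history, which is what keeps it from inheriting the original context's blind spots.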
