The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit.

42 Posts 29 Posters 56 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • raymaccarthy@mastodon.ie

    @343max @tante
    Then, Max, you have no understanding of LLM/Gen AI, or maybe of specifying requirements, designing systems (modules, APIs etc.) and then writing the code, testing & debugging. Any project of real size needs a team & management.
    There is also documentation.
    Actually writing the code is the easiest bit & the only bit the current LLM/Gen AI does, and it does it badly as it relies on code scraped from elsewhere & statistical shuffling of fragments.
    It can't work. It's a technological dead end.

    343max@mastodon.social
    #28

    @raymaccarthy @tante Oh the "someone disagrees with me so they must be stupid" argument! Amazing. Please go away now.

    • raymaccarthy@mastodon.ie

      @gklka @tante @map
      I've a bridge over the river Shannon you can buy.
      "make wonderful, creative, unique software using AI."
      No. An LLM can't create at all, and if it actually works and meets the spec, it's likely copied.

      gklka@mastodon.social
      #29

      @raymaccarthy @tante @map AI can't create. You create; AI just implements it. I know it's hard to digest, but this is everyday work for a lot of us now.

      • tante@tldr.nettime.org

        The question is not whether you can create software using LLMs - you can - most software is just boring CRUD shit.
        But you do pay a hefty price: In lowering quality (security issues, less maintainable), in skill decay in the people "guiding" the stochastic parrots, etc.

        It's not "can 'AIs' create software?" but "are we willing to accept worse software running more and more of our lives?"

        johnzajac@dice.camp
        #30

        @tante

        I mean, you can also "build a house" by using deck screws to connect some wet Doug fir 2x4s into a "frame" and then staple on some drywall and siding and drape the whole thing in a plastic tarp.

        You will die when it falls on you, but for a time, it was a "house".

        • tante@tldr.nettime.org

          @map Exactly. In a way, accepting responsibility for the code one puts in front of people is accepting the associated duties of care towards those people.

          8r3n7@mstdn.ca
          #31

          @tante @map LLMs are another way to avoid putting skin in the game. Which is the whole point, if you’re a sociopath (or play one on TV). Privatize gains, socialize losses.

          • raymaccarthy@mastodon.ie
            #32

            @343max @tante
            I've designed & written SW for decades, and done physical AI courses as well as studying it.
            What's your qualification for your amazing claims, Max?
            Expert systems were the AI of the 1980s, relying on good design and curation of experts' knowledge. They were too expensive to build, and fragile.
            I forecast the idea of the LLM 20+ years ago. Chatbots then had data encoded in the program (ELIZA, ALICE etc.). I suggested a statistical engine using the Internet as data. A toy.

            • raymaccarthy@mastodon.ie
              #33

              @gklka @tante @map
              A compiler implements it. The LLM/Gen AI is a rubbish search engine, database and statistical engine. It regurgitates based on prompts, not formal specifications.

              • gklka@mastodon.social
                #34

                @raymaccarthy @tante @map Ok, feel free to think whatever you want.

                • 343max@mastodon.social
                  #35

                  @raymaccarthy You are absolutely right, I really shouldn't trust my own day-to-day experience and the experience of all the people I trust over your 20-year-old predictions. We are all wrong, our eyes betrayed us, please help us see! Oh please!

                  • amorpheus@kind.social
                    #36

                    @raymaccarthy @343max @tante AI has, in theory, the same potential as a calculator: it can make tasks easier, but it does imply skill degradation in a certain field. Solving nth-order differential equations back in Blaise Pascal's time was frustrating. So, since Schickard, automating such tasks has helped humanity spend its time on the bigger picture instead of grinding through repetitive tasks. New technologies have always shifted human skills to a new domain.

                    • budududuroiu@hachyderm.io
                      #37

                      @tante The best engineers I know just became more ambitious, and so should all of us.

                      I'll keep repeating this: there are tonnes of proprietary binary blobs in all of our tech. You can shout from the rooftops about how much you love your /e/OS phone, but if your phone's modem relies on a proprietary driver, it's pretty much worthless as "resistance against big tech". European digital sovereignty is equally worthless.

                      LLMs are good at staring at hexdumps, humans aren't. Use their advantage to build actually open tech.

                      > But you do pay a hefty price: In lowering quality (security issues, less maintainable), in skill decay in the people "guiding" the stochastic parrots, etc.

                      Skill issue, idk what more to say. I don't find it any different to managing juniors and reviewing their PRs. Bad code is bad code.

                      • tante@tldr.nettime.org

                        Here's the thing: I believe that you deserve to have access to high quality products and services. You deserve to use products and services that are safe, secure, well-designed and not destroying the ecological, informational or social environment.

                        benoitb@framapiaf.org
                        #38

                        @tante

                        +1

                        > "destroying the ecological, informational or social environment"

                        As good as this generated code may be, it remains unacceptable because of that. And that should be the ultimate reason: code quality may rise, but at the cost of more destruction.

                        • kevinhuigens@genealysis.social
                          #39

                          @tante

                          5 years retired from IT but I still remember CRUD

                          • dalias@hachyderm.io
                            #40

                            @tante Hot take: there should be no "software writing" involved in CRUD to begin with. Just some declarative stuff for your specific application and fully generic code.
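
A minimal sketch of what that "declarative stuff plus fully generic code" split could look like. All names here (`SCHEMA`, `CrudStore`) are hypothetical, invented for illustration: the application author writes only the schema dictionary, and every CRUD operation lives in reusable generic code.

```python
# Sketch of declarative CRUD: the application supplies only a data
# description; no imperative per-resource code is written at all.
from itertools import count

# Declarative part: one dict per application. No behaviour here.
SCHEMA = {
    "users": {"fields": {"name": str, "email": str}},
    "posts": {"fields": {"title": str, "body": str}},
}

class CrudStore:
    """Fully generic in-memory CRUD engine driven entirely by a schema."""

    def __init__(self, schema):
        self.schema = schema
        self.tables = {name: {} for name in schema}   # resource -> {id: row}
        self.ids = {name: count(1) for name in schema}  # per-resource id sequence

    def _validate(self, resource, data):
        # Reject fields the schema doesn't declare, or wrongly typed values.
        fields = self.schema[resource]["fields"]
        for key, value in data.items():
            if key not in fields or not isinstance(value, fields[key]):
                raise ValueError(f"bad field {key!r} for {resource}")

    def create(self, resource, data):
        self._validate(resource, data)
        new_id = next(self.ids[resource])
        self.tables[resource][new_id] = dict(data)
        return new_id

    def read(self, resource, item_id):
        return self.tables[resource][item_id]

    def update(self, resource, item_id, data):
        self._validate(resource, data)
        self.tables[resource][item_id].update(data)

    def delete(self, resource, item_id):
        del self.tables[resource][item_id]
```

Real systems in this spirit swap the in-memory dicts for a database and derive HTTP endpoints from the same schema; the point of the sketch is only that the per-application part stays purely declarative.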

                            • oytis@mastodon.social
                              #41

                              @tante Between development speed and software quality, the industry chose speed long before LLMs came to be. It would no doubt be better for all of us if we all refused to let LLMs write software. But whether not using them is viable for a specific company, or a specific individual, in a competitive environment remains an open question.

                              • rolle@mementomori.social
                                #42

                                @gklka @raymaccarthy @tante @map I agree with GK on this. Not all AI is the same, and it's definitely not black and white. With the right expertise and detailed specs, you can achieve great results while keeping the code maintainable and retaining ownership. I really dislike the mindset that everything has to be either absolutely good or 100% bad.
