we joke that when the AI bubble pops and the managers can't afford the chatbot any more, the surviving companies will hire the people who know how shit works to clean up

Uncategorized · 64 Posts · 51 Posters · 36 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • fulk_it@mastodon.social

    @davidgerard I guess it depends on how much damage is done to internal codebases. Most of the time we'll prefer to start over from scratch is my guess. New winners and losers will emerge from that rubble. And the big companies will say they can't fail, or grumble, grumble national security.

    • gimulnautti@mastodon.green #61

    @Fulk_It @davidgerard For sure it will be more ”efficient” to run a codebase that matches both the tokeniser and the training data of the model as closely as possible.

    You would need a more expensive model to ”jump over the hoops” of your legacy codebase, which breaks every time the context window is compacted.

    • linguacelta@toot.wales #62

      @jalefkowit

      I have some optimism here that we won't actually lose all the programmers, because so many of us do it for the fun and the intellectual challenge. There's little of that in letting an LLM generate bad code on your behalf. So the programmers who are left will be the ones who genuinely enjoy it - probably a lot of hobbyists and open-sourcers alongside industry devs who have enough job security to resist using LLMs (which they know will just get in their way).

      • davidgerard@circumstances.run

        we joke that when the AI bubble pops and the managers can't afford the chatbot any more, the surviving companies will hire the people who know how shit works to clean up

        but this is of course optimistic. observed behaviour is that they will instead do the stupidest and shortest-term thing they can do instead of ever doing it properly.

        so what do you envision this might be?

        for clarity, i think when the AI bubble pops, which I place as some time next year at the latest - and you can hear the screeching noises in 2026 - the current recession signs will turn into a full Great Depression 2, so those surviving companies will also be doing not so great

        • frang@meow.social #63

        @davidgerard let's see.. skilled contractors will be too expensive.. their previous employees will tell them some variation of "die in a fire"... So.. untrained, unsupervised, entry-level people paid as close to minimum wage as they can manage. They will not have any idea how anything works. Nor will they be able to understand the existing code. So, instead of fixing what is there, they will slather layer upon layer of new (also buggy) code on top of the existing mess to try to patch/correct the problems after the fact.

        (Note: I saw exactly this happen back in the 90s.. so there is an existence proof)

          • davidgerard@circumstances.run (quoting the same post as above)

          • foolishowl@social.coop #64

          @davidgerard I think the "AI" bubble is one of the more obvious aspects of a long-developing system of social crises coming to a head. "GenAI" is bent on absorbing everything and turning it into slime.

          I think there may be opportunities coming out of its inevitable collapse. Part of the difficulty for any radical movement has been that any success gets co-opted or integrated; nothing remains ours. A positive response to the collapse of the "AI" bubble would have to be concerned with things like authenticity and human dignity, with resisting that co-optation.

          In cyberpunk, the point is the oppressed survive. We want thriving, not just survival, of course.

          There are also the reactionary responses, fantasies of a return to an idealized past. Fascism is a species of reaction that embraces aspects of the new, but older forms of reaction will likely become more prominent.

          I imagine the decades to come will be filled with grief and profound loss, but also some hope, if we work towards it.

          • em0nm4stodon@infosec.exchange shared this topic