Both Meta & Microsoft have said they're shedding staff explicitly to free up cash flow to invest in AI;

Uncategorized · workers · 65 posts · 38 posters
  • david_chisnall@infosec.exchange wrote:

    @linuxgnome @ChrisMayLA6

    There's an element of FOMO. Microsoft was late with a mobile platform that worked nicely on capacitive touchscreens and, as a result, lost that market entirely. Though it turns out that wasn't a bad thing: they can sell Office on both iOS and Android and no one is actually making money from mobile operating systems (they are on the surrounding ecosystem, but the OS is the loss-making part that enables that and being a player in the ecosystem without paying that cost is often better).

    But there's a much bigger part of a need to grow.

    It's easy to grow when a product is useful and new. The IBM PC wasn't the first personal computer to be powerful enough to be useful, but it was around that time. When I was a small child, in a middle-class area, almost no one I knew owned a computer. My school had a few (less than one per 20 pupils). Going from there to Gates' goal of a computer on every desk allowed them to double sales for many years.

    Then the Internet came along. Gates said it was a passing fad and things like MSN (at the time, an OSP, not a web site) would replace it. But it then caused another decade of growth as every business went from needing a computer to needing a web presence. MS didn't get the lion's share of this, but still had a load of products (especially acquisitions like Hotmail) that grew along with this expansion.

    Then came two things at about the same time. One was a bunch of technologies (capacitive touchscreens, 3G mobile networks, better LiIon batteries, power-efficient ARM SoCs), which made mobile phones feasible. Going from a computer on every desk to a computer in every pocket allowed a load more doublings. Microsoft again didn't get the biggest part of this growth, but they rode that wave.

    The other thing that happened was that virtualisation on x86 became feasible. Xen showed that computers were fast enough for paravirtualisation to give you multiple useful virtual machines on a single computer, Intel and AMD responded with extensions to allow unmodified operating systems to run. This provided a path to consolidation in what became the cloud. Rather than buying a computer, you could rent a fraction of a computer, which would be cheaper since you didn't actually need 100% of a computer 100% of the time. Even if the provider charged you a 100% markup, you were probably using only 20% of a computer so paying for 40% of one was cheaper than buying a whole one. Especially if you were actually using 80-100% of a computer but only 20-25% of the time (e.g. during peak business hours).
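
The rent-versus-buy arithmetic in that paragraph can be checked directly. This is a sketch using the post's own hypothetical figures (a 100% markup and 20% average utilisation), not real cloud pricing:

```python
# Rent-vs-buy sketch, normalising the cost of owning one server to 1.0.
server_cost = 1.0
utilisation = 0.20        # you only need 20% of a machine on average
provider_markup = 1.0     # provider charges a 100% markup on raw capacity

# Renting: pay the marked-up price, but only for the fraction you use.
rent_cost = utilisation * (1 + provider_markup)
print(rent_cost)          # 0.4 -- well under the 1.0 cost of a dedicated machine

# Bursty load: ~90% of a machine, but only ~22.5% of the time.
bursty_rent = 0.90 * 0.225 * (1 + provider_markup)
print(round(bursty_rent, 3))  # 0.405 -- still cheaper than owning
```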

    Nadella was the lead in the cloud division when it went from being a weird thing to being one of the major revenue sources for the company.

    But the cloud has a problem. People's requirements for cloud storage and compute grow organically. You might need 20% more cloud stuff this year than you did last year. At the same time, the cost of compute and storage is dropping. Here's a fun graph of storage costs. From 2013 to 2018 (ignore the numbers after that, they're predictions and are nonsense), the cost of 1TB of SSD storage went from $694 to $107. To remain competitive, cloud prices needed to come down at the same rate. They didn't, so they're relying more on lock-in, but that doesn't get you new customers.
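
The quoted price points imply a steep annualised decline, which works out as a compound rate (taking the post's $694 and $107 per-TB figures at face value):

```python
# Annualised rate implied by $694/TB (2013) falling to $107/TB (2018).
start_price, end_price, years = 694.0, 107.0, 5

annual_change = (end_price / start_price) ** (1 / years) - 1
print(f"{annual_change:.1%}")  # -31.2% -- prices fell by roughly a third each year
```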

    Most of the growth in the cloud was not new compute demand, it was people moving things from on-premises deployments to the cloud. That's a finite (and nearly exhausted) market. That, combined with the need to lower prices over time to prevent companies moving back, is a problem. It's made worse by the fact that the biggest customers see the least benefit. If you're a large company with your own server rooms full of machines, the cost reductions of the cloud are negligible. If you're a small company with one server, moving to a cheaper system with built-in redundancy is a win. But getting each of those companies to move costs a lot.

    The cloud really needs a use case that has growing compute requirements. The push for 'big data' was starting to run up against both regulatory issues (GDPR was making data a liability, not an asset) and security problems (you get very bad press when you leak customer data that you have no real reason for holding). AI came along with a promise that customers would keep needing more and faster compute every year. The thing that the leadership at these companies missed is that, for this to make business sense, they also need to be willing to pay an increasing amount each year. And that means you need to deliver increasing productivity improvements each year.

    Delivering zero productivity increases while having to put up prices to customers is how we see the bubble start to burst.

    graydon@canada.masto.host wrote (#39):

    @david_chisnall The gold rush is over.

    MS, and quite a few other companies, are creatures of that gold rush, and not at all well-adapted to conditions where computing is infrastructure, rather than the shiny new hotness you haven't got yet. (Where "not well-adapted" tends to mean things like "willing to do nearly anything instead of adapt"; "AI" is a burn-the-world denial of reality.)

    @linuxgnome @ChrisMayLA6


      dfyx@social.helios42.de wrote (#40):

      @david_chisnall It is, once again, a solution looking for the right problem.

      LLMs seem to have some uses where they're better than other solutions (translation might be one) but those are too niche to sell them to everyone on the planet.

      So they try to sell them as search engines, copywriters, programmers and a dozen other things just to attract more companies even if LLMs are a poor choice for their needs.


        cstross@wandering.shop wrote (#41):

        @graydon @david_chisnall @linuxgnome @ChrisMayLA6 More to the point: the end is in sight for the annual gains Moore's Law accustomed everybody to—you can't build circuits smaller than atomic orbitals—but it has run for over 40 years, so everybody in a decision-making position has grown up expecting it to continue. Not so much in semiconductors, but everyone *else*: VCs, PE firms, software, the general public.

        The cluetrain is bound to run off the track and derail in an unploughed field.


          graydon@canada.masto.host wrote (#42):

          @cstross TSMC is working on going from 3 nm to 2 nm fabrication.

          On the one hand, that's a big change, percentage-wise.

          On the other hand, only TSMC is doing this because the entire world economy can afford at most one fab.

          On the third hand, it's not clear there's any actual advantage to making the change. There are almost certainly better things to do with that money. But "line must more tinyness!" is built into the whole process.

          @david_chisnall @linuxgnome @ChrisMayLA6
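
Taking the node names at face value (a caveat: "3 nm" and "2 nm" are marketing labels, not literal feature sizes), the "big change, percentage-wise" works out as:

```python
# Nominal shrink from a "3 nm" to a "2 nm" node, taken literally.
old_node, new_node = 3.0, 2.0

linear_shrink = 1 - new_node / old_node   # ~33% smaller linear dimension
area_ratio = (new_node / old_node) ** 2   # ~0.44x the area, if it scaled literally

print(f"{linear_shrink:.0%}", f"{area_ratio:.2f}")  # 33% 0.44
```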


            cstross@wandering.shop wrote (#43):

            @graydon @david_chisnall @linuxgnome @ChrisMayLA6

            The logical end-point after the node size bottoms out is going to be for the inherent deflation to become evident—fabs get amortized over time, so the product stops being premium and becomes a cash cow, and prices have to drop.

            Nvidia can't survive that. Intel can't survive that. They need something like the AI hyperscalers to keep demand high, but the demand is artificial, and actual consumer demand is soft if not soggy.

            Crash is inevitable.


              johnzajac@dice.camp wrote (#44):

              @graydon @david_chisnall @linuxgnome @ChrisMayLA6

              The bubble bursting will simply be their final breath being released into the sky. Blessedly.

              • chrismayla6@zirk.us wrote:

                Both Meta & Microsoft have said they're shedding staff explicitly to free up cash flow to invest in AI;

                on one level this is unemployment linked to technology, but it's a bit different from *actual* technological unemployment - the latter sees people losing jobs due to the deployment of technology to do their jobs. Microsoft & Meta, on the other hand, are sacking people to take a (bigger) punt on a business strategy that is yet to prove its transformation of productivity.

                #AI #workers
                h/t FT

                openrisk@mastodon.social wrote (#45):

                @ChrisMayLA6

                Oracle invoked the same argument.

                Extraordinary times whatever the interpretation.

                One possibility is that they don't think it's a risky move. How can they be so sure? Only if they know they have a stranglehold on users and can push "AI" in all eventualities.

                Another possibility is that they are actually "bust", not literally - as in bankrupt - but in terms of defending their astronomical valuations: the risky bets aim to avoid a massive correction.

                Time will tell I suppose...


                  chrismayla6@zirk.us wrote (#46):

                  @openrisk

                  I think your final possibility may be right - one last throw of the dice in a bid to avoid a 'correction' in their share price (see the warning today from the BoE about UK share prices, which is just as applicable to US ones, in my view).


                    mdm@mcnamarii.town wrote (#47):

                    @cstross @graydon @david_chisnall @linuxgnome @ChrisMayLA6

                    "you can't build circuits smaller than atomic orbitals"

                    Well, not with that attitude /s


                      brokar@mastodon.social wrote (#48):

                      @ChrisMayLA6 Concerning AI, I wonder how the politicians who celebrate and support AI, signing related bills without consequence, plan to rectify the predicted and inevitable loss of jobs when they later campaign in the next election on how many jobs they will create.
                      Better to ask them now which businesses they intend to create jobs in. Data cleanup for AI, instead of programming, bookkeeping and teaching?


                        cstross@wandering.shop wrote (#49):

                        @mdm @graydon @david_chisnall @linuxgnome @ChrisMayLA6 Well you *can* if you use muons instead of electrons, but then you have to do your computing inside a particle accelerator, and everything is radioactive and on fire


                          chrismayla6@zirk.us wrote (#50):

                          @Brokar

                          Ahhh.... but you're expecting our political class to think beyond the next electoral cycle there, aren't you, and we know they find that almost impossible (for a range of internal & external reasons)


                            terrybtwo@ohai.social wrote (#51):

                            @david_chisnall @linuxgnome@todon.eu @ChrisMayLA6 Agree. This also explains the desperation to hype “Artificial Intelligence” for just about every activity from shoelace tying to cosmology, when really it’s an occasionally useful trinket.


                              brokar@mastodon.social wrote (#52):

                              @ChrisMayLA6 one can still have hope 😆 And this is not only a US thing; it's worldwide. Like COVID: an epidemic.

                                • reggiehere@mastodon.social wrote:

                                @HarriettMB

                                Yes. I've mentioned this before, but US foreign policy is heavily biased towards US big tech and cross-border data transfers to the degree that it's becoming a geopolitical tool akin to hosting US military bases.

                                @TCatInReality @ChrisMayLA6

tcatinreality@mastodon.social
                                wrote last edited by
                                #53

                                @ReggieHere @HarriettMB @ChrisMayLA6

                                Nice analogy

• hub@cosocial.ca

@HarriettMB @TCatInReality @ChrisMayLA6 @ReggieHere our cowardly new (as of last year) banker prime minister cancelled the DST (it’s Canada). What have we gained from it a year on? Austerity. More threats from the Orange shitstain.

tcatinreality@mastodon.social
                                  wrote last edited by
                                  #54

                                  @hub @HarriettMB @ChrisMayLA6 @ReggieHere

                                  Time to bring back the tax then 😁

• cstross@wandering.shop

                                    @graydon @david_chisnall @linuxgnome @ChrisMayLA6 More to the point: the end is in sight for the annual gains Moore's Law accustomed everybody to—you can't build circuits smaller than atomic orbitals—but it has run for over 40 years, so everybody in a decision-making position has grown up expecting it to continue. Not so much in semiconductors, but everyone *else*: VCs, PE firms, software, the general public.

                                    The cluetrain is bound to run off the track and derail in an unploughed field.

david_chisnall@infosec.exchange
                                    wrote last edited by
                                    #55

                                    @cstross @graydon @linuxgnome @ChrisMayLA6

Well, kind of. Moore's law is about the size of IC you can build assuming a fixed investment (the latter isn't explicitly stated in the law, but it is an underlying assumption in the paper). Increases in yield contribute as well, as do more mature processes coming down in price over time. So do things like 3D stacking and chiplets (chiplets, in particular, let you build smaller chips and get the yield benefits, but assemble them into more complex complete chips).

                                    Moore's second law is a bit more relevant because it discusses the doubling of fab costs for each new process node. That's predicated on making enough money from the previous generation to justify the investment. That's why we've seen so much consolidation: you need enormous economies of scale to be able to afford the R&D costs. Once you hit 'good enough' performance for 90% of use cases, funding the R&D for the next process out of the 10% that needs the higher performance is hard, if not impossible. Once you reach 99%, it's definitely impossible.
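The second-law point compounds quickly. As a back-of-the-envelope sketch (the starting cost and the strict per-node doubling are illustrative assumptions, not real figures from the post):

```python
# Moore's second law (sometimes called Rock's law): fab cost roughly
# doubles with each new process node. Compounding from an assumed
# $1B starting point shows why only a handful of firms can still play.
def fab_cost_usd(initial_usd: float, generations: int) -> float:
    """Cost after n node transitions, assuming strict doubling per node."""
    return initial_usd * 2 ** generations

# Five nodes later, the assumed $1B fab costs $32B.
print(fab_cost_usd(1e9, 5))  # 32000000000.0
```

With a shrinking share of customers needing the newest node, the revenue base that has to fund that exponential cost grows thinner each generation, which is the consolidation pressure described above.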

                                    Somewhere, I have a copy of the issue of BYTE where the cover story is the new 1nm process (note: nm, not µm). It confidently predicts the end of Moore's Law within a little over a decade.

We hit the end of Dennard Scaling around 2007, and that was a far bigger shock than the slowing of Moore's Law. Prior to that, shrinking a die had given you a commensurate decrease in leakage current. Your clock frequency is determined by the signal propagation delay (one clock cycle at the maximum frequency supported by the part is the time taken for a signal to propagate along the critical path). As you make transistors smaller, the amount of work you can do in one cycle increases, because you can fit more logic into each cycle.

                                    This is how we're able to run our first test chip at 512 MHz on a 22nm process, even though it's a microcontroller with a three-stage pipeline, whereas Intel needed five stages (and a lot of engineering work) to break 100 MHz with the 800nm process.
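The critical-path point can be sketched numerically. The gate delay and path depth below are illustrative assumptions, not figures for either of the chips mentioned:

```python
# Maximum clock frequency is bounded by the critical-path delay:
# one cycle must be long enough for a signal to traverse the longest
# register-to-register chain of logic.
def max_clock_hz(gate_delay_s: float, critical_path_depth: int) -> float:
    """f_max = 1 / (critical-path propagation delay)."""
    return 1.0 / (gate_delay_s * critical_path_depth)

# Assumed numbers: 100 ps per gate, 20 gates on the critical path
# -> 2 ns cycle time -> 500 MHz.
print(max_clock_hz(100e-12, 20) / 1e6, "MHz")  # 500.0 MHz
```

Faster transistors shorten the per-gate delay, so the same pipeline depth supports a higher clock, which is why a simple three-stage design on a modern node can outpace a far more heavily engineered design on an old one.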

But prior to around 2007, that increase in clock speeds came for free with respect to power. With newer processes, the leakage current is higher, and that means you need to increase the voltage more to increase the clock speed. And that is what gives us power problems.
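A rough CMOS power model makes the trade-off concrete. The model and all numbers below are simplifying assumptions for illustration, not from the post:

```python
# Simplified CMOS power model:
#   dynamic power  P_dyn  = C * V^2 * f   (switching)
#   leakage power  P_leak = V * I_leak    (always on)
def total_power_w(c_farads: float, v_volts: float,
                  f_hz: float, i_leak_amps: float) -> float:
    return c_farads * v_volts ** 2 * f_hz + v_volts * i_leak_amps

# Classic Dennard scaling: shrink by s -> C and V fall by 1/s, f rises
# by s, so dynamic power per transistor drops by 1/s^2 and power
# density stays constant. Illustrative numbers with leakage ignored:
s = 1.4
base = total_power_w(1e-9, 1.0, 1e9, 0.0)            # 1.0 W
scaled = total_power_w(1e-9 / s, 1.0 / s, s * 1e9, 0.0)
print(base, scaled)  # scaled is base / s^2, about 0.51 W
```

Once leakage stops falling with each shrink, the only lever left for more frequency is voltage, and the V² term makes that expensive: that is the post-2007 power wall in one equation.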

There are a few interesting experimental processes that look like they might get back to much lower leakage, which would allow chips of similar sizes to today's to run at hundreds of GHz in the same power budget, if they work. We've had some initial discussions with some folks who built a small fab around one of these. That has no impact on Moore's First Law as it's actually written, but it would have a big impact on the common informal understanding of Moore's Law.

• mgleadow@mastodon.green

                                      @TCatInReality @HarriettMB @ChrisMayLA6 the machines will decide for us. Obviously.

tcatinreality@mastodon.social
                                      wrote last edited by
                                      #56

                                      @mgleadow @HarriettMB @ChrisMayLA6

                                      Early tests show AI making some very dangerous decisions

                                      King's study finds AI chose nuclear signalling in 95% of simulated crises | King's College London

                                      Artificial intelligence (AI) models used for a simulated war game escalated conflicts by threatening nuclear strikes in 95% of scenarios, according to new research from King’s College London.


                                      King's College London (www.kcl.ac.uk)

• chrismayla6@zirk.us

                                        @openrisk

                                        I think the final post may be right - one last throw of the dice in a bid to avoid a 'correction' in their share price (see the warning today from the BoE about UK share prices, which is just as applicable to US ones, in my view)

openrisk@mastodon.social
                                        wrote last edited by
                                        #57

@ChrisMayLA6 yes, I saw an economist today talking about the Wile E. Coyote effect (in relation to the Iran war and the oil crisis). People seem to want their stock market party to go on forever, decoupling it from annoying reality.

                                        But in the end, forecasting is hard, especially when it is about the future 🤣. A tech bubble burst has been predicted several times already. The monopoly position of those companies does give them remarkable resilience...

• graydon@canada.masto.host

                                          @cstross TSMC is working on going from 3 nm to 2 nm fabrication.

                                          On the one hand, that's a big change, percentage-wise.

                                          On the other hand, only TSMC is doing this because the entire world economy can afford at most one fab.

On the third hand, it's not clear there's any actual advantage to making the change. There are almost certainly better things to do with that money. But "line must more tinyness!" is built into the whole process.

                                          @david_chisnall @linuxgnome @ChrisMayLA6

kurtmrufa@dragon.style
                                          wrote last edited by
                                          #58

@graydon @cstross @david_chisnall @linuxgnome @ChrisMayLA6 The percentage is only in the marketing. There are only small improvements in power, performance, etc. (around 10-15%), but with a doubling in mask costs. SRAMs and wires are not scaling, and logic gates are no longer getting any cheaper to print.
