Both Meta & Microsoft have said they're shedding staff explicitly to free up cash flow to invest in AI;
on one level this is unemployment linked to technology, but it's a bit different from *actual* technological unemployment - the latter sees people losing jobs due to the deployment of technology to do their jobs. Microsoft & Meta, on the other hand, are sacking people to take a (bigger) punt on a business strategy that has yet to prove its transformation of productivity.
-
@TCatInReality @ReggieHere @ChrisMayLA6 Today there’s a headline that #Trump is threatening tariffs on the #UK if they don’t drop the digital services tax; a couple of days ago he was claiming the relationship between the UK & #USA would be repaired because he likes the King. Typical Trump. He gets what he wants [Royal visit] but will continue to bully, demean and dominate the subject of his ire. There’s no pleasing this


@HarriettMB @TCatInReality @ChrisMayLA6 @ReggieHere our cowardly new (as of last year) banker-turned-prime-minister cancelled the DST (this is Canada). What have we gained from it a year on? Austerity. More threats from the Orange shitstain.
-
Why are governments embracing AI?
Because they value corporate profits over the wellbeing of their citizenry (IMO). They fear a massive financial drain if they aren't competitive with the US in encouraging AI replacement of human workers.
Also, they fear even greater dependence on US-based technology.
But what will society be with mass unemployment and hallucinating machines making most decisions?
@TCatInReality @HarriettMB @ChrisMayLA6 the machines will decide for us. Obviously.
-
@ReggieHere @TCatInReality @ChrisMayLA6 I’d like to see them out of every country as well, especially all of the EU and UK.
You'd not be alone. The US must have serious concerns about Scottish independence on that score.
-
There's an element of FOMO. Microsoft was late with a mobile platform that worked nicely on capacitive touchscreens and, as a result, lost that market entirely. Though it turns out that wasn't a bad thing: they can sell Office on both iOS and Android, and no one is actually making money from mobile operating systems (they make money on the surrounding ecosystem, but the OS is the loss-making part that enables that, and being a player in the ecosystem without paying that cost is often better).
But there's a much bigger part of a need to grow.
It's easy to grow when a product is useful and new. The IBM PC wasn't the first personal computer to be powerful enough to be useful, but it was around that time. When I was a small child, in a middle-class area, almost no one I knew owned a computer. My school had a few (fewer than one per 20 pupils). Going from there to Gates' goal of a computer on every desk allowed Microsoft to double sales for many years.
Then the Internet came along. Gates said it was a passing fad and things like MSN (at the time, an online service provider, not a web site) would replace it. But it then drove another decade of growth as every business went from needing a computer to needing a web presence. MS didn't get the lion's share of this, but still had a load of products (especially acquisitions like Hotmail) that grew along with this expansion.
Then came two things at about the same time. One was a bunch of technologies (capacitive touchscreens, 3G mobile networks, better Li-ion batteries, power-efficient ARM SoCs) which made modern smartphones feasible. Going from a computer on every desk to a computer in every pocket allowed a load more doublings. Microsoft again didn't get the biggest part of this growth, but they rode that wave.
The other thing that happened was that virtualisation on x86 became feasible. Xen showed that computers were fast enough for paravirtualisation to give you multiple useful virtual machines on a single computer, and Intel and AMD responded with extensions to allow unmodified operating systems to run. This provided a path to consolidation in what became the cloud. Rather than buying a computer, you could rent a fraction of a computer, which would be cheaper since you didn't actually need 100% of a computer 100% of the time. Even if the provider charged you a 100% markup, you were probably using only 20% of a computer, so paying for 40% of one was cheaper than buying a whole one. Especially if you were actually using 80-100% of a computer but only 20-25% of the time (e.g. during peak business hours).
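As a back-of-the-envelope sketch of that rent-versus-buy arithmetic (in Python, using only the illustrative utilisation and markup figures from the paragraph above, not real pricing):

    # Rent a fraction of a machine vs. buy a whole one, using the made-up numbers above.
    cost_of_whole_server = 10_000   # hypothetical cost of owning a server outright
    provider_markup = 1.0           # provider charges 100% on top of their own cost

    def rental_cost(average_utilisation):
        # You only pay for the fraction you use, plus the provider's markup.
        return cost_of_whole_server * average_utilisation * (1 + provider_markup)

    print(rental_cost(0.20))         # steady 20% use: pay for 40% of a machine = 4000
    print(rental_cost(0.90 * 0.25))  # 90% busy but only 25% of the time: 4500
    # Both come in well under the 10000 cost of owning the whole machine.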
Nadella was the lead in the cloud division when it went from being a weird thing to being one of the major revenue sources for the company.
But the cloud has a problem. People's requirements for cloud storage and compute grow organically. You might need 20% more cloud stuff this year than you did last year. At the same time, the cost of compute and storage is dropping. Here's a fun graph of storage costs. From 2013 to 2018 (ignore the numbers after that, they're predictions and are nonsense), the cost of 1TB of SSD storage went from $694 to $107. To remain competitive, cloud prices needed to come down at the same rate. They didn't, so they're relying more on lock-in, but that doesn't get you new customers.
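For a rough sense of how fast cloud prices would have needed to fall to track that, here is the implied annual decline from those two data points (the only inputs are the $694 and $107 figures quoted above):

    # Implied annual price decline for 1TB of SSD, 2013 -> 2018.
    start_price, end_price, years = 694.0, 107.0, 5
    annual_change = (end_price / start_price) ** (1 / years) - 1
    print(f"{annual_change:.1%} per year")   # roughly -31% per year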
Most of the growth in the cloud was not new compute demand; it was people moving things from on-premises deployments to the cloud. That's a finite (and nearly exhausted) market. That, combined with the need to lower prices over time to prevent companies moving back, is a problem. It's made worse by the fact that the biggest customers see the least benefit. If you're a large company with your own server rooms full of machines, the cost reductions of the cloud are negligible. If you're a small company with one server, moving to a cheaper system with built-in redundancy is a win. But getting each of those companies to move costs a lot.
The cloud really needs a use case that has growing compute requirements. The push for 'big data' was starting to run up against both regulatory issues (GDPR was making data a liability, not an asset) and security problems (you get very bad press when you leak customer data that you have no real reason for holding). AI came along with a promise that customers would keep needing more and faster compute every year. The thing that the leadership at these companies missed is that, for this to make business sense, customers also need to be willing to pay an increasing amount each year. And that means you need to deliver increasing productivity improvements each year.
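To put toy numbers on that last point (the 30% growth rate is an assumption for illustration, not anyone's actual pricing):

    # If AI compute spend has to grow ~30% a year for the cloud growth story to work,
    # the customer's bill compounds quickly - and the productivity benefit must too.
    spend, growth = 100.0, 0.30     # arbitrary year-1 spend, assumed annual growth
    for year in range(1, 6):
        print(f"year {year}: spend {spend:.0f}")
        spend *= 1 + growth
    # By year 5 the bill is ~2.9x year 1; if the delivered productivity gain hasn't
    # grown by a similar factor, the customer stops paying and the growth stops.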
Delivering zero productivity increases while having to put up prices to customers is how we see the bubble start to burst.
@david_chisnall The gold rush is over.
MS, and quite a few other companies, are creatures of that gold rush, and not at all well-adapted to conditions where computing is infrastructure, rather than the shiny new hotness you haven't got yet. (Where "not well-adapted" tends to mean things like "willing to do nearly anything instead of adapt"; "AI" is a burn-the-world denial of reality.)
-
@david_chisnall It is, once again, a solution looking for the right problem.
LLMs seem to have some uses where they're better than other solutions (translation might be one) but those are too niche to sell them to everyone on the planet.
So they try to sell them as search engines, copywriters, programmers and a dozen other things just to attract more companies even if LLMs are a poor choice for their needs.
-
@graydon @david_chisnall @linuxgnome @ChrisMayLA6 More to the point: the end is in sight for the annual gains Moore's Law accustomed everybody to—you can't build circuits smaller than atomic orbitals—but it has run for over 40 years, so everybody in a decision-making position has grown up expecting it to continue. Not so much in semiconductors, but everyone *else*: VCs, PE firms, software, the general public.
The cluetrain is bound to run off the track and derail in an unploughed field.
-
@cstross TSMC is working on going from 3 nm to 2 nm fabrication.
On the one hand, that's a big change, percentage-wise.
On the other hand, only TSMC is doing this, because the entire world economy can afford at most one leading-edge fab.
On the third hand, it's not clear there's any actual advantage to making the change. There are almost certainly better things to do with that money. But "line must more tinyness!" is built into the whole process.
-
@graydon @david_chisnall @linuxgnome @ChrisMayLA6
The logical end-point after the node size bottoms out is going to be for the inherent deflation to become evident—fabs get amortized over time, so the product stops being premium and becomes a cash cow, and prices have to drop.
Nvidia can't survive that. Intel can't survive that. They need something like the AI hyperscalers to keep demand high, but the demand is artificial, and actual consumer demand is soft if not soggy.
Crash is inevitable.
-
@graydon @david_chisnall @linuxgnome @ChrisMayLA6
The bubble bursting will simply be their final breath being released into the sky. Blessedly.
-
Oracle invoked the same argument.
Extraordinary times whatever the interpretation.
One possibility is that they don't think it's a risky move. How can they be so sure? Only if they know they have a stranglehold on users and can push "AI" in all eventualities.
Another possibility is that they are actually "bust", not literally - as in bankrupt - but in terms of defending their astronomical valuations: the risky bets aim to avoid a massive correction.
Time will tell I suppose...
-
I think the final post may be right - one last throw of the dice in a bid to avoid a 'correction' in their share price (see the warning today from the BoE about UK share prices, which is just as applicable to US ones, in my view)
-
@cstross @graydon @david_chisnall @linuxgnome @ChrisMayLA6
"you can't build circuits smaller than atomic orbitals"
Well, not with that attitude /s
-
@ChrisMayLA6 Concerning AI, I wonder how the politicians who celebrate and support AI, and sign the related bills without consequence, plan to make up for the predicted and inevitable loss of jobs, when they'll later be campaigning in the next election on how many jobs they will create.
Better to ask them which businesses they think those jobs will be created in. Data cleanup for AI instead of programming, bookkeeping and teaching?
-
@mdm @graydon @david_chisnall @linuxgnome @ChrisMayLA6 Well you *can* if you use muons instead of electrons, but then you have to do your computing inside a particle accelerator and everything is radioactive and on fire
-
Ahhh.... but you're expecting our political class to think beyond the next electoral cycle there, aren't you, and we know they find that almost impossible (for a range of internal & external reasons)
-
@david_chisnall @linuxgnome@todon.eu @ChrisMayLA6 Agree. This also explains the desperation to hype “Artificial Intelligence” for just about every activity from shoelace tying to cosmology, when really it’s an occasionally useful trinket.
-
@ChrisMayLA6 one can still have hope
Since this is also not a US-only thing. This is worldwide. Like COVID. An epidemic.
-
Yes. I've mentioned this before, but US foreign policy is heavily biased towards US big tech and cross-border data transfers to the degree that it's becoming a geopolitical tool akin to hosting US military bases.
@ReggieHere @HarriettMB @ChrisMayLA6
Nice analogy
-
@hub @HarriettMB @ChrisMayLA6 @ReggieHere
Time to bring back the tax then

-
@cstross @graydon @linuxgnome @ChrisMayLA6
Well, kind of. Moore's law is about the size of IC you can build assuming a fixed investment (the latter isn't explicitly stated in the law, but it is an underlying assumption in the paper). Increases in yield contribute as well, as do more mature processes coming down in price over time. So do things like 3D stacking and chiplets (chiplets, in particular, let you build smaller chips and get the yield benefits, but assemble them into more complex complete chips).
Moore's second law is a bit more relevant because it discusses the doubling of fab costs for each new process node. That's predicated on making enough money from the previous generation to justify the investment. That's why we've seen so much consolidation: you need enormous economies of scale to be able to afford the R&D costs. Once you hit 'good enough' performance for 90% of use cases, funding the R&D for the next process out of the 10% that needs the higher performance is hard, if not impossible. Once you reach 99%, it's definitely impossible.
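A toy model of that squeeze (the cost doubling is Moore's second law as described above; the shrinking share of customers who need the new node is an invented illustration):

    # Each new node costs twice as much to develop, while the slice of the market
    # that still needs the extra performance shrinks. All numbers are illustrative.
    node_cost, premium_share = 1.0, 1.0
    for node in range(1, 6):
        print(f"node {node}: cost x{node_cost:.0f}, premium share {premium_share:.0%}, "
              f"share per unit cost {premium_share / node_cost:.3f}")
        node_cost *= 2          # Moore's second law: fab/R&D cost doubles per node
        premium_share *= 0.6    # assumed: fewer customers need the newest process
    # The ratio collapses, which is why the leading edge consolidated to one or two fabs.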
Somewhere, I have a copy of the issue of BYTE where the cover story is the new 1µm process (note: µm, not nm). It confidently predicts the end of Moore's Law within a little over a decade.
We hit the end of Dennard Scaling around 2007 and that was a far bigger shock than the slowing of Moore's Law. Prior to that, shrinking a die had given you a commensurate decrease in power per transistor, with leakage current staying negligible. Your clock frequency is determined by the signal propagation delay (one clock cycle at the maximum frequency supported by the part is the time taken for a signal to propagate along the critical path). As you make transistors smaller, they also switch faster, so the amount of work you can do in one cycle goes up because you can fit more logic into the same delay budget.
This is how we're able to run our first test chip at 512 MHz on a 22nm process, even though it's a microcontroller with a three-stage pipeline, whereas Intel needed five stages (and a lot of engineering work) to break 100 MHz with the 800nm process.
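As a rough worked example of that critical-path relationship (the cycle times below are just back-calculated from the frequencies mentioned, not measured figures for either chip):

    # Maximum clock frequency is the reciprocal of the critical-path delay.
    def cycle_time_ns(freq_mhz):
        return 1_000 / freq_mhz    # one clock period, in nanoseconds

    print(cycle_time_ns(512))   # ~1.95 ns per cycle for the 512 MHz microcontroller
    print(cycle_time_ns(100))   # 10 ns per cycle for a 100 MHz part
    # Gates on a 22nm process are so much faster than on 800nm that far more logic
    # fits inside ~2 ns now than fit inside 10 ns then - hence fewer pipeline stages.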
But prior to around 2007, that increase in clock speed came for free with respect to power. With newer processes, the leakage current is higher, and you need to increase the voltage more to increase the clock speed. And that is what gives us power problems.
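The reason that hurts is that dynamic power goes roughly as C·V²·f (plus a growing leakage term), so a clock bump that also needs a voltage bump costs disproportionately. A quick sketch with invented numbers:

    # Dynamic CMOS power is roughly P = C * V^2 * f (leakage ignored here).
    # The capacitance, voltages and frequencies are invented for illustration.
    def dynamic_power(c_farads, v_volts, f_hz):
        return c_farads * v_volts ** 2 * f_hz

    base  = dynamic_power(1e-9, 0.9, 2e9)   # baseline: 0.9 V at 2 GHz
    boost = dynamic_power(1e-9, 1.1, 3e9)   # 1.5x the clock, but needs 1.1 V
    print(boost / base)                     # ~2.24x the power for 1.5x the speed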
There are a few interesting experimental processes that look like they might get back to much lower leakage, which would allow chips of similar sizes to today's to run at hundreds of GHz in the same power budget, if they work. We've had some initial discussions with some folks who built a small fab around one of these. That has no impact on Moore's First Law as it's actually written, but it would have a big impact on the common informal understanding of Moore's Law.