Both Meta & Microsoft have said they're shedding staff explicitly to free up cash flow to invest in AI;
on one level this is unemployment linked to technology, but it's a bit different from *actual* technological unemployment - the latter sees people losing jobs because technology is deployed to do their jobs. Microsoft & Meta, on the other hand, are sacking people to take a (bigger) punt on a business strategy that has yet to prove it transforms productivity.
@ChrisMayLA6 "One of these things is not like the other"
-
@ChrisMayLA6 The problem is, they have nothing else to induce growth. So they throw money at a bet and hope it will work.
@prefec2 If there are no longer any users willing to use those substandard products, the system will shut down immediately.
-
@ReggieHere @HarriettMB @ChrisMayLA6
Correct. Cory Doctorow has written about the decades of US trade policy that pushes US tech dominance.
But Trump has made it clear to every nation that relying on America (for anything) is doomed to fail.
@TCatInReality @ReggieHere @ChrisMayLA6 Today there’s a headline that #Trump is threatening tariffs on the #UK if they don’t drop the digital services tax; a couple of days ago he was claiming the relationship between the UK & #USA would be repaired because he likes the King. Typical Trump. He gets what he wants [Royal visit] but will continue to bully, demean and dominate the subject of his ire. There’s no pleasing this
-
Yes. I've mentioned this before, but US foreign policy is heavily biased towards US big tech and cross-border data transfers to the degree that it's becoming a geopolitical tool akin to hosting US military bases.
-
@HarriettMB @TCatInReality @ReggieHere @ChrisMayLA6 Let him tariff away. Americans foot the bill. He's digging his own grave. He's impenetrably stupid.
-
@ReggieHere @TCatInReality @ChrisMayLA6 I’d like to see them out of every country as well, especially all of the EU and UK.
-
@ChrisMayLA6 So humans are progressively shut out of economic as well as semantic loops, the algorithms talk to and copy each other... A weaving machine so perfect it no longer needs thread, nor makes cloth? https://thesinisterscience.wordpress.com/2021/01/11/frederik-pohls-mass-consumer-1-the-midas-plague/
@SusiArnott @ChrisMayLA6
Whatever you think of Pohl’s story, “The Midas Plague” seems an entirely apt description of the late stage capitalism we are currently suffering. Its greed will destroy everything, just as King Midas’s did.
#enshittification
-
FOMO.
There's an element of FOMO. Microsoft was late with a mobile platform that worked nicely on capacitive touchscreens and, as a result, lost that market entirely. It turns out that wasn't a bad thing: they can sell Office on both iOS and Android, and no one is actually making money from mobile operating systems (they make it on the surrounding ecosystem; the OS is the loss-making part that enables that, and being a player in the ecosystem without paying that cost is often better).
But there's a much bigger part of a need to grow.
It's easy to grow when a product is useful and new. The IBM PC wasn't the first personal computer to be powerful enough to be useful, but it arrived around the time that became true. When I was a small child, in a middle-class area, almost no one I knew owned a computer. My school had a few (fewer than one per 20 pupils). Going from there to Gates' goal of a computer on every desk allowed Microsoft to double sales for many years.
Then the Internet came along. Gates said it was a passing fad and that things like MSN (at the time an online service provider, not a web site) would replace it. It instead drove another decade of growth as every business went from needing a computer to needing a web presence. MS didn't get the lion's share of this, but it still had a load of products (especially acquisitions like Hotmail) that grew along with the expansion.
Then came two things at about the same time. One was a cluster of technologies (capacitive touchscreens, 3G mobile networks, better Li-ion batteries, power-efficient ARM SoCs) that made modern smartphones feasible. Going from a computer on every desk to a computer in every pocket allowed a load more doublings. Microsoft again didn't get the biggest part of this growth, but it rode the wave.
The other thing that happened was that virtualisation on x86 became feasible. Xen showed that computers were fast enough for paravirtualisation to give you multiple useful virtual machines on a single computer, and Intel and AMD responded with hardware extensions that allowed unmodified operating systems to run as guests. This provided a path to consolidation in what became the cloud. Rather than buying a computer, you could rent a fraction of one, which was cheaper because you didn't actually need 100% of a computer 100% of the time. Even if the provider charged a 100% markup, you were probably using only 20% of a computer, so paying for 40% of one was cheaper than buying a whole one - especially if you were actually using 80-100% of a computer but only 20-25% of the time (e.g. during peak business hours).
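The utilisation arithmetic above can be sketched numerically. This is a minimal illustration using the post's own figures (a 100% provider markup and 20-25% average utilisation); the numbers are illustrative assumptions, not real cloud pricing:

```python
# Compare owning a whole machine vs renting only the fraction you use.
# Costs are in "whole-machine" units: owning one machine costs 1.0.

def rent_cost(utilisation, markup=1.0):
    """Relative cost of renting a fraction of a machine.

    With a 100% markup (markup=1.0), renting fraction u costs 2*u.
    """
    return utilisation * (1.0 + markup)

own_cost = 1.0  # you pay for the whole machine whether you use it or not

# Steady 20% utilisation: renting costs 0.4 of a machine vs 1.0 to own.
assert rent_cost(0.20) < own_cost

# Bursty load: 100% busy during peaks that cover 25% of the time gives
# an average utilisation of 0.25, so renting still wins comfortably.
avg_utilisation = 1.00 * 0.25
assert rent_cost(avg_utilisation) < own_cost

# Break-even: with a 100% markup, renting stops paying once you sustain
# more than 50% utilisation around the clock.
assert rent_cost(0.50) == own_cost
```

The break-even point is what made the economics work for bursty small customers and not for large ones running their own hardware at high utilisation.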
Nadella was the lead in the cloud division when it went from being a weird thing to being one of the major revenue sources for the company.
But the cloud has a problem. People's requirements for cloud storage and compute grow organically: you might need 20% more cloud capacity this year than you did last year. At the same time, the cost of compute and storage is dropping. Here's a fun graph of storage costs. From 2013 to 2018 (ignore the numbers after that, they're predictions and are nonsense), the cost of 1TB of SSD storage went from $694 to $107. To remain competitive, cloud prices needed to come down at the same rate. They didn't, so providers are relying more on lock-in, but that doesn't get you new customers.
Most of the growth in the cloud was not new compute demand; it was people moving things from on-premises deployments to the cloud. That's a finite (and nearly exhausted) market. That, combined with the need to lower prices over time to stop companies moving back, is a problem. It's made worse by the fact that the biggest customers see the least benefit. If you're a large company with your own server rooms full of machines, the cost reductions of the cloud are negligible. If you're a small company with one server, moving to a cheaper system with built-in redundancy is a win. But getting each of those small companies to move costs a lot.
The cloud really needs a use case with growing compute requirements. The push for 'big data' was starting to run up against both regulatory issues (GDPR was making data a liability, not an asset) and security problems (you get very bad press when you leak customer data that you had no real reason for holding). AI came along with a promise that customers would keep needing more and faster compute every year. The thing the leadership at these companies missed is that, for this to make business sense, customers also need to be willing to pay an increasing amount each year. And that means you need to deliver increasing productivity improvements each year.
Delivering zero productivity increases while having to put up prices to customers is how we see the bubble start to burst.
-
@HarriettMB @ReggieHere @TCatInReality @ChrisMayLA6
Back in the last millennium, when graffiti was the social media, I recall seeing a graffito that said:
"US out of North America"
-
@ChrisMayLA6 "...Microsoft & Meta on the other hand are sacking people to take a (bigger) punt on a business strategy that is yet to prove its transformation of productivity." Which, if my empirical (yet not fully quantified) experience with GenAI is anything to go by, just will not work. The moral hazard compels employees to refuse, and to pursue constructive and unfair dismissal claims; they cannot be held responsible for their employer's neglect of the obvious information asymmetry at play.
-
@HarriettMB @TCatInReality @ReggieHere
Indeed, exactly why any sort of accommodation with the Tangerine Tyrant is a fool's game...
-
I was at Microsoft when the pandemic hit. Amy Hood told all of the employees that they were not going to rush into hiring (unlike many competitors) because they wanted sustainable growth. Hiring people to deal with a spike in demand and then firing them when the spike subsides would be bad for everyone, she said.
Since then, Microsoft has got rid of about 20% of the workforce, and that counts only the people in the big redundancy rounds. A lot of people I respected left voluntarily, and the company ended the policy that orgs keep headcount when people leave and so can hire replacements: if someone left, you needed to explicitly request new headcount from your management to get a replacement. A lot of the folks who left had their role filled by promoting someone else, who was then not replaced.
The culture of lying to management means that the senior leadership has no idea how under-resourced most of the critical revenue-generating business units are. Anyone who tells them anything other than ‘everything is great, I bet we don’t even need all of the people we have!’ gets a reduced bonus and learns not to do it again.
The company reminded me of a dead oak tree. It looks strong from the outside but a single storm could knock the whole thing down.
@david_chisnall @ChrisMayLA6 somehow I’m not even slightly surprised.
-
@HarriettMB @TCatInReality @ChrisMayLA6 @ReggieHere Our cowardly new (as of last year) banker prime minister cancelled the DST (this is Canada). What have we gained a year on? Austerity. More threats from the Orange shitstain.
-
Why are governments embracing AI?
Because they value corporate profits over the wellbeing of their citizenry (IMO). They fear a massive financial drain if they aren't competitive with the US in encouraging AI replacement of human workers.
Also, they fear even greater dependence on US-based technology.
But what will society look like with mass unemployment and hallucinating machines making most decisions?
@TCatInReality @HarriettMB @ChrisMayLA6 the machines will decide for us. Obviously.
-
You'd not be alone. The US must have serious concerns about Scottish independence on that score.
-
@david_chisnall The gold rush is over.
MS, and quite a few other companies, are creatures of that gold rush, and not at all well-adapted to conditions where computing is infrastructure, rather than the shiny new hotness you haven't got yet. (Where "not well-adapted" tends to mean things like "willing to do nearly anything instead of adapt"; "AI" is a burn-the-world denial of reality.)
-
@david_chisnall It is, once again, a solution looking for the right problem.
LLMs seem to have some uses where they're better than other solutions (translation might be one) but those are too niche to sell them to everyone on the planet.
So they try to sell them as search engines, copywriters, programmers and a dozen other things just to attract more companies even if LLMs are a poor choice for their needs.
-
@graydon @david_chisnall @linuxgnome @ChrisMayLA6 More to the point: the end is in sight for the annual gains Moore's Law accustomed everybody to—you can't build circuits smaller than atomic orbitals—but it has run for over 40 years, so everybody in a decision-making position has grown up expecting it to continue. Not so much in semiconductors, but everyone *else*: VCs, PE firms, software, the general public.
The cluetrain is bound to run off the track and derail in an unploughed field.
-
@cstross TSMC is working on going from 3 nm to 2 nm fabrication.
On the one hand, that's a big change, percentage-wise.
On the other hand, only TSMC is doing this because the entire world economy can afford at most one fab.
On the third hand, it's not clear there's any actual advantage to making the change. There are almost certainly better things to do with that money. But "line must more tinyness!" is built into the whole process.
-
@graydon @david_chisnall @linuxgnome @ChrisMayLA6
The logical end-point, once node sizes bottom out, is for the inherent deflation to become evident: fabs get amortized over time, so the product stops being premium, becomes a cash cow, and prices have to drop.
Nvidia can't survive that. Intel can't survive that. They need something like the AI hyperscalers to keep demand high, but the demand is artificial, and actual consumer demand is soft if not soggy.
Crash is inevitable.