ChatGPT lies then gaslights reporter with fake transcript

Between research breakthroughs in more efficient computing and improved computer hardware that consumes less power, that number will come down drastically over time. That's always been the case.

Maybe you didn't read the articles - the costs we are talking about are mostly hardware.
So, your solution to how expensive hardware is is to... replace all the hardware yet again? When the new hardware will undoubtedly be more expensive to purchase than the old hardware?

Plus, when they are spending hundreds of billions... soon approaching trillions... more than they are bringing in, it isn't clear that they can hold on until the next generations of hardware make operations affordable.
 



Maybe you didn't read the articles - the costs we are talking about are mostly hardware.

Yes. Actually, the Bain article isn't about AI-developing companies but about datacenter investment.

The article (at least the Bain & Co report) isn't about AI technology spending but about the conundrum faced by datacenter investors. It doesn't identify a shortage of investment as a risk to AI: while it does mention an $800 billion shortfall between the $1.2 trillion in increased revenue expected by 2030 and the $2 trillion that would be needed, the outcome of a lack of investment wouldn't be, according to the report, a failure of AI-developing companies, but a failure of the supply of compute power to meet the demand coming from AI-using companies (which the report assumes will be there to create demand in 2030). The risks identified in the report are:
  • insufficient investment to meet the demand,
  • increase in algorithm efficiency (so the compute demand can be met with fewer datacenters, potentially leaving datacenter owners with infrastructure supplying unneeded compute to the market),
  • unpredictable technological breakthroughs changing the landscape,
  • supply-chain shortages (you can't build datacenters in the US if there are tariffs on electronics, or if the electrical grid collapses).
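
For a sense of scale, here is the back-of-the-envelope version of that shortfall, using only the round figures quoted above:

```python
# Rough arithmetic on the Bain figures quoted above (all in trillions of USD).
investment_needed_by_2030 = 2.0   # compute investment the report says is needed
projected_new_revenue     = 1.2   # increased revenue the report projects by 2030

shortfall = investment_needed_by_2030 - projected_new_revenue
print(f"Funding gap: ${shortfall * 1000:.0f} billion")  # -> Funding gap: $800 billion
```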

Basically, I think Frogreaver read the article and expressed optimism toward points (2) and (3) of the Bain report, more so than Bain (who expects the next breakthrough in AI-optimized chips to occur later than the 2025-2030 timeframe they analyze). If he hadn't read the article, he would have to be incredibly lucky to mention two core points of the report's analysis by chance!

So, your solution to how expensive hardware is is to... replace all the hardware yet again? When the new hardware will undoubtedly be more expensive to purchase than the old hardware?

That's exactly what the report explains is happening. The investment shortfall exists because compute hardware gets outdated fast; that's why you need large returns on capital expenditure to keep the profitability of Amazon, Microsoft, Oracle, Google... at their current level.
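
To make the "outdated fast" point concrete, here is a minimal sketch; the fleet cost, refresh cycle, and cost of capital below are made-up illustrative numbers, not figures from the report:

```python
# Illustrative only: how a short hardware refresh cycle inflates the revenue a
# datacenter fleet must earn back each year. All numbers are hypothetical.
capex = 100e9            # $100B spent on accelerators (assumed)
useful_life_years = 4    # assumed refresh cycle before the chips are outdated
cost_of_capital = 0.10   # assumed 10% return demanded by investors

# Straight-line depreciation plus a return on the capital tied up.
annual_depreciation = capex / useful_life_years
annual_return = capex * cost_of_capital
required_annual_revenue = annual_depreciation + annual_return

print(f"Needed just to stand still: ${required_annual_revenue / 1e9:.0f}B per year")
# -> $35B per year on $100B of hardware, before power, staff, and buildings
```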


Plus, when they are spending hundreds of billions... soon approaching trillions... more than they are bringing in, it isn't clear that they can hold on until the next generations of hardware make operations affordable.

The report warns about the "risk" of not having enough compute for AI, not about AI companies collapsing and leaving excess unused datacenters lying around. Where it barely alludes to that (page 34 of the report), it states: "However, without such innovations and breakthroughs, general progress could slow, and the field could be left with only those players in the market with adequate public funding". The part of the report explaining the market fragmentation linked to sovereign AI demand expands on that: €200 bn from the EU, and Chinese initiatives to build public computing infrastructure (and lend it to Chinese AI players without concern for profitability), may be the deciding element as private investment lags where there isn't enough public support.
 

As a side note regarding the "market fragmentation": China, which had been hit by export restrictions on chips for AI computing, has retaliated by banning some Nvidia products and increasing its crackdown on smuggled US chips:


China also aims to triple its production of advanced semiconductors next year, in a move designed to fill the demand left by Nvidia, the FT reported last month.

It seems they determined they could compete with US firms on chipmaking. It might provide a cheaper solution for them than relying on smuggled chips or renting computing power in third-party countries.
 

1. There’s a limit to the amount of data we have. We will hit that boundary.
This isn't a good thing because...
2. A model's accuracy cannot be more than 100%. As you approach 100% accuracy for a given task, there isn't much to gain from increased performance. Consider a hypothetical 99.9% correct vs 99.999% correct.
Right now we aren't close to 99% across the board. Some might be close to 80%, but many, if not most, models are still around 50% accuracy when it comes to data. This is a problem because now it takes more processing power to get less improvement, in part because unused data is running out and in part because synthetic data is now plentiful.
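
To put a number on "more processing power for less improvement", here is a toy sketch that assumes error falls as a power law of compute; the exponent and the 50% baseline are arbitrary choices for illustration, not measured scaling figures:

```python
# Toy illustration of diminishing returns: assume error ~ compute ** -alpha,
# starting from 50% accuracy at 1 unit of compute. Exponent chosen arbitrarily.
def compute_needed(target_error, alpha=0.5, base_error=0.5, base_compute=1.0):
    # Invert error = base_error * (compute / base_compute) ** -alpha
    return base_compute * (base_error / target_error) ** (1 / alpha)

for accuracy in (0.80, 0.99, 0.999):
    factor = compute_needed(1 - accuracy)
    print(f"{accuracy:.1%} accuracy -> ~{factor:,.0f}x the baseline compute")
# Each extra "nine" of accuracy costs vastly more compute than the last one.
```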
 

That's exactly what the report explains is happening. The investment shortfall exists because compute hardware gets outdated fast; that's why you need large returns on capital expenditure to keep the profitability of Amazon, Microsoft, Oracle, Google... at their current level.

Yes. That's the point. Constantly refreshing the hardware is driving capital investment through the roof, and therefore the required return is also through the roof. Refreshing again doesn't save you from that loop - competition will still drive tech companies to outperform each other, so there will be more and more refreshes, until we run out of rare earth metals...

I will note for all concerned - in the history of technology, increased efficiency does not generally drive reduction in expenditure. It drives increased use. When corporations found that fluorescent lights were cheaper than incandescent lights, they didn't reduce their electricity use - they increased how much space was lit, and how long they left the lights on! When we increase the efficiency of algorithms or decrease the cost per flop of hardware... they'll just do EVEN MORE COMPUTING.

While the idea is that efficiency upgrades will somehow save the situation, history suggests otherwise. If nothing else - the current cost is over an order of magnitude higher than the current revenue. You aren't getting a 10x decrease in cost in short order, even considering Moore's Law. And nobody understands who is going to spend a trillion dollars on AI output.
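
For the Moore's Law point, the arithmetic is short; the two-year halving period below is the classic optimistic cadence, used purely as an assumption:

```python
import math

# How long a 10x cost-per-flop reduction takes if costs halve every N years.
halving_period_years = 2.0   # optimistic, classic Moore's Law cadence (assumed)
target_reduction = 10.0      # the gap between current cost and current revenue

years_needed = math.log2(target_reduction) * halving_period_years
print(f"~{years_needed:.1f} years for a {target_reduction:.0f}x cost reduction")
# -> ~6.6 years, even under that optimistic assumption
```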

The report warns about the "risk" of not having enough compute for AI, not AI companies collapsing leaving excess unused datacenters lying around.

The actual risk is the economic upheaval should the AI business collapse for any reason. Collapsing because they can't get enough machoflops to be an actual product is one possible reason for collapse. Not being able to get enough people to pay for the result is another possible reason. Either one leaves us with massive debt incurred that cannot be repaid, and the economic impacts therefrom.

Data centers lying around unused is kind of the least of our problems at that point.
 

I will note for all concerned - in the history of technology, increased efficiency does not generally drive reduction in expenditure. It drives increased use. When corporations found that fluorescent lights were cheaper than incandescent lights, they didn't reduce their electricity use - they increased how much space was lit, and how long they left the lights on! When we increase the efficiency of algorithms or decrease the cost per flop of hardware... they'll just do EVEN MORE COMPUTING.

True. The demand for lighting that couldn't be met earlier, because it was too costly, was met thanks to technological progress. There is no reason to think that more efficient computing will lead to less computing being used overall. The point that more efficient computing can reach the same goal with less investment is unrelated to total use: if we get cars that run 100 km on one liter of gasoline, we'll probably see people use their cars more often rather than just save on gasoline. There are some goods for which there is a plateau (you don't eat twice as much when food becomes more readily available after some point...), but computing probably hasn't reached that point.
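
The rebound effect can be sketched with a one-line constant-elasticity demand model; the elasticity value is an assumption for illustration, not an estimate for compute demand:

```python
# Toy rebound-effect (Jevons) model: constant-elasticity demand for compute.
# With elasticity > 1, cheaper compute means MORE total spending, not less.
def total_spend(price, elasticity=1.5, scale=1.0):
    quantity = scale * price ** (-elasticity)   # demand grows as price falls
    return price * quantity

before = total_spend(price=1.0)
after = total_spend(price=0.5)   # unit cost of compute halves
print(f"Total spending when cost halves: x{after / before:.2f}")
# -> x1.41 with the assumed elasticity of 1.5
```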

While the idea is that efficiency upgrades will somehow save the situation, history suggests otherwise. If nothing else - the current cost is over an order of magnitude higher than the current revenue. You aren't getting a 10x decrease in cost in short order, even considering Moore's Law. And nobody understands who is going to spend a trillion dollars on AI output.

Well, Bain does, apparently, since they estimate that revenue could be up to $1.2 trillion out of the $2 trillion investment they forecast to be needed.

The actual risk is the economic upheaval should the AI business collapse for any reason. Collapsing because they can't get enough machoflops to be an actual product is one possible reason for collapse. Not being able to get enough people to pay for the result is another possible reason. Either one leaves us with massive debt incurred that cannot be repaid, and the economic impacts therefrom.

I wouldn't use the term upheaval. While they can have localized effects, stock bubble busts do not greatly impact global growth.

[attached chart]


The Internet bubble only caused a temporary slowdown in global growth, while the 2009 crisis triggered by the subprime collapse reduced growth by about 1% before a rapid recovery. The monetary scale of those events is also very different from today’s AI investments, especially outside infrastructure, which, as you correctly point out, are the main source of capital need.

Major infrastructure providers like Amazon, Google, and Microsoft are highly profitable and can afford to run their data centers below capacity if AI startups fail. They aren’t overleveraged or indebted beyond their means. At worst, they may end up with underused infrastructure and see lower profits compared to recent years, but their debt levels are not a concern given their tremendous profitability. Even a trillion dollars in losses would represent only about three years of combined net profit for the GAFAM group.
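
As a rough sanity check of the "about three years of combined net profit" figure (the annual net income numbers below are rounded approximations, so treat them as assumptions rather than exact data):

```python
# Order-of-magnitude check: how many years of combined GAFAM net profit would
# it take to absorb $1 trillion of losses? Figures are rough approximations
# of recent annual net income, in $ billions (assumptions, not exact data).
approx_annual_net_income = {
    "Apple": 95,
    "Microsoft": 90,
    "Alphabet": 85,
    "Meta": 45,
    "Amazon": 35,
}

combined = sum(approx_annual_net_income.values())   # ~350
years_to_absorb = 1000 / combined                   # $1 trillion of losses
print(f"Combined: ~${combined}B/year -> ~{years_to_absorb:.1f} years to absorb $1T")
```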

As for AI companies themselves, they carry almost no debt. Their growth is financed mainly through venture capital rather than borrowing. Firms like OpenAI, Mistral, and Anthropic, for instance, have minimal debt and rely on burning investor capital instead of taking on credit.

The most significant dent in growth came in 2020, with a lot of developed countries shutting down their economies out of public health concerns, not economic ones.

Even at the epicenter of the phenomenon, results were quite tame, as tends to be the case when a stock valuation bubble isn't connected to a debt bubble:

[attached chart]


The Internet bubble crash didn't severely impede even the US economy:

[attached chart: annual US growth rate, measured quarterly, 2000-2003]


It could have had localized effects (in areas with strong employment in the sector and ineffective social safety nets), or effects at the individual level of course, but that's far from an economic upheaval. It wasn't even technically a recession.
 

