ChatGPT lies then gaslights reporter with fake transcript

And it may be that the capital expenditure required by the data centers will sink the whole thing.

The actual revenue generated by AI this year is reportedly about $20 billion.

But expenditure on the data centers this year may be more like $320 billion. Next year, and in the years after, we expect further expansion, so the annual revenue needed to break even quickly climbs to $1 to $2 trillion.

Clearly, that is revenue growth that is... a little hard to believe will materialize.

And when the debt incurred to build those data centers comes due... crash. Or, really, slightly before it actually comes due, as people figure out how untenable their position is about to be...
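
For a rough sense of scale, here is a quick back-of-the-envelope using only the figures quoted above (all of them reported estimates, not verified numbers):

```python
# Back-of-the-envelope comparison of the figures quoted above.
# All inputs are reported estimates from the post, not verified data.
revenue_2025 = 20e9                          # ~$20B of AI revenue this year (reported)
capex_2025 = 320e9                           # ~$320B of data-center spend this year (reported)
breakeven_low, breakeven_high = 1e12, 2e12   # $1-2T annual revenue needed to break even

print(f"Spend-to-revenue ratio this year: {capex_2025 / revenue_2025:.0f}x")
print(f"Revenue growth needed to break even: "
      f"{breakeven_low / revenue_2025:.0f}x to {breakeven_high / revenue_2025:.0f}x")
```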



Between research breakthroughs in more efficient computing and improved computer hardware that consumes less power, that number will come down drastically over time. That's always been the case.

And we already know that doubling the number of parameters or dataset observations doesn't get anywhere near doubling performance after a certain point. Essentially, cost is almost guaranteed to come down.
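
To put that diminishing-returns point in concrete terms, here is a toy sketch assuming a saturating power-law scaling curve of the general shape reported in the scaling-law literature; the constants are illustrative only, not taken from any specific paper:

```python
# Toy illustration: with a saturating power-law scaling curve,
# each doubling of parameter count buys less and less.
# loss(N) = E + A / N**alpha -- constants are illustrative, not fitted values.
E, A, alpha = 1.7, 400.0, 0.34

def loss(n_params):
    return E + A / n_params ** alpha

for n in [1e8, 1e9, 1e10, 1e11]:
    gain = loss(n) - loss(2 * n)   # absolute loss reduction from doubling N
    print(f"N = {n:.0e}: doubling parameters cuts loss by only {gain:.3f}")
```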
 



With $20 billion of revenue this year, and Bain's estimate that the $2 trillion in annual revenue needed to fund the investment in computing capacity will come up $800 billion short, this means revenue going from $20 billion to roughly $1,200 billion over just five years. While the increase in capabilities tied to increasing compute power might be unsustainable, especially in the US, where they point out problems with the lack of investment in the power grid, it is still a sixtyfold increase in revenue for the sector. The article also points out that LLMs are no longer the main focus of investment right now and that sovereign AI programs are picking it up, with the US, which seems to concentrate the venture-capital-funded companies, accounting for only half of worldwide expenditure.
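
For what it's worth, that sixtyfold jump corresponds to an extremely steep compound growth rate. A quick sketch, taking the $20 billion and roughly $1,200 billion endpoints from the Bain figures as given:

```python
# Implied compound annual growth rate if AI revenue goes from ~$20B
# to ~$1,200B over five years (endpoints taken from the Bain figures as reported).
start, end, years = 20e9, 1200e9, 5

cagr = (end / start) ** (1 / years) - 1
print(f"Overall multiple: {end / start:.0f}x")
print(f"Implied growth rate: {cagr:.0%} per year")   # roughly 127% per year
```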
 


I have no idea what you're saying here. Like, I see a few facts, but no overall picture.
 


Not only is your argument that computing cost will go down over time true, but if Bain's forecast is to be trusted, the $800 billion capital need should be easily met given the tremendous increase in revenue they forecast: growth that fast is sure to attract new investors. The report also notes that half of the investment worldwide doesn't come from sources concerned with financial returns but with geopolitical returns, so a reduction in investment from venture capitalists might not significantly hamper the technology.
 

Training and deployment need to be top of the line, and top of the line keeps getting more and more expensive.

And new and better chips do get more energy efficient, but only when the improvement comes from shrinking things at the transistor level. Other improvements happen at the design level, and those don't reduce energy consumption; some even increase it.

We are nearing the physical limits of silicon. There are potential replacements, but at this point they are basically exotic materials or processes that won't be cheap, or even reliable, for a while.
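
For context on why transistor-level shrinks are the ones that matter for energy: to a first order, the switching energy of a CMOS circuit per operation scales with capacitance times voltage squared, and smaller transistors lower both. A minimal sketch of that first-order model, with made-up numbers chosen only to show the scaling:

```python
# First-order CMOS switching-energy model: energy per operation ~ C * V**2.
# The capacitance and voltage values are made up; only the scaling matters.
def energy_per_op(capacitance_f, voltage_v):
    return capacitance_f * voltage_v ** 2

old_node = energy_per_op(1.0e-15, 1.00)   # hypothetical older process node
new_node = energy_per_op(0.7e-15, 0.85)   # shrink: lower capacitance and voltage
print(f"Per-op energy after the shrink: {new_node / old_node:.0%} of the old node")

# A design-level change that simply adds more parallel units raises throughput,
# but per-operation energy stays the same, so total power can go up.
```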
 


This seems like as good a point as any to go back to a critical disconnect in most of these AI threads.

When people talk about AI on ENWorld, the default assumption is that they're discussing generative AI (I've even been chastised for not making this assumption). When Bain and major corporations talk about AI, a huge part of the business model is AI being used for analysis, data management, QA, logistics, engineering, organization, military operations, surveillance... so many things other than ChatGPT and non-human art. The debates in this thread barely scratch the surface of what AI is actually being used for.
 


1. There’s a limit to the amount of data we have. We will hit that boundary.

2. A model's accuracy cannot be more than 100%. As you approach 100% accuracy for a given task, there isn't much left to gain from further performance increases. Consider a hypothetical 99.9% correct vs 99.999% correct (rough numbers below).
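
To put that hypothetical 99.9% vs 99.999% comparison in rough numbers (both accuracy levels are the hypothetical ones above, not real benchmarks):

```python
# Rough comparison of the two hypothetical accuracy levels mentioned above.
# The headline accuracy moves by less than 0.1 percentage points even though
# the error rate drops 100x, which is the diminishing-returns point.
for accuracy in (0.999, 0.99999):
    errors_per_million = (1 - accuracy) * 1_000_000
    print(f"{accuracy:.3%} accurate -> {errors_per_million:,.0f} errors per million tasks")
```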


Transistor-level shrinks aren't the only way chips get more efficient, but they are the most common one. And while it's true that we are nearing the physical limits of silicon as far as current knowledge goes, I'm also skeptical that the cutting-edge research inside chip manufacturers is public. Better architecture design for a given task can also deliver a performance increase or lower power draw.
 

Plus, the big boys will push it to collapse because they aren't totally leveraged in AI; Bezos, Gates, etc. will still have their physical holdings to weather the bubble bursting that kills the competition.
 

