Judge decides case based on AI-hallucinated case law



Well, actually....

"“What is different about generative AI is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload,” says Noman Bashir, lead author of the impact paper, who is a Computing and Climate Impact Fellow at MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL)."
...
"“The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” says Bashir."
...
"Plus, generative AI models have an especially short shelf-life, driven by rising demand for new AI applications. Companies release new models every few weeks, so the energy used to train prior versions goes to waste, Bashir adds. New models often consume more energy for training, since they usually have more parameters than their predecessors."


And, in addition to the energy use in training...
"Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search."
It’s worth pointing out that this article is a year and two months old, so it was written on data from 18-plus months ago. His estimates are almost certainly out by a factor of 2 by now, as the A100-class chips then in use have been replaced by more efficient successors running at about half the power.

Furthermore, the article was written before models that distill previously trained models into cheaper, more efficient versions hit the scene (think DeepSeek).

Yeah, they still use a heap of energy, but currently it’s not unreasonable to assume they will halve in energy use every 18 months.
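
A rough back-of-the-envelope sketch of that trend, purely illustrative: the ~0.3 Wh per web search figure, the article’s "about five times" multiplier, and a clean 18-month halving are all assumptions, not measurements.

```python
# Back-of-envelope projection of per-query energy if efficiency halves
# every 18 months. All figures are illustrative assumptions, not measurements.

WH_PER_WEB_SEARCH = 0.3                        # assumed energy of a simple web search (Wh)
BASELINE_WH_PER_QUERY = 5 * WH_PER_WEB_SEARCH  # the article's "about five times" estimate
HALVING_PERIOD_MONTHS = 18                     # assumed efficiency halving period

for months in (0, 18, 36, 54):
    wh = BASELINE_WH_PER_QUERY * 0.5 ** (months / HALVING_PERIOD_MONTHS)
    print(f"{months:2d} months after the article: ~{wh:.2f} Wh per query")
```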

You do have to set that against the rise of multi-query chain-of-thought models, which are more expensive both computationally and in energy terms. But I do think that Bashir’s analysis is a bit naive in not accounting for the pace of development of more efficient systems.

Plus it’s a little weird to say that this will require more fossil fuel investment, since solar is by far the fastest-growing segment and is now cheaper than fossil. No one is building new fossil plants; they’re building solar or nuclear. I’m pretty sure this was known when he wrote his paper, so I’m not sure why he ignored it.

Anyway, TLDR: these old estimates don’t reflect today’s costs, but newer, better models require more compute and likely balance out the trend. Fossil is dead; anyone needing power is going solar or nuclear.
 

It’s worth pointing out that this article is a year and two months old, so it was written on data from 18-plus months ago. His estimates are almost certainly out by a factor of 2 by now, as the A100-class chips then in use have been replaced by more efficient successors running at about half the power.

And that would mean something if they were continuing to run 18-month-old AIs, but they aren't. His estimates will also be 18 months or more behind on the computational needs.

Increases in efficiency almost never result in a reduction of power use; they typically lead to an increase in use, because use is now cheaper (the Jevons paradox). LED lightbulbs (and fluorescents before them) are more efficient than incandescents; their rollout meant that people (including businesses) lit more places more brightly, and left the lights on longer because they felt they could afford it.

So, if it became cheaper per computing operation, or needed fewer operations, you can be quite sure that the response was, "Well, then use even more operations!" The basic limit will be a monetary one: if they have a budget of $100 million, they will spend all of it, and buy the maximum number of machoflops they can get for it.
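
A toy sketch of that budget-limited rebound, with made-up numbers; the only point is that a fixed budget turns every halving of cost per operation into twice as many operations bought, not into savings.

```python
# Toy model of the rebound effect: a fixed compute budget plus a falling
# cost per unit of compute buys more compute each generation, so total
# spend (and, roughly, total power draw) does not shrink. Numbers invented.

budget_dollars = 100_000_000      # fixed annual compute budget
cost_per_unit = 1_000_000.0       # assumed starting cost per unit of compute

for generation in range(4):
    units_bought = budget_dollars / cost_per_unit
    print(f"gen {generation}: ${cost_per_unit:>11,.0f}/unit "
          f"-> {units_bought:>6,.0f} units, ${budget_dollars:,} spent")
    cost_per_unit /= 2            # each generation halves the unit cost
```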

Plus it’s a little weird to say that this will require more fossil fuel investment, since solar is by far the fastest-growing segment...

It is unlikely to stay that way in the US, due to policy change.
 

It’s worth pointing out that this article is a year and two months old, so it was written on data from 18-plus months ago. His estimates are almost certainly out by a factor of 2 by now, as the A100-class chips then in use have been replaced by more efficient successors running at about half the power.

Furthermore, the article was written before models that distill previously trained models into cheaper, more efficient versions hit the scene (think DeepSeek).

Yeah, they still use a heap of energy, but currently it’s not unreasonable to assume they will halve in energy use every 18 months.

You do have to set that against the rise of multi-query chain-of-thought models, which are more expensive both computationally and in energy terms. But I do think that Bashir’s analysis is a bit naive in not accounting for the pace of development of more efficient systems.

Plus it’s a little weird to say that this will require more fossil fuel investment, since solar is by far the fastest-growing segment and is now cheaper than fossil. No one is building new fossil plants; they’re building solar or nuclear. I’m pretty sure this was known when he wrote his paper, so I’m not sure why he ignored it.

Anyway, TLDR: these old estimates don’t reflect today’s costs, but newer, better models require more compute and likely balance out the trend. Fossil is dead; anyone needing power is going solar or nuclear.

Don't forget the tendency of code to become less efficient, not more, as more computing power becomes available. Then there's the constant chasing of improved computation speed, making that code rely on more and more processors to perform the same function. As the resources become available, they get used immediately; sometimes the demand outstrips the supply. "We have it, so let's use it!"
 

It matters more how people perceive it: if they think it gives reasonable, good advice most of the time, but it doesn't, that's a problem.
And knives would almost certainly be forbidden if people commonly got killed or hurt by accidentally misusing them, ending up cutting other people rather than their food. But that doesn't happen with most available knives: most people can avoid serious accidents most of the time, and those who do harm with them can get into serious trouble.

Most people are trained with knives (if only by their mum telling them not to play with them), the same way they are trained with many other tools we take for granted (few people answer the people speaking on TV). AI will probably need some adjustment, hopefully less than the printing press did. Initial problems (like the deadly wars of religion in Europe, amplified if not made possible by the printing press) are bound to happen as the tool is adopted more and more.
 
