Judge decides case based on AI-hallucinated case law


So, since this thread bubbled up to the top again, I thought I might note something I ran across which is germane to this discussion. To wit: The language we use matters.

Back in the 1950s, when computer scientists and engineers started thinking about, and working on, the possibility of building computers that mimic human thought, they landed on calling it "artificial intelligence": they were scientists, it described what they were trying to make, and it sounded cool and futuristic, which is great if you are trying to get money to support research.

And "artificial intelligence" stuck. And Isaac Asimov wrote robot novels, and it was good. And everyone stays focused on how it can be as good as or better than a human at cognitive tasks. It became all about how AI could replace humans, for good or ill.

But, if we ask normal folks what they want to DO with AI, it is "I want it to help me do X."

So, how would it have been if we hadn't called it "artificial intelligence", and had instead called it "assistive technologies"? Same basic tech underneath.

Assistive technology is still certainly valuable. Our corporate masters would still be interested in assisting their workers with tasks, making them more efficient, giving them tools. "Assistive tech" doesn't suggest general intelligence, though. It suggests helping people with focused tasks, which is what the tech is actually better at anyway.

Most importantly, "assistive tech" is worthwhile, but not worth going crazy over. Like, your corporate master isn't going to have FOMO over not having assistive tech NOW. So, no economic bubble, no threats to energy and water resources getting chewed up by data centers. Gradual development of functionality, which might still end in "general intelligence", but adopted at something like a reasonable pace...
 

Most importantly, "assistive tech" is worthwhile, but not worth going crazy over. Like, your corporate master isn't going to have FOMO over not having assistive tech NOW. So, no economic bubble, no threats to energy and water resources getting chewed up by data centers. Gradual development of functionality, which might still end in "general intelligence", but adopted at something like a reasonable pace...
It would, of course, never happen for the same reason that news headlines are sensationalized, and multiple products claim to be "the number 1" on the market.
 

But, if we ask normal folks what they want to DO with AI, it is "I want it to help me do X."

I think, while true, this answer could fit any tool. What I want from a hammer is for it to be an assistive technology for nailing things. But let's roll with it, since "artificial intelligence" is certainly too broad to be useful as well, as demonstrated by the throngs of people who say "artificial intelligence" when they mean generative AI, or more specifically web-based LLMs, or even more specifically ChatGPT. That can certainly cause misconceptions -- language does matter: if an army is investing in equipping drones with AI, it is (most probably) not trying to make them able to have a chat over whether pineapple pizza is authentically Italian or not.

Most importantly, "assistive tech" is worthwhile, but not worth going crazy over. Like, your corporate master isn't going to have FOMO over not having assistive tech NOW. So, no economic bubble, no threats to energy and water resources getting chewed up by data centers. Gradual development of functionality, which might still end in "general intelligence", but adopted at something like a reasonable pace...

I think you're overoptimistic. Even if a more accurate name were found, it would only be used among specialists, while marketing types would find another buzzword to drive adoption, and we'd end up in the same place. As an illustration, I don't think "the Internet" was a particularly catchy word, and yet we had an Internet bubble, because it was sold to investors as "the new economy".
 

I think, while true, this answer could fit any tool. What I want from a hammer is for it to be an assistive technology for nailing things.

Yes.
And note how folks don't mistake a hammer for an all-purpose tool? That's the point.

I think you're overoptimistic. Even if a more accurate name were found, it would only be used among specialists, while marketing types would find another buzzword to drive adoption, and we'd end up in the same place.

Yes. And that's the problem.
 


I think a better description of most LLMs would be “small talk simulator”. They’re just trying to predict what the next appropriate sentence should be.
 

And yet, I think we're currently at the "if all you have is a hammer..." stage of AI.

I mean, folks trying to sell you AI will tell you it does everything, if that's what you mean.

But, really, you don't only have a hammer. You have a human brain.
Right?
RIGHT??!??!
 

