Judge decides case based on AI-hallucinated case law

If that were true, the issue cited in the OP, and in other similar cases, would never have arisen.
A combination of a bad lawyer (probably using an LLM) and a bad judge (who couldn't care less and could have been fooled by a human-made invented [or misunderstood] case) let that happen. I nonetheless propose we don't ban lawyers and judges yet (though it would make justice much quicker than it is now). The example in the OP is like an article on a plane crash. It happens, and we rightfully get reports on such occurrences, but it doesn't mean that it's representative of usual plane travel. Same with this kind of goof: I expect US judges to actually check the precedents the lawyer is quoting. Errors predate LLMs, and a human lawyer could be honestly mistaken about case law.

I also think the article would get fewer views if it were titled differently. Because, if we read the article linked by the OP, the story is:

  • Husband filed for divorce and apparently couldn't (or wasn't really trying to) contact Wife,
  • Wife filed to reopen the divorce case, citing precedent that Husband should have tried to contact her, and noting that Husband's filing contained bogus cases,
  • Husband didn't address this claim, and Husband's attorney still relied on two invented cases and two irrelevant ones,
  • Husband's attorney provided 11 new cases, either also irrelevant or invented,
  • There is a suspicion that the court order was actually written by Husband's lawyer.

There is absolutely no evidence whatsoever that the bogus cases were invented by AI and not by a human. I'd say it's easier to ask an AI to do it, and that's probably what happened, but the exact same thing could happen if Husband's attorney, who kept providing bogus cases even after being told the previous ones were bogus, was just inventing them himself.

The crux of the problem isn't the AI, any more than it is the typewriter Husband's attorney used to file the claim. The problem lies with Husband's attorney, and with the judge who didn't check the cases -- and who didn't use AI at all.



What you mean is professionally, you should find the reference anyway.

Yes, and I wouldn't call someone who doesn't do this a professional. Also, I am pretty sure the AI models offered by legal publishers like LexisNexis will contain specific countermeasures to detect hallucinations before they get sent to the user.

Not really, because a plane crash is unlikely to pass without anyone noticing.

Honestly? The job of the attorney is also to explain the ruling to his client. When Wife got the divorce filing, her attorney immediately checked the cases and found them bogus or irrelevant.

For this to pass under the radar, you need:
  • a lying and careless attorney on the plaintiff's side (who may or may not have used AI to invent the bogus cases),
  • a careless (or corrupt) judge who didn't check the cases and just signed a court order pre-written by Husband's attorney,
  • a careless attorney on the defendant's side.
That's a lot of grossly incompetent actors at the same time. It can happen, but I am not sure it happens more often than an airplane mechanic being careless with maintaining a plane... And if you have this conjunction of inept people, they don't need AI at all to do a poor job. In the OP's article, there is no proof AI was involved; it's just a supposition, because AI is prone to hallucinate, which would explain the bogus cases. But it's not the AI that made the lawyer draft the order for the judge, which may or may not be an acceptable practice in this case, but from my legal background sounds extremely problematic (the job of the court is to draft the order, and each word is carefully weighed, but maybe it's accepted in Georgia).

That's a lot of grossly incompetent actors at the same time
Every profession has plenty of careless and/or downright dishonest actors. The trouble with diligent and honest people is they assume other people are like them.
I am not sure it happens more than an airplane mechanics being careless with maintaining a plane.
That also happens a lot (I have a friend who is a pilot). It's just that so long as the plane doesn't crash no one notices.
 


Every profession has plenty of careless and/or downright dishonest actors. The trouble with diligent and honest people is they assume other people are like them.
Especially when the consequences of the act are the potential loss of the ability to continue in the profession.
That also happens a lot (I have a friend who is a pilot). It's just that so long as the plane doesn't crash no one notices.
Indeed. There's a reason why pilots walk the aircraft and do rigorous preflight checks. When they don't, things often go wrong. I once witnessed a commercial aircraft take off with the cargo door still open.
 

bummer sucks to be them if for whatever reason they can't do something right?
As someone who is losing the ability to paint miniatures due to a medical problem, and who initially enjoyed “making” AI images and playing on AI Dungeon, yes.

"I have stage 4 cancer and aiart [sic] actually gave me reason to keep fighting it," wrote a user named bodden3113 on Reddit in December 2022. "I no longer have to wait till I'm not doing chemo to figure out how to draw hair or decide if i should even learn if i won't live long enough to see on paper what i had in mind... Why do we have to do things the same way it was always done?"
So we should keep diving into an inherently unethical technology because it makes one guy happy?
One artist, Claire Silver, has been using AI art collaboratively since 2018 and gained renown from it, becoming the first AI-augmented artist to sign with the WME talent agency. "I have a chronic, disabling illness, an experience that has galvanized my love for augmenting skill in favor of expression," she told Ars Technica in an interview. "I grew up in poverty and have changed my family’s life with my AI art."

Well, Claire, sucks to be you. And you too, user bodden.

Sucks to be these students

I mean, I guess so?

Someone made a bunch of money off novelty art partially using a tech that steals from artists….okay?

As for students, at most you’ve found a limited positive use case that doesn’t actually require broad acceptance of LLMs, and
You're assuming that only really bad lawyers
Nope. I’m stating factually that generative AI is never going to be a good replacement for a lawyer. Because the only context in which AI makes the legal system accessible to more people is by “replacing” lawyers.
 



With the rise of AI, for the first time, the barrier of skill is swept away.
In this evolving era, taste is the new skill.

 
