gnarlygninja
Adventurer
ChatGPT "helped" her by reminding her...that she had $10k in a bank account.
Here's a case in which lawyers submitted hallucinated case law and it was handled properly, though I think the sanctions are on the light side for it.
Quoting the article:
Damien Charlotin tracks court cases from across the world where generative AI produced hallucinated content and where a court or tribunal specifically levied warnings or other punishments. There are 206 cases identified as of Thursday — and that's only since the spring, he told NPR. There were very few cases before April, he said, but for months since there have been cases "popping up every day."
Charlotin's database doesn't cover every single case where there is a hallucination. But he said, "I suspect there are many, many, many more, but just a lot of courts and parties prefer not to address it because it's very embarrassing for everyone involved."
Quoting the article:
Charlotin's database doesn't cover every single case where there is a hallucination. But he said, "I suspect there are many, many, many more, but just a lot of courts and parties prefer not to address it because it's very embarrassing for everyone involved."
A couple hundred cases “since the spring”, and an educated guess that most don’t get reported.
Wrist slaps for AIBS need to turn into suspensions, IMHO, ASAP.
Quoting the article:
A couple hundred cases “since the spring”, and an educated guess that most don’t get reported.
Wrist slaps for AIBS need to turn into suspensions, IMHO, ASAP.
I agree. It amounts to legal malpractice and should be subject to a review by the appropriate Bar Association, at the least, in addition to court sanctions.
Woman says ChatGPT helped her pay off over $11K in debt
Jennifer Allan says AI helped her focus on paying off her debt. www.yahoo.com
Allan also said the journey helped her rediscover other money sources she hadn't thought about in a while.
"My husband was actually like, 'Oh, didn't we have a brokerage account?'" Allan recalled.
"There's $10,200 sitting in this account that is available. Like I could literally cry right now," she said in a TikTok video.
Although Allan's method worked for her, some financial experts, like Noelle Carter, president and CEO of Parachute Credit Counseling, warn that ChatGPT and AI should be treated as a tool, not a solution.
"AI can be a powerful assistant to come up with ideas, but, you know, certainly not a substitute for human expertise or critical thinking," Carter said.
Other experts also encouraged people to only spend within their means so as to avoid debt completely.
...spend within their means to avoid debt completely.
I don’t think so.
This bit really concerns me:
Are we looking at the possibility that AI slop threatens to drown even the legal system just because it's too "embarrassing" to deal with? I'm envisioning bad actors exploiting an already creaky system in very detrimental ways.
I've been reflecting on this for a few days and want to add something. Condescension implies a patronizing superiority. I’m not claiming to be superior, nor that they are inferior.
I’m saying that the average layperson doesn’t have the training to evaluate medical treatment claims with accuracy, and part of that lack is not having the necessary vocabulary. For example, the concept of comorbidity isn’t that difficult; it’s just the existence, in a particular patient, of multiple afflictions capable of harming or killing them.
Which could be particularly bad in the hands of a vexatious litigant, or someone whose whole purpose is to fatigue the court. I've mentioned before that I was a witness in a case where the accused went through something like 8 lawyers over the course of a few years.
I don’t think so.
I suspect that what’s happening in those cases is that the judge, having caught the attorneys using bogus AI cases, gives a verbal reprimand (and possibly a contempt citation) and allows them to rectify their pleadings without referring them to the bar association for further action.
It still slows proceedings down, though.
(Emphasis mine.)
When you approach this kind of community and say "you don't have the training to evaluate medical claims with accuracy", it doesn't matter if you're right, and it doesn't matter that you have the best intentions and really want to help them. It is going to come across, to many people, as if you are saying "I think you're stupid because you didn't go to college". If you repeat this over and over while making decisions that affect their lives, it's going to breed distrust and resentment and conspiratorial thinking.