Judge decides case based on AI-hallucinated case law

I was reading a paywalled article on CNN (via someone else's machine*) exposing ELSA, the FDA's own AI, as being just as prone to hallucinations and misrepresentation as general-purpose LLMs. Some of the employees interviewed basically said that ELSA was not saving them any time, because they had to be extra diligent in evaluating its results.

*🤷🏾‍♂️

This video is specifically about fields like higher physics and coding, but it raises some very good points about why LLMs are bad at more complex tasks. The only point I disagree with is the notion that venture capitalists have "feelings."
