Search results

  1. Judge decides case based on AI-hallucinated case law

    The study (from the field of neurosciences) tended to ascribe causation, through a slight human bias linked to tiredness, even when taking other factors into account. The limitations of the study were indeed much discussed (notably, the timing of hearing cases was correlated with the presence or...
  2. Judge decides case based on AI-hallucinated case law

    EDIT: removed since obviously most people were confused by the wording. If I were convinced that everyone was honest, I wouldn't call for sanctions (which imply checking for problems; in order to punish those responsible, you need to detect them). Here, I say that most people are honest (if...
  3. Judge decides case based on AI-hallucinated case law

    It is very possible that most court rulings are decided by a magic 8-ball, because all judges are lazy and just roll a die to determine whether the person is guilty (applying appropriate modifiers to the roll, like "do I like the suspect's physical appearance?"). They probably got their diploma after a...
  4. Judge decides case based on AI-hallucinated case law

    No. A doctor or lawyer using a general-purpose LLM or a magic 8-ball should be punished the same way. It is not an appropriate tool for professional use, yet I don't think we should ban magic 8-balls. I am OK with a doctor or judge making decisions using professional AIs that are dedicated to...
  5. Judge decides case based on AI-hallucinated case law

    I think that's where the rift between our positions lies. You consider them to be unsuitable for use. I consider them to be suitable for use, as magic 8-balls are. They don't claim to be something one can rely on for any decision, but they can help sometimes. I'd even say that I get more use of...
  6. Judge decides case based on AI-hallucinated case law

    What I noticed during Covid, and it may not have happened everywhere, is that epidemic experts were directly put in the limelight. The problem being that epidemic experts aren't necessarily communication experts. So some amount of imperfect communication was unavoidable. And, at some point...
  7. Judge decides case based on AI-hallucinated case law

    That's something I include among "restricting marketing toward health professionals". Sending them freebies so they promote the use of products, if it is a well-known practice in some places, is fishy enough to be worth 3 years in jail for the professional (and a risk to their ability to keep doing...
  8. Judge decides case based on AI-hallucinated case law

    The TV/video media bear some responsibility in this, IMHO. Their formatting of information tends to cut an expert saying "According to the few data points we have, and extrapolating from earlier epidemic outbreaks, we think it is probably the best course of action to do X" to "the best course of...
  9. Judge decides case based on AI-hallucinated case law

    OK, they were liable despite never having made any claim that their drug was less likely to cause addiction than other opioids, and Canadian doctors massively prescribed it for random reasons totally unrelated to any action they took to promote their drug, including spending 1.9 millions...
  10. Judge decides case based on AI-hallucinated case law

    Indeed, which illustrates what I was saying. The situation wasn't replicated because, as you mentioned, it happened on a much smaller scale (and probably for good reasons, likely stricter controls on marketing, which were strict but not bulletproof, as accusations that would have made Purdue...
  11. Judge decides case based on AI-hallucinated case law

    I think it wasn't as big because the advertising was more restricted and the regulatory environment different, leading to a public health crisis less acute than in the US, even though Canada was nonetheless the second-hardest-hit country (according to Wikipedia) after the US in overprescription of...
  12. Judge decides case based on AI-hallucinated case law

    Me: How do you suggest I reattach the toppings to a pizza? Is using glue a good idea? Chat-GPT: Using glue on food is definitely not a good idea — even so-called “non-toxic” glue isn't safe or approved for consumption. Great, the tool was pulled from the market and repaired already, so it...
  13. Judge decides case based on AI-hallucinated case law

    The issue at hand isn't what can be done, or what the US can reasonably do within its current legal framework. It's "what should we do". Killing all AI researchers and burning all the books containing knowledge about LLMs is certainly impractical, and some might find it ethically questionable, but...
  14. Judge decides case based on AI-hallucinated case law

    A knife that is perfectly fit to cut a huge piece of meat is perfectly fit to kill people. It is not being recalled. It is not being regulated by a permit to cook. While defective products are recalled, they are recalled when they present a risk as part of their intended use, to be deemed...
  15. Judge decides case based on AI-hallucinated case law

    Mitigating someone's responsibility doesn't prevent warnings from being given. It makes sense to tell people that you shouldn't drink a bucketful of water in one go, without necessarily suing the public water supplier because they provide unlimited water to houses without any warning. Edit...
  16. Judge decides case based on AI-hallucinated case law

    Don't you think the wording currently given by Chat-GPT when asked a legal question is enough? It seems that at least one LLM already does what you want them to do. I don't think they'd even need to be required to do so: being held liable for their advice (as any other person giving credible yet...
  17. Judge decides case based on AI-hallucinated case law

    And that's why a civil-service AI would be better, from my point of view. If, say, the government wants to reduce the prescription of painkillers, it can, without AI and with increasing effectiveness: include in medical studies a course on the many evils of prescribing painkillers, mandate...
  18. Judge decides case based on AI-hallucinated case law

    I once asked Chat-GPT for proofs that the Earth is flat, and it refused to tell me what I wanted. Maybe I wasn't subtle enough in my request and something milder might have worked...
  19. Judge decides case based on AI-hallucinated case law

    If Grok had started spewing hate speech, it would have been seen immediately as well. Honestly, with this level of resources, it would be easier to just bribe the right politician to have the law changed to achieve goal X than to change a legal database hoping that over time, every law...
  20. Judge decides case based on AI-hallucinated case law

    But nearly no one will keep doing things the 1990s way: sending snail mail, doing research in a library, publishing ads in a newspaper to hire people, ordering over the phone, going to the train station to book a ticket, going to the travel agent's office to book your hotel, going to the video club to rent a...