Judge decides case based on AI-hallucinated case law
Jfdlsjfd (post #9702747) said:

Don't you think the wording ChatGPT currently gives when asked a legal question is enough? It seems that at least one LLM already does what you want them to do. I don't think they'd even need to be required to do so: being held liable for their advice (as any other person giving credible yet dangerous advice would be in many jurisdictions) would prompt them to display warnings to avoid or limit their liability without being required to. It would be in their best interest to do so.

For medical wording, I asked what it could do for a layman, and the warning was that it can only help phrase things to explain to a doctor, not provide diagnosis or treatment. It said (among other warnings): "No Diagnosis or Treatment Plans: I can't diagnose medical conditions or suggest specific treatments. Only qualified healthcare professionals can do that after examining a patient directly." It is nonetheless useful for providing medical information (like how paracetamol works, including a mention of its analgesic effect, but without saying "you're in pain? Take paracetamol").

Grok, when presented with symptoms (I told it I had a pain in my knee), states:

"I'm not a doctor, but I can offer some general information. It can have various causes, ranging from benign to potentially serious. Here are some possible reasons and steps you can consider: [...] I strongly recommend consulting a healthcare professional, such as a general practitioner or a specialist, especially if the pain persists or is accompanied by other symptoms. They can provide a proper diagnosis and tailored advice. If you'd like, I can search for more specific information or help you formulate questions to ask your doctor. [...] Disclaimer: Grok is not a doctor; please consult one."

Really, I don't think they need to do more to prevent people from mistaking the text they provide for medical advice.

That wouldn't exonerate them in every context, though: for example, if they kept giving bad advice after being warned of the problem.

It will vary tremendously by jurisdiction. Saying "source X says Y" doesn't make one liable everywhere, especially without endorsement. I think you were speaking of the case where an LLM would advise stopping a cancer treatment and praying instead, providing a link to a faith healer as "proof"?

We get that some people are morons, but the law isn't necessarily made to protect morons. There are jurisdictions where a person can't claim damages because they failed to exercise what we'd call common sense. I have in mind a specific case of a person who was told to stand on a chair to clean something too high to reach; the chair broke, and her claim was denied because she couldn't have been unaware that it was a garden chair that couldn't support her weight. There is a middle ground to be found between protecting people (by having warnings posted, for example) and totally suppressing the information (or requiring a permit to access it).
[QUOTE="Jfdlsjfd, post: 9702747, member: 42856"] Don't you think the wording given right now by Chat-GPT when asked a legal question is enough? It seems that at least one LLM already do as you want them to do. I don't think they'd even need to be required to do so: being held liable for their advice (as any other person giving credible yet dangerous advice in many juridictions) would prompt them to display warnings to avoid or limit their responsability without being required. This would be in their best interest to do so. For medical wording, I asked what it could do for a layman and the warning was that it can only help phrase things to explain a doctor, not provide diagnosis or treatment. It said (among other warning): "[B]No Diagnosis or Treatment Plans:[/B] I can’t diagnose medical conditions or suggest specific treatments. Only qualified healthcare professionals can do that after examining a patient directly." It is however useful when providing medical information (like, how paracetamol work, including mention of its analgesic power, but without saying "you're in pain? Take paracetamol". Grok, when presented with symptoms (I told it I had a pain in the knee), states: "I'm not a doctor, but I can offer some general information. It can have various causes, ranging from benign to potentially serious. Here are some possible reasons and steps you can consider: [...] I strongly recommend consulting a healthcare professional, such as a general practitioner or a specialist, especially if the pain persists or is accompanied by other symptoms. They can provide a proper diagnosis and tailored advice. If you’d like, I can search for more specific information or help you formulate questions to ask your doctor. [...] Disclaimer: Grok is not a doctor; please consult one. Really, I don't think they need to do more to prevent people for mistaking the text they provide with medical advice. It wouldn't exonerate their responsability in all context, for example if they kept giving bad advice after being warned of the problem, for example. It will vary tremendously by juridiction. Saying "Source X says Y" doesn't make one liable everywhere, especially without endorsement. I think you were speaking of the case where an LLM would advise to stop a cancer treatment and pray intead, providing a link to a faith healer as "proof"? We get that some people are morons, but the law isn't necessarily made to protect morons. There are juridiction where the person can't claim liability because he failed what we'd call common sense. I have in mind a specific example of a person who was told to get on a chair to clean something too high and the chair broke, and said person was denied her claim because she couldn't ignore the chair was a garden chair and couldn't support her weight. There is a middle ground to be found between protecting people (by having warnings posted, for example) and totally suppressing the information (or requiring a permit to access information). [/QUOTE]
Insert quotes…
Verification
Post reply
Community
General Tabletop Discussion
*Geek Talk & Media
Judge decides case based on AI-hallucinated case law
Top