I'm imagining a case where the error has been detected and we all know it's wrong, but the legal environment mandates that LLMs repeat this false information.
Asked and answered. The LLM isn’t the responsible party; whoever is causing the FDA to promulgate falsehoods is.
Not in this case, as I described.
I won't speak specifically about COVID. But in general, when things are presented authoritatively and turn out to be wrong, that undermines trust. We still see people citing the 1975 global cooling article as evidence that it's all bunk, and that wasn't even a very authoritative portrayal. This is especially true when you are asking people to make major lifestyle changes on the strength of your authoritative portrayal.
And I agree this is a massive problem. But I disagree on the solution. I think circling the wagons and restricting information to experts only is going to make the trust situation worse, not better. It takes decades to build trust and not very much time at all for it to evaporate. Pointing to a degree or a license is something that only works in a high-trust environment. And that no longer exists.
Nevertheless, COVID provides a well-documented, recent case study in this.
Public health organizations and experts didn’t simply claim they were right because they were authorities; they said COVID-19 was a new virus and they didn’t know exactly what it could do, so they based their recommendations on what they knew from related pathogens while waiting for new research results. When those results came in, they revised their recommendations, explicitly framing the changes as the product of new information. That’s how you make policy in accord with the scientific method: you change recommendations when better information becomes available.

This was mischaracterized by certain outlets and individuals as lying, and that narrative captured the minds of an unfortunately large segment of the populace.
The CDC, the WHO, Fauci, etc., didn’t misrepresent what they knew and when they knew it, nor did they hide behind their credentials. People systematically attacked their credibility and, with that pretext, also destroyed trust in well-established medical and public health findings.
At this point, a large enough segment of the adult population (in America, at least) has demonstrated that it can’t properly evaluate medical information for veracity and accuracy.
So no, general-purpose AIs should not be able to disseminate any medical advice beyond “find a qualified medical professional near you”*
And further, AIs specialized for medical or legal professionals shouldn’t be accessible to the general public either. Most people lack even the vocabulary to fully grasp the results their questions would return.
* and likewise for legal advice.