It doesn't matter.
Look, I pointed out that there is a longstanding body of law (in the common law) that doesn't just look at a product's intended use, but at its actual and reasonably foreseeable use. These are all simple concepts for anyone who can tell the difference between a tort and, um, a cake.
But because I happened to mention the Sacklers (who are just one of many examples), it immediately became, "Well, the opioid crisis is only ONE EXAMPLE!" Because people don't want to understand the law, they want to argue that they are right.
It's not one example- it's a bedrock principle of the law. But this isn't really the forum for that- people would rather discuss fanciful hypotheticals because that's more likely to support what they already know to be true.
These are all complicated concepts, and the framing matters. There is a difference between, for example, a general-use AI released to the public and a specialized AI that has gone through FDA certification and is used for diagnostic procedures in the medical field. Because people mix and match the framing to make their points, not much is being accomplished other than people talking past each other.
I will reiterate that as to the subject of general AIs drafting legal documents, I would state the following:
A. I think that in America, the corporations that knowingly allow this to happen should be subject to UPL penalties in each state.
B. I also think that any attorney who uses such a product and submits its output to a Court, signing their name to same, should be harshly disciplined, with no less than a 90-day suspension from the practice of law.
But that's me.