EzekielRaiden
Follower of the Way
At present, the absolute best AIs in the world, trained on literally gigabytes of raw text files, can manage about six or seven paragraphs before the text becomes completely unhinged.
I don't mean "ooh, you really notice the flaws." I mean that it starts producing straight-up bizarro stuff, like fourteen-horned unicorns, or repeatedly contradicting itself about what color something is, etc.
The problem is that the correlational structure upon which text-generation AI is built cannot handle the connections necessary for logical cohesion in a long-form work.* Combinatorial explosion takes over, and it scales much, much faster than the technology can. We're still in the early days, where most of the limitations are in implementation, training time, or learning the most effective ways of doing things. As it stands, writing even a single chapter of the length expected of an RPG rulebook is completely impossible.
The irony, of course, is that it is much easier to fake images than it is to fake text. As the links others have given show, you can make completely realistic-looking (minus some eldritch horror side-faces and surrealist clothing/backgrounds) faces that are 100% fictitious. Making three pages of consistent, readable fiction? Completely impossible at present--and it's going to be hard to collect enough of a corpus to improve it much further. We've already used a scrape of a large portion of the internet. Where are we going to get the data to train GPT-4 on?
*Really, the problem is that "generative" text AIs, like GPT, try to substitute really, really, really complex correlation networks for actual knowledge. They use an enormously complex model of English syntax--a model of "word X should come next, then word Y, then word Z, and NOT word B"--to try to produce semantic content in English, that is, things which carry meaning to English speakers. That is, formally speaking, impossible. You cannot magic up semantic content from exclusively syntactic content. You can get some surprising, even shocking connections, and some truly impressive relationships. But you cannot derive the semantic from the syntactic, even in the limit of an infinitely complex syntactic model.
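To make the "word X should come next, then word Y" idea concrete, here's a toy sketch in Python. This is a simple bigram chain, not how GPT actually works internally (GPT uses a far more complex neural model), and the tiny corpus is invented for illustration--but it shows the purely correlational principle: the model tracks which words follow which, and nothing else.

```python
import random
from collections import defaultdict

# Toy "word X, then word Y" generator: a vastly simplified stand-in
# for the correlational models discussed above. The corpus is a
# made-up example, not real training data.
corpus = (
    "the unicorn has one horn . the dragon has two wings . "
    "the unicorn is white . the dragon is red ."
).split()

# Count, for each word, which words follow it in the corpus.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start, length, rng):
    """Emit up to `length` words by repeatedly sampling a next word
    that actually followed the current word somewhere in the corpus."""
    out = [start]
    for _ in range(length - 1):
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("the", 12, random.Random(0)))
```

Every adjacent word pair in the output is "grammatical" in the sense that it occurred in the corpus, yet the model will happily wander into "the unicorn is red"--it models adjacency, not facts, which is exactly the syntax-without-semantics gap described above.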