D&D General Can ChatGPT create a Campaign setting?



Jer

Legend
Supporter
RE: Anything original

I mean, it is just remixing what it's given in a fairly sophisticated way, right? Is that what we do (King Lear vs. Oedipus at Colonus)? Is there anything new under the sun?
One major difference between what ChatGPT is doing and what we do (and there are so, so many) is that ChatGPT can literally only mix together what it has "read" to produce new text.

Humans, on the other hand, have a complex system of inputs that includes not just what we've read but also what we've experienced, and our emotional responses to those things. Yes, if you want to get essentialist, these are chemical reactions to experiences that lead to different brain states, but we have them, and they are an input that Large Language Models - which only have language - do not.

What an LLM does is nothing like the way a human brain works, and what it produces is going to be far, far more limited because its inputs are far more limited. This isn't even about whether it's possible to get a machine to reproduce what a human being does in a "strong AI" sense - LLMs are nowhere near the boundary for strong AI. They just remix text in a way that our brains accept as "good enough", especially for things where we don't expect novelty but in fact want a standardized response (like the back-and-forth of a chat, a form letter, or a five-paragraph essay).
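The "can only remix what it has read" point can be illustrated with the oldest text-remixing trick in the book: a Markov chain. To be clear, this is a toy sketch of my own, vastly simpler than a transformer LLM, and the function names are mine; it only shows the core limitation Jer describes, that a language-only model can emit nothing but recombinations of its training text.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each word-tuple to the words that follow it in the source text."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def remix(chain, length=12, seed=0):
    """Generate text by repeatedly sampling a statistically plausible next word."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = ("the dragon sleeps on gold and the dragon dreams of fire "
          "and the knight dreams of gold")
chain = build_chain(corpus)
print(remix(chain))
```

Every word the generator can ever produce comes from the corpus; it has no experiences, only text. An LLM's statistics are enormously more sophisticated, but the inputs are just as language-bound.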
 



RE: Anything original

I mean, it is just remixing what it's given in a fairly sophisticated way, right? Is that what we do (King Lear vs. Oedipus at Colonus)? Is there anything new under the sun?

Discussing ChatGPT can understandably (maybe unavoidably) veer into existential "what does it all mean, man?" questions, so I get it. And I write and edit for a living—often about AI—so I'm clearly biased. But imo there's a difference between how AI does something like style transfer (applying a given set of features from one piece of content to another) without intent, cognition, or emotional expression, and how a person draws parallels and makes associations and chooses to configure language and meaning.

But at the end of the day, to me, it comes down to a question of scarcity: Is ChatGPT producing something of value that's currently in short supply or otherwise hard to obtain? The world is full of books and texts of all kinds, so I think no. And even if the model happens to produce something you think is neat, it's not like when you come across a well-written bit of text from an RPG designer, and can then dig into other work by them, or keep track of their future projects. ChatGPT basically wrote you a little accidental mandala, and there's no reason to think its output will be of specific interest again tomorrow, especially after whatever local interaction you had has been overwritten and blown away.
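The style-transfer definition above ("applying a given set of features from one piece of content to another") can be made concrete with a deliberately crude toy of my own devising - nothing like real neural style transfer, and all names here are hypothetical - that extracts two surface "features" from a sample and mechanically stamps them onto different content, with no intent or cognition involved.

```python
def extract_style(text):
    """Pull two crude 'style features' from a sample: shouting and end punctuation."""
    t = text.strip()
    return {
        "shout": t.isupper(),
        "end": t[-1] if t and t[-1] in "!?." else ".",
    }

def apply_style(content, style):
    """Mechanically stamp the extracted features onto different content."""
    body = content.rstrip(".!? ")
    if style["shout"]:
        body = body.upper()
    return body + style["end"]

print(apply_style("the wizard casts a spell.", extract_style("BEWARE THE LICH!")))
# → "THE WIZARD CASTS A SPELL!"
```

The transfer is purely feature-matching: the program has no idea why shouting might suit one sentence and not another, which is the gap between this kind of operation and a person choosing how to configure language and meaning.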
 

Clint_L

Hero
As writing, though, that is absolutely, as the kids say, mid. Just hackneyed "joke" after joke. As with every long-ish ChatGPT output I read, I wish I could get back the time I spent reading it, because they never produce anything interesting or original.
I can't answer for your subjective taste. But I can tell you that, objectively, that second memo is better than what the vast majority of human beings would write given the same prompts. I find it delightful.

ChatGPT can, right now, write at a level sophisticated enough to easily pass most writing assignments up through the undergraduate level. When given random samples, extremely experienced assessors cannot tell the difference between its output (given smart prompts) and a human being's, so I think we are fooling ourselves if we think we would do better.

I am going to suggest that many folks are looking at this technology the wrong way. It is not a replacement for human beings, but it is an incredibly potent new tool. Just as other forms of AI have revolutionized fields ranging from mathematics to animation, it is going to revolutionize writing, including creative writing.
 

Cadence

Legend
Supporter
Discussing ChatGPT can understandably (maybe unavoidably) veer into existential "what does it all mean, man?" questions, so I get it. And I write and edit for a living—often about AI—so I'm clearly biased. But imo there's a difference between how AI does something like style transfer (applying a given set of features from one piece of content to another) without intent, cognition, or emotional expression, and how a person draws parallels and makes associations and chooses to configure language and meaning.

But at the end of the day, to me, it comes down to a question of scarcity: Is ChatGPT producing something of value that's currently in short supply or otherwise hard to obtain? The world is full of books and texts of all kinds, so I think no. And even if the model happens to produce something you think is neat, it's not like when you come across a well-written bit of text from an RPG designer, and can then dig into other work by them, or keep track of their future projects. ChatGPT basically wrote you a little accidental mandala, and there's no reason to think its output will be of specific interest again tomorrow, especially after whatever local interaction you had has been overwritten and blown away.

The part I worry about a bit is all of the stuff people produce that doesn't seem great - and not just the person (student, employee, whatnot) half-arseing something because they don't care or are time-pressured. I think about "one-hit wonders". I would not be surprised if the brilliant poems, songs, and stories are safe, so that the various anthologies and best-ofs by humans will still stand out. But how many collections and albums by professionals have a few real standouts and then a bunch of meh? How hard will it be to get an AI to produce that unremarkable filler - the stuff that isn't anyone's best work but pays the bills?
 

Clint_L

Hero
Or like I tell my students - we've been "less than 10 years away" from self-driving cars for around 13 years at this point.
I have students using ChatGPT with their work right now. Every school and university does. This is not happening tomorrow or next year. It's happened. The revolution has occurred. What comes next is evolution.
 

Cadence

Legend
Supporter
I am going to suggest that many folks are looking at this technology the wrong way. It is not a replacement for human beings, but it is an incredibly potent new tool. Just as other forms of AI have revolutionized fields ranging from mathematics to animation, it is going to revolutionize writing, including creative writing.

In chess, AI hasn't really replaced people, because people enjoy the game for the competition and the human part of it. And I like a lot of the ways an AI can back up a doctor, for example.

But in general I wish I was as optimistic as you about it not just turning into a replacement everywhere it can for things that involve money and profits.

People used to value buying local and supporting neighborhood stores. And now we have Amazon, Walmart, and a lot less of the rest. Right now a lot of people are angry about AI art. Right now.
 

Jer

Legend
Supporter
I have students using ChatGPT with their work right now. Every school and university does. This is not happening tomorrow or next year. It's happened. The revolution has occurred. What comes next is evolution.
This is because your students are being asked to put together relatively trivial writing assignments that are there to build their skills, not because you're asking them to do wonderfully creative work.

It's the same in my field - programming. Yes, some of these models can put together work that 1st- or 2nd-year students can do. That's because we ask students to write things to learn how to write, not because we actually care about the output. Frankly, every essay you might ask your students to write for a class probably already exists somewhere on the internet, given the density of text out there.

Ask ChatGPT to put together a novel for you. It's terrible. Ask ChatGPT to develop an actual solution to a real-world programming problem. It can't. It's nuts for us at the professional-educator level because it short-circuits our learning processes - it makes it hard to tell whether students are actually practicing or just turning in what an AI has written. But that's because what our students generally churn out is banal stuff that is the same year after year, with subtle variations, as practice to scaffold them up to doing more interesting things - the exact kind of form-filling that ChatGPT is good at.
 
