Discussing ChatGPT can understandably (maybe unavoidably) veer into existential "what does it all mean, man?" questions, so I get it. And I write and edit for a living—often about AI—so I'm clearly biased. But imo there's a difference between how AI does something like style transfer (applying a given set of features from one piece of content to another) without intent, cognition, or emotional expression, and how a person draws parallels, makes associations, and chooses to configure language and meaning.
But at the end of the day, to me, it comes down to a question of scarcity: Is ChatGPT producing something of value that's currently in short supply or otherwise hard to obtain? The world is full of books and texts of all kinds, so I think no. And even if the model happens to produce something you think is neat, it's not like coming across a well-written bit of text from an RPG designer, where you can then dig into their other work or keep track of their future projects. ChatGPT basically wrote you a little accidental mandala, and there's no reason to think its output will be of specific interest again tomorrow, especially after whatever local interaction you had has been overwritten and blown away.