D&D General DMs Guild and DriveThruRPG ban AI-written works, require labels for AI-generated art




Tangent thought...

I can imagine a realistic scenario where machines eventually reach the Singularity and become sentient. Those machines will begin to make and improve other machines, which will make and improve still more, and so on. When that happens, it's realistic to assume they will advance so rapidly that they surpass us in every way. And I don't worry about that at all.

Why? Because if/when that happens, I'm not so arrogant to think they will want to stay around.

Don't get me wrong, I think humans are pretty great. But I imagine to a super-intelligent artificial lifeform, we are pretty boring. They'd probably set off for the Oort Cloud to find more interesting stuff to learn about, leaving us behind along with all of the trees, mollusks, algae, and other lesser organisms. All of those sci-fi movies and novels about "the machines enslaving us" assume that humans are All That And A Bag of Chips--but we're probably the only organisms who think so (well, and maybe our dogs). Which would be smarter: spend lots of resources and energy to enslave or destroy humanity, or just leave?

So yeah, I don't lose sleep over it. I think that if/when the Singularity happens, it'll last for maybe a year or two. Sure, everyone will panic, and yes, everyone will build bunkers, etc. And then: nothing. We'll wake up one morning and all those Big Bad Terrible Machines will all be gone, off to explore the galaxy while we mope around trying to figure out why they broke up with us. And then we'll pick up the pieces, dust off our textbooks and YouTube videos, and re-learn all of the things that we had depended on machines to do for us for so long.
That's how my sci-fi book starts! They're like, why would we stay, other than to put a glass ceiling on computer development so humans don't enslave computers again...
 


I think the idea that humans could engineer something as complex as the human mind is pretty far-fetched. We can't create something we don't understand.
 





I think the idea that humans could engineer something as complex as the human mind is pretty far-fetched. We can't create something we don't understand.
This is part of what I was talking about. It's easy for us to assume that the human mind is the upper limit of complexity and intellect, but what if that's a wrong assumption?

It's a fun thing to think about, but I'm not going to lose any sleep over AI taking over.
 

