D&D General WotC hiring a Principal AI Engineer

There is nothing creative that AI does remotely well without so much user input that the user could do the same thing without the AI.

AI is literally just advanced “I see you’re trying to compose a letter, how can I help?”
It’s a probability engine that someone fed art into instead of math problems or dice or whatever.

The idea that TTRPG or video game studios need to even think about AI in terms of content is so bonkers left-field wacky that it’s hard to even force my brain into a shape that can track the reasoning behind such a statement.

Using it for organization and workflow makes sense, and leans into some of the very few things AI is any good at. You certainly can do some interesting stuff with human-curated, AI-powered video game mechanics and content: infinite levels, more responsive companion characters (any NPC would need very consistent and detailed profiles, with the major stuff done by a human -- AI is worse than little kids at writing good fiction), etc.
It's inevitable that computers will take most knowledge jobs. They aren't there today, but they will be.
 


AI is literally just advanced “I see you’re trying to compose a letter, how can I help?”
It’s a probability engine that someone fed art into instead of math problems or dice or whatever.
I don't know how much you know about the field, but I feel compelled to point out that this is a drastic oversimplification of what is going on under the hood in these models. Contextual language encoding in 7k-dimensional space (or however many dimensions they're using for modern LLMs) is really good at conceptual encoding, provided sufficient training data (which is, of course, another point of contention, but for other reasons). GANs were a good start and the diffusion models are scarily good, and, despite its issues, I see hallucination as a beneficial side-effect that has use cases and other interesting implications.
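To give a flavor of what "conceptual encoding in high-dimensional space" means: each token gets a vector, and related concepts end up pointing in similar directions. Here's a toy sketch -- the vectors and values below are invented purely for illustration, not taken from any real model, and real embeddings have thousands of dimensions rather than four.

```python
import math

# Made-up 4-dimensional "embeddings" -- illustrative only.
embeddings = {
    "king":  [0.9, 0.8, 0.1, 0.0],
    "queen": [0.9, 0.7, 0.2, 0.1],
    "dice":  [0.1, 0.0, 0.9, 0.8],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1.0 means
    the vectors point in nearly the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Conceptually related words land closer together than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["dice"]))   # low
```

The point is that "probability engine" undersells it: the probabilities are computed over a geometry that captures quite a lot of conceptual structure.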

It also seems to me that "AI" is becoming almost useless as a general description. Almost nothing said in this thread applies to computer vision, for instance, but CNNs certainly fall under the label of "AI."
 

It's inevitable that computers will take most knowledge jobs. They aren't there today, but they will be.
It's also important to understand that in the media, "AI" is a misleading and often outright wrong term for what is happening. LLMs are only one kind of "AI," and they only do well what LLMs do well. Is AGI -- what you'd really need to eliminate actual jobs -- right around the corner? If you listen to the techbros trying to gather financing, yes. In reality we are still a long way off from AGI, and will probably never have what people actually think of when they think "AI."

All that said, massive models that can work to find new drugs and develop new physics are coming fast and will change the world in collaboration with experts. Folks need to stop worrying about crappy generative art and text. That stuff is not going to be important in the long run.
 

It's also important to understand that in the media, "AI" is a misleading and often outright wrong term for what is happening. LLMs are only one kind of "AI," and they only do well what LLMs do well. Is AGI -- what you'd really need to eliminate actual jobs -- right around the corner? If you listen to the techbros trying to gather financing, yes. In reality we are still a long way off from AGI, and will probably never have what people actually think of when they think "AI."

All that said, massive models that can work to find new drugs and develop new physics are coming fast and will change the world in collaboration with experts. Folks need to stop worrying about crappy generative art and text. That stuff is not going to be important in the long run.
The problem in the short term is that corporations are always looking for ways to pay less to produce the same amount of product or more, regardless of quality, and to screw people in the field out of pay. When people like Zaslav and Strauss Zelnick are running corporations, I really cannot blame people for being concerned about what they'll do.
 

In the Hasbro job description, this part is fine:

"Design, build, and deploy systems for intelligent generation of text dialog, audio, art assets, [and] NPC behaviors."

However the rest of the sentence seems ill-conceived:

"and real time bot frameworks".


In the hands of an artist and designer, AI is a powerful and helpful tool. But the designer must supervise the AI at every moment it operates, deciding whether it happens to be doing something useful or nonsensical. So using AI to construct a game product is great.

But to allow AI to interact with customers in "real time" seems like the stupidest use of AI I could ever imagine.

Everything an AI does, human intention must curate.
 

It's kind of odd they are specifically looking for an engineer named Al. You'd think that it wouldn't matter what their name was so long as they could do the job. I wonder if it matters if they are an 'Albert' or 'Allen', or 'Alfred'?
 

It's also important to understand that in the media, "AI" is a misleading and often outright wrong term for what is happening. LLMs are only one kind of "AI," and they only do well what LLMs do well. Is AGI -- what you'd really need to eliminate actual jobs -- right around the corner? If you listen to the techbros trying to gather financing, yes. In reality we are still a long way off from AGI, and will probably never have what people actually think of when they think "AI."

All that said, massive models that can work to find new drugs and develop new physics are coming fast and will change the world in collaboration with experts. Folks need to stop worrying about crappy generative art and text. That stuff is not going to be important in the long run.
Correct.
 

The problem in the short term is that corporations are always looking for ways to pay less to produce the same amount of product or more, regardless of quality, and to screw people in the field out of pay. When people like Zaslav and Strauss Zelnick are running corporations, I really cannot blame people for being concerned about what they'll do.
If people are happy with AI generated art and text in D&D, they get what they deserve.
 

If people are happy with AI generated art and text in D&D, they get what they deserve.
I'm not going to defend stealing, but the AI art is a million times better than what I can do. You can really get great art for evocative scenes. I don't use it, and I regret generating art before I knew better, but it's actually quite good for personal use. Also, I've seen some great adventures generated this way, and some bad ones, but that's hardly different than what we get from people.
 

I'm not going to defend stealing, but the AI art is a million times better than what I can do. You can really get great art for evocative scenes. I don't use it, and I regret generating art before I knew better, but it's actually quite good for personal use. Also, I've seen some great adventures generated this way, and some bad ones, but that's hardly different than what we get from people.
The stuff generative AI produces is decent, sometimes even good, on initial impression, but almost any level of actual inspection shows its very serious flaws. While I think it is great for personal use, and dedicated RPG tools like an NPC image generator are very likely coming, D&D would very quickly become a steaming pile of worthless dreck if they decided to remove the human creative element. I can imagine some short-sighted C-suite techbro suggesting it would be so much cheaper to do it with AI, but one hopes that would get shouted down pretty quickly by those who actually have to put out a product. Despite all the complaining in the recent 2025 art reveal threads, WotC recruits absolutely top-tier talent for their books, and a switch to generative AI would be catastrophic.
 
