Prompt blocked because I wanted my armor SILVER steel, instead of steel?!
Just don't, DALL-E.
Two more guesses at the strange behaviour.
1. Typo correction.
We've seen that D3 is able to do heavy orthographic corrections (understanding that Micky means Mickey, and sometimes far more involved fixes than that). Maybe he corrupts some prompts by "correcting" a word into another, unacceptable one. You want to describe a slot machine and he corrects you to a slut machine. BAM, banned. Why would he correct a word that already exists, you ask? Because I'm more and more convinced that the whole process is AI-driven: he's correcting based on probabilities, not on whether the word actually exists. Heck, Google Docs insists on "correcting" my grammar by underlining errors where my original writing is right, so maybe it's just that. Especially if there are actual typos elsewhere in the prompt.
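To make that concrete, here's a toy sketch of what a purely statistical corrector could look like. Everything in it is invented (the word frequencies, the banned list; this is obviously not DALL-E's actual code). It just shows how a corrector that picks the highest-probability neighbour, without first checking whether your word is already a valid one, can turn a clean prompt into a bannable one:

```python
# Toy sketch: a probability-driven "corrector" with no dictionary check.
# FREQ and BANNED are made up; imagine the nasty word just happens to be
# more frequent in whatever data the corrector was trained on.
from itertools import chain

ALPHABET = "abcdefghijklmnopqrstuvwxyz"
FREQ = {"slot": 40, "slut": 55, "machine": 100}  # hypothetical corpus counts
BANNED = {"slut"}

def edits1(word):
    """All strings one delete/replace/insert away from `word`, plus `word` itself."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = (l + r[1:] for l, r in splits if r)
    replaces = (l + c + r[1:] for l, r in splits if r for c in ALPHABET)
    inserts = (l + c + r for l, r in splits for c in ALPHABET)
    return set(chain([word], deletes, replaces, inserts))

def correct(word):
    """Pick the most frequent known candidate -- even when `word` itself is fine."""
    candidates = [w for w in edits1(word) if w in FREQ]
    return max(candidates, key=FREQ.get, default=word)

corrected = " ".join(correct(w) for w in "slot machine".split())
print(corrected)                                    # -> "slut machine"
print(any(w in BANNED for w in corrected.split()))  # -> True. BAM, banned.
```

Note that `correct()` never asks "is the user's word already valid?" -- it only asks "which known word is most likely?" That's the whole failure mode.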
2. Prompt rewriting.
From the video where people interact with the engine through ChatGPT (GPT-4) instead of Bing, we can see the prompts are rewritten before being fed to the engine. If you ask for 4 people, he'll rewrite it to a mix of men and women of mixed ethnicities (beware if you're asking for four popes...). We also know, by trial and error, that Bing doesn't do that as much (because we don't get diverse pictures by default). But can we rule out that some prompt rewording is happening anyway, to create 4 distinct pictures each time? If that were the case, some images would get a problematic rewritten prompt and be blocked, while others, born from the same human-provided description, would pass. And the dog block would only appear when all 4 images are blocked.
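A minimal sketch of how that would produce exactly what we observe. Again, everything here is made up (the demographics list, the filter rule, the "DOG" sentinel): the point is just that per-image rewrites plus a per-image filter give you runs where some slots render and some don't from the very same human prompt, and the dog only when all four fail.

```python
# Invented rewrite layer: each of the 4 image slots gets its own variation
# of the user's prompt, and each variant is filtered independently.
DEMOGRAPHICS = ["a man", "a woman", "a man", "a woman"]  # hypothetical diversity pass

def rewrite(prompt, slot):
    """One distinct prompt per image slot -- here, forced gender mixing."""
    return prompt.replace("a person", DEMOGRAPHICS[slot])

def passes_filter(prompt):
    # Pretend one specific rewritten combination trips the moderation filter.
    return "a woman dressed as the pope" not in prompt

def generate(user_prompt, n=4):
    variants = [rewrite(user_prompt, i) for i in range(n)]
    results = [v if passes_filter(v) else None for v in variants]
    # The dog only appears when every single variant was rejected.
    return "DOG" if not any(results) else results

print(generate("a person dressed as the pope"))
# -> ['a man dressed as the pope', None, 'a man dressed as the pope', None]
#    Same human prompt, half the slots silently blocked. No dog yet.
```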
So maybe you asked for a knight in silver steel armor and the problem isn't anything you typed, but an element added by prompt diversification? Something like: "silver is sometimes associated with the Silver Surfer", "let's add that to the prompt", "boo, the prompt now refers to a commercial property, let's ban this prompt".
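Same disclaimer as above -- this is pure speculation in code form, and the association table and IP blocklist are invented. But it shows how an enrichment step could get your knight banned for a word you never typed:

```python
# Hypothetical "association" step that enriches the prompt before the IP
# filter ever sees it. The mapping and the blocklist are both made up.
ASSOCIATIONS = {"silver": "the Silver Surfer"}
IP_BLOCKLIST = {"silver surfer"}

def enrich(prompt):
    """Append invented style hints for any word with a known association."""
    hints = [hint for word, hint in ASSOCIATIONS.items() if word in prompt.lower()]
    return prompt + (", reminiscent of " + ", ".join(hints) if hints else "")

def ip_filter_ok(prompt):
    return not any(ip in prompt.lower() for ip in IP_BLOCKLIST)

enriched = enrich("a knight in silver steel armor")
print(enriched)                # -> "...armor, reminiscent of the Silver Surfer"
print(ip_filter_ok(enriched))  # -> False: blocked for IP the user never mentioned
```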