Even the article explains it: the prompt used for the Joker is a case of overmemorization. They didn't curate their training data enough, which led to a bug where the engine concludes there is a single correct answer to a given prompt. There's no denying it happened, but it's not AI's fault, it's Midjourney's fault (unless you can generalize the finding to every engine, and the problem seems specific to Midjourney, since the NY Times article only mentions them, probably because they couldn't replicate it with DALL-E or Stable Diffusion). A design flaw in one Toyota model's airbag can't be described as "airbags in cars don't work".