And that's a function I wouldn't trust LLMs with. In fact, I wouldn't trust an LLM with most things outside of 'creative' writing. Why? Because I don't know the actual answer. Never ask an LLM a question you don't already know the answer to, so that you can check its output.
So, if a book actually harms...