If we're clarifying, LLMs don't know things. An LLM knows nothing. They just guess based on whatever word the algorithm decides is most likely to come next, not the most accurate one. That's why they hallucinate so much nonsense. They guess at everything because they don't have the capacity or...
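
To make that concrete, here's a toy Python sketch (made-up probabilities, not any real model's code) of what "guessing the next word" means: the pick is driven by likelihood, not by whether the answer is true.

```python
import random

# Hypothetical probabilities a model might assign to the next word after
# "The capital of Australia is". The numbers are invented for illustration:
# likelihood tracks how often a continuation appears in text, not accuracy.
next_token_probs = {
    "Sydney": 0.55,     # common in text, but wrong
    "Canberra": 0.35,   # correct, but less frequent
    "Melbourne": 0.10,
}

# Greedy decoding: take the single most probable token.
greedy = max(next_token_probs, key=next_token_probs.get)

# Sampling: draw a token in proportion to its probability.
sampled = random.choices(
    list(next_token_probs),
    weights=list(next_token_probs.values()),
)[0]

print("greedy pick:", greedy)    # "Sydney" — the likeliest, not the accurate one
print("sampled pick:", sampled)  # varies from run to run
```

Nothing in that loop checks facts; it only ranks continuations by probability, which is the gap people point to when they talk about hallucination.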