SableWyvern
Cruel Despot
If I want accurate information on something I care about, I'm absolutely checking multiple sites and doing my best to verify their authenticity. That type of effort has always been necessary when conducting research.

Let's compare like and like, though. If I'm not using ChatGPT, then I'm googling and visiting random websites for my information, which isn't particularly great for finding accurate, specific information about a topic; generally it just directs you to Reddit or the like anyway. Until proven otherwise, the information from ChatGPT is probably just as reliable as, if not more reliable than, a Google search plus a random site or Reddit subthread, and 10x to 100x faster than navigating any site on my own to find the information I actually want.
If I don't care about accuracy, I'm not likely to care enough to search in the first place; I'll just invent my own answer.
I would posit that learning to look for signs of accuracy and reliability in a source is an extremely valuable skill, and relying on an LLM instead of developing that skill is probably not a good long-term strategy.