You keep repeating this, but I never once put it through AI. I happened to agree with AI on that point only because it was my independent conclusion. Just because the AI says it doesn't mean it's wrong.
I said "y'all" as a reference to multiple people. And while I never accused you of putting it through an AI, you went along with its bad response. You can say "independent conclusion", but it seems weird that you and the AI somehow managed to reach the same weird, limited conclusion.
I did read the intro though (and more). I was a bit confused at first when it kept referencing natural rights in reference to literary property, so I did read quite a bit trying to discern where that was coming from. I have no idea if I read more than you or not.
You can say this, but I don't think anyone could read the intro and somehow come to the immediate and wrong conclusion you did. It just comes off as incredibly weird and suspicious, to say the least.
I'm curious. What did you google to come across that 2012 article?
I'm trying to find it, but I can't find an exact record of what I searched. I think I was looking up natural rights and copyright and that's where it may have come up, but I can't remember if it showed up in a search or if it was a footnote in an article. I've spent the last five minutes looking through the relevant pages in my history and I still can't find where exactly I got it.
I trust experts with agendas even less than I trust LLMs. But maybe that's just me.
I would simply be skeptical of experts with biases (or at least do a cursory search to understand what their biases might be) and not trust LLMs until they can show more accuracy. Discernment and background research on a subject are the solution for the former; the solution for the latter is waiting for the technology to actually work as advertised.
Nope, it's not.
I do find it hilarious/ironic when people who won't trust AI instead cite Wikipedia. Remember when more traditional sources... like encyclopedias... tried to convince us we should never trust crowdsourced info? That it couldn't possibly be reliable? How they would find examples of errors and say, "See!??!!? You can't trust this stuff! Buy our encyclopedias!!!"
Almost eerie, isn't it?
Wikipedia, while not great for citing, has (as
@SableWyvern says) a rather robust system of editing and citation. I would not trust it on its own, but you are far better able to track down citations and where information is coming from at any time. I've told my students "Wikipedia can be a start, but never an end", because it can guide you to interesting papers and sources. But even then, you need to teach people about understanding biases and framing.
LLMs have the problems you described, plus the extra problem of something detached from the ability to reason trying to collate and discern things to fit a task. Sometimes that can work; other times it'll pick something like Reddit as a source.
Let's compare like with like, though. If I'm not using ChatGPT, then I'm googling in order to visit random websites for my information, which isn't particularly great when it comes to finding accurate, specific information about a topic. Generally it directs you to Reddit or the like anyway. Until proven otherwise, the information from ChatGPT is probably just as reliable, if not more so, than Google search plus a random site or Reddit subthread, and 10x to 100x faster than navigating any site on my own to find the information I actually want.
Then you use your human discernment to judge the website as trustworthy, compare it to other facts or reliable sources, or look up the author to see their other works. ChatGPT is absolutely not in any way, shape, or form more likely to get you a good answer, because it's more likely to go to a place like Reddit or a message board than you are. Unlike you, it's just scraping for answers and isn't necessarily going to take sources into account. Unless you have an incredibly low opinion of yourself, it'll give you a faster answer, but one much less likely to be right than if you did a bit of research on your own.