Why, oh why, would anyone use ChatGPT “to find sources”?!? It doesn’t find sources! It doesn’t run queries! What it does is produce text that follows patterns in the texts it has been trained on. Patterns. So it writes text that could be plausible, not text that is true. Why don’t people get this?!?
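For the computer folks in the thread: the "patterns, not truth" point can be shown with a toy sketch. A bigram Markov chain is a crude stand-in for what an LLM does at vastly larger scale (the corpus, names, and journal titles below are all invented for illustration); it happily stitches real fragments into a "citation" nobody ever wrote:

```python
import random
from collections import defaultdict

# Tiny made-up corpus of citation-like text (every entry is fictional).
corpus = (
    "Smith J. A study of urban history. Journal of History. "
    "Jones K. A study of rural economies. Journal of Economics. "
    "Smith J. A survey of rural history. Journal of Economies."
).split()

# Bigram table: each word maps to every word observed right after it.
model = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev].append(nxt)

def generate(start, length=10, seed=0):
    """Continue `start` by repeatedly sampling a word that followed
    the previous word somewhere in the training corpus."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        followers = model.get(words[-1])
        if not followers:
            break
        words.append(rng.choice(followers))
    return " ".join(words)

# Every word it emits came from the corpus, yet the sequence as a whole
# can be a plausible-looking reference that exists nowhere.
print(generate("Smith"))
```

The model never checks whether its output matches any real source; it only asks "what tends to come next?" That is the whole trick, and it is why the output is plausible rather than true.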
I wrote about an exchange I had with some of my computer-oriented friends yesterday and why ChatGPT is bad for academic papers, even if you're just using it to find sources. joshuapnudell.com/2024/10/04/s...
Yesterday I ended up in a long conversation with friends sparked by this article. They are computer people (programmers, IT, web admin types) and largely pro-AI, while I, the historian and teacher,…
“But computer am smart! Computer would never lie to me and make up sources that don’t exist!”
When it first came out I'd ask it to cite its sources and the links worked, but it was never for anything super academic. It was standards- and government-related material, and it usually linked to those authorities.
One of my fellow students tested asking ChatGPT for sources and it produced a list of studies, complete with DOI numbers, that don't exist.
I hope they're not, but I wanted to show the people I was talking with that ChatGPT was useless for even the most mundane part of writing a paper, especially since they had been arguing that library searches could be one of its big benefits, at least in the future.
People don't get this because most people are about as intelligent as bread mold.