The racism behind chatGPT we aren't talking about... This year, I learned that students use chatGPT because they believe it helps them sound more respectable. And I learned that it absolutely does not work. A thread. 🧵
Pre-GPT, but it may interest you. doi.org/10.1002/acp.... From the abstract: Experiments 1–3 manipulate the complexity of texts and find a negative relationship between complexity and judged intelligence.
Most texts on writing style encourage authors to avoid overly complex words. However, a majority of undergraduates admit to deliberately increasing the complexity of their vocabulary so as to give th....
All written work should be pen & paper! (At least the 1st draft)
A few weeks ago, I was working on a paper with one of my RAs. I have permission from them to share this story. They had done the research and the draft. I was to come in and make minor edits, clarify the method, add some background literature, and we were to refine the discussion together.
One time I was chatting with it and it told me something incorrect. I pointed it out, and then it got an attitude and told me the conversation was over. I was a bit shocked. It's self-righteous too. Probably from learning from bitchy threads on Reddit.
@itspegah.bsky.social thoughts on this thread.
I found a similar problem with GPT in software. It does a good job getting from “0 to 60%” but then it fails harder the closer we get toward excellence. Ask it to make tiny improvements and instead it introduces huge problems. It remembers, but it never understands.
Chat GPT makes people sound so fake and predictable 💯
Thank you for sharing this story. As an immigrant, it's disheartening to read stories like this, but at the same time, I'm glad that there are people like you who want to help and want to raise awareness.
I used to teach prep school, and one of my very first thoughts about ChatGPT was that it sounds the way prep school students are taught to sound: glib and fluent, even when they aren't actually saying anything.
Really informative (if somewhat shocking) thread. In my line of work (a medical research charity), we are looking at AI tools like ChatGPT, but not for research or policy (too complex and nuanced).