I'm used to students relying on AI to answer questions and write essays at this point. It's not preferable, of course, especially since I teach entirely online, but I can still usually lead discussions in a productive way. Plus, it's usually quite easy to tell when a student is leaning on it too heavily.

A few weeks ago I had an experience that really shocked me, though. I had a student who often would not respond to questions for a minute or so; I usually assumed that she was distracted on another tab, or that she just liked to think about her answers for a while. Then, in one of our classes together, I asked her to analyze this political cartoon. We went over the really basic questions in the link, so just simple stuff like "what does the word literacy mean" and "who is the man on top of the wall." I noticed that she was answering more slowly than normal, but didn't think much of it.

Then I asked her what the words on the wall were; they're right in the center of the image in giant letters, so it should take all of three seconds. Instead, she sat there for about two minutes without responding, even with additional prompting. Okay, I figured she was definitely distracted, but I don't have any way to enforce participation, so I just accepted it. Then she finally responded: "It says 'add to the peel.'"

I had no idea what she was even saying. I asked her to type it out in the online meeting chat, and she did. I asked her what that meant; "I don't know." I asked her if it was slang I hadn't heard of; she said no. I drew over the words on the wall and she confirmed that was where she was looking. I asked her to read it again and she repeated herself. Finally, after asking her one more time to explain what she thought it meant, she said, "I don't know, that's just what ChatGPT said."

I was completely flabbergasted. I asked her and confirmed that she had, in fact, asked ChatGPT to tell her what the words in a political cartoon said.
Not to analyze the image, or even to identify harder elements like the pens or books on top of the wall, but just to read off the words.

Suddenly, a lot more made sense. I thought back to all the times she'd been slow to answer, compared to other very similar questions she answered instantly. I'd never be able to prove it, but I'm certain those were all instances of her asking the AI to answer questions, regardless of how little sense it made to do so. This also explained the sometimes strange responses, like when I asked her to read off her own words and found her unable to do so. Indeed, I asked her if she had used AI in a few specific moments before, and she said she had.

I know I'm rambling at this point, but I'm just baffled. It's not surprising to me at all that students would use AI to write essays, but I don't see how even the heaviest usage could result in a student deferring 100% of her thinking to it, to the point of asking it questions that depend entirely on the immediate context.

submitted by /u/Frosty-Suspect-9423