Q&A: To find out more about AI, an NC State professor asked his students to cheat

At the end of a recent semester, North Carolina State University English professor Paul Fyfe told his students to cheat. Specifically, use the text-generating AI software ChatGPT to help write their final essays.

The rise of AI software brings with it the prospect of making our lives easier. But could it also be unethical? Is it plagiarism? And could it chip away at our own creativity?

WUNC’s Will Michaels spoke with Fyfe about what AI can and can’t do.

This conversation has been lightly edited for brevity and clarity.

Paul Fyfe: “I made them cheat on their final paper by using a text-generating AI to try to fool me into not being able to tell what was theirs and what came from a computer. They turned in essays that ultimately did reveal which was which, and alongside that, included a reflection on what they took away from the experience.

“The most interesting thing to me was that almost all of the students thought this was going to be easy — like, what was this professor letting them get away with? And all of them found out the opposite. Not only does the software not deliver what we expect it to, usually because we’re expecting AI to represent some free-floating super consciousness, but often it goes in directions that we don’t want, or produces things that we don’t like or that are flatly false. And some students really had a hard time with this. One even complained it was like being matched up in a nightmare group project with the class slacker.”

I’ve asked AI machines to generate summaries of articles for me just to see how well they do. And sometimes, as you say, they’re just plain wrong, which kind of brings up the question for me: How useful a tool is this?

Fyfe: “I think that’s exactly what we’re all trying to figure out. We were really worried when ChatGPT hit the web that all of a sudden this was going to be the end of writing, the end of English. In fact, AI seems to excel at other kinds of outputs, including writing code. I’ve even anecdotally heard that it’s no longer English majors who are worried about their jobs, as the cliche goes, but students in computer science who are facing a potentially very different landscape. You can imagine the awkward conversation at Thanksgiving: ‘Well, son, what are you going to do with your computer science major?’ But we’re also seeing students adapt it for brainstorming and idea generation fairly unproblematically. For example, I had a student tell me she had a friend who used it only for discussion posts rather than to write whole papers.”

Why do you think AI literacy is so important in school curricula?

Fyfe: “I think we’ve seen this before in some ways — when Wikipedia came around, for instance, there was suddenly deep concern about the fate of education and the quality of research. But what happened in response — especially from our colleagues in library and information science — was the development of information literacy: responsible evaluation of sources and citational practices. So, I think AI literacy is the necessary next step. We are in a unique position to help students not only develop skills for a potentially AI-enabled world, but also bring critical perspectives and understanding about its cultural impacts, for better and worse.”
