Even when most students concede that it's in their best interest to apply their own critical thinking to school assignments, it can be difficult for educators to detect when ChatGPT or another generative AI tool was used.
Current conversational AI models like ChatGPT are imperfect, and their outputs can carry telltale signs that a block of text wasn’t produced by a human—chief among them plagiarized or false information.
Should ChatGPT be used to complete homework? Pose this question to the tool itself and even it will advise users: “Be cautious.”
“It is recommended to approach ChatGPT as a tool for inspiration, idea generation, or to gain a different perspective on a topic,” the tool spits out, adding that its responses “may not always be accurate, reliable, or suitable for academic purposes.”
But while engineers are still tweaking how ChatGPT handles writing, the tool is already fairly capable at numerous other tasks. Math professors, for instance, have said that detecting AI use on math problems is harder than scanning an essay for a plagiarized line, because a specific problem tends to have one correct solution, and a machine-generated answer looks much like a student's own work.
“If ChatGPT does the math for you, what are students learning?” David Kahn, assistant mathematics professor at New York’s Stony Brook University, asked in a conversation with the school’s student news organization.
Despite those concerns, most teachers are embracing the technology against a backdrop of a decade-long decline in test scores among U.S. students.
A 51% majority of teachers and 1 in 3 students ages 12 to 17 reported using AI in the classroom, according to a March 2023 study funded by the Walton Family Foundation, the philanthropic arm of the Walton family, which owns Walmart.