Technology

What counts as cheating with AI? Schools grapple with drawing the line


A teacher tells a student not to use AI in a research assignment. But when the student does a browser search, an AI-generated explanation pops up unprompted.
Has the student just cheated? What now?
Navigating the use and misuse of artificial intelligence in school is complex and confusing — especially when it comes to cheating.
“The cheating is off the charts. It’s the worst I’ve seen in my entire career,” said Valencia High School English teacher Casey Cuny, a 23-year veteran. “Anything you send home, you have to assume is being AI’ed,” he said.
In late 2022, after ChatGPT launched, many schools initially banned AI, fearing it would be used to churn out term papers, compose presentations and farm out math homework. And even though such uses have come to pass, views on how to respond have shifted dramatically.
Like many concerned educators, Cuny is not calling for an AI ban. Instead, “AI literacy” has become a buzzword of the back-to-school season, with a focus on how to leverage the potential of AI while minimizing its risks.
Ultimately, students will need to know how to use AI effectively and ethically, said Denise Pope, a senior lecturer at Stanford who is the co-lead researcher of a long-term, ongoing study of student cheating.
“Let’s really look at what is the purpose of education,” Pope said. “What are the skills that kids will need to know when they get out of this sort of particular environment of school.”
Cheating was already happening
Researchers at Stanford, led by Pope and colleague Victor Lee, have concluded that the prevalence of cheating does not appear to be greater than before AI. What’s changed is the technology that underpins cheating.
In the Stanford study, which began well before the public availability of ChatGPT, students report anonymously on behaviors within the last month, including:
Looking at someone else’s answer during a test
Using crib sheets
Hiding textbooks in bathroom stalls and using bathroom passes during exams
Paying students from earlier periods to leak test questions to later test-takers
New behaviors include using AI to write all or parts of papers or using it to summarize books that the student will never crack open.
The Stanford researchers concluded that cheating was common before AI — and it remains so. It is the nature of cheating that is evolving.
“This year’s data is showing a decline in copying off a peer and it seems there is more use of AI instead,” said Lee, an associate professor at the Stanford Graduate School of Education.
In these surveys, about 3 in 4 students reported behaviors in the last month that qualify as cheating, figures similar to what was reported prior to AI.
To conduct the survey, researchers partner with individual schools across the United States to examine each school’s own cheating patterns for grades four and higher. The school-level data is then combined for cumulative analysis. Hundreds of schools have participated since 2009.
What to do about cheating
Given what AI can do, graded work that is completed at home — such as a book report or a five-paragraph essay — could become an assignment of the past.
Instead, teachers are moving to alternatives such as timed, in-class essays written by hand. But every potential solution has limitations. Some students underperform on high-stakes, timed tests, and many, having grown up keyboarding, lack fast and fluid handwriting. Some teachers, including Cuny, lock down classroom computers during a test to allow keyboarding but no internet access.
There’s also rapidly improving technology to defeat just about any cheating remedy: smart glasses, smart earbuds, special smartwatches and even smart pens with tiny screens that can scan test content.
The more elevated strategy, said Pope, is to address why students cheat.
Sometimes the issue is “overload in terms of work — homework or job responsibilities, taking care of family,” Pope said. Students can feel that an assignment is busy work or not understand its purpose. Also, the decision to cheat can be motivated by “how you feel about the teacher or professor.” There’s also the pressure to perform at all costs to make the team or get into college.
At least some of the whys of cheating can be tamped down, resulting in less cheating, Pope said.
“Learning can be fun and joyful, and I think we’ve conflated pain and suffering with learning,” said Michael Hernandez, an L.A.-area high school teacher and author, during a recent webinar for educators on AI and cheating. “Go back to the basics of what good learning is about. And it doesn’t mean going back to the basics of handwriting essays in class. It means going back to purpose, passion, agency, inquiry, curiosity and excitement.”
He added that if teachers set up an assessment system that expects the exact same answer from every student at the same time, the teacher is inviting cheating. Yet this is the practice involved in much of standardized testing — a fundamental tool to evaluate students, schools and often teachers.
When they can, teachers may want to choose assignments that are harder to cheat on. Experts cited examples such as performing a play or skit, or writing an article for a school newspaper (though AI can inform and improve the latter), as activities that can be part of the learning. Students also could be asked to give oral presentations without notes to show what they know.
To replace traditional tests, some experts want teachers to rely more on assigning group and individual longer-term projects and on building portfolios over time that demonstrate academic progress and in-depth knowledge.
Instructors “are the linchpin in whether AI enhances or undermines learning,” according to a new USC study. “Students are more likely to use AI in deeper, more educational ways when professors provide clear guidance.”
AI use expanding quickly
One recent analysis found that the share of 13- to 17-year-olds using AI doubled from 13% to 26% in one year, from 2023 to 2024, and experts believe the explosive increase has continued. Studies suggest that nearly all college students are using AI to some degree.
As with adults at work, AI has opened up strategies for students that save time and improve accuracy — which could include cheating. But educators are grappling with many nuanced scenarios.
If AI solves a calculus problem or writes an essay for a student, that would match just about any definition of cheating.
But what if a student did not understand the calculus lesson — and what if an AI explanation provided for one problem helped the student solve the next three on his own? What if the student integrated various AI answers into her own essay — but did not understand when it is acceptable to quote verbatim or what needs to be referenced to original sources? Is that a cheating issue or a learning challenge?
In January 2023, the New York City school system banned ChatGPT, citing cheating as a concern along with intellectual dependency and the accuracy and safety of content. The ban was reversed four months later. The district opted instead to manage the use of AI, including by providing educators with examples of how to use it to ease administrative tasks and improve teaching.