A year after an AI-plagiarism lawsuit at Hingham High School made national headlines, few schools in Massachusetts have set clear boundaries on how the technology should be used in class, often leaving teachers and students to navigate their own paths. Some students are using apps such as ChatGPT to write whole assignments, while teachers are left paying for their own AI-detection software or scrambling to figure out ways to use it productively in class.

In Hingham, a student was accused of using AI to write a history assignment and was kept out of the National Honor Society. His parents sued, alleging the school had no guidelines prohibiting AI usage. A judge initially ruled in favor of the school, and the case was dismissed in February.

In August, the Department of Elementary and Secondary Education put out guidelines for the responsible use of AI in classrooms. But the Massachusetts Association of School Committees, which develops model policies for districts, is still crafting its recommendations. Of 300 districts statewide, the association has seen at least nine adopt AI policies. The policies typically offer guidance for incorporating AI into classrooms and insist on ethical uses.

Meanwhile, the Trump administration is urging greater use of AI in classrooms through an executive order. And Massachusetts rolled out a pilot program this fall to help incorporate AI into curriculums in 30 school districts.

The new landscape requires more ground rules to prevent cheating, so students don’t lose their ability to develop new skills, teachers said.

Robert Comeau, an AP literature teacher at the John D. O’Bryant School of Mathematics and Science in Boston, is paying out of pocket for AI-detection software called Draftback. The tool, which costs $6.99 a month or $40 a year, analyzes a student’s Google document revision history to count the keystrokes used to write an essay and highlights any content that was quickly pasted into the assignment.

“Trust is such an important aspect of teaching,” Comeau said. “If this is making us not trust each other, that’s a real problem.”

But Comeau also sees AI tools as potentially useful for helping kids learn. He had students use Google Gemini to find relevant quotes from the book “Frankenstein” before they sat down to write an essay. And at a national AI symposium hosted by the American Federation of Teachers, Comeau learned how to set up an AI-powered “Socrates bot.” The bot won’t give students direct answers but instead responds with another question to inspire critical thought.

Comeau said he welcomed Boston’s participation in the state-run pilot, which has already offered guidance about safe uses of AI tools, like Google Gemini and Google Classroom. But teachers are still struggling with detecting plagiarism, he said.

In a statement to the Globe, Boston Public Schools said the district is excited to be part of the state’s pilot, which “aligns with BPS’s commitment to ensure educators are equipped with the knowledge and skills to teach students how to use and integrate AI responsibly into their learning as they continue to expand their creative and critical thinking skills.”

The AI pilot is just one part of Governor Maura Healey’s $100 million AI initiative to stimulate government and private sector efforts in the same way state funding helped build the state’s life sciences sector.
The schools pilot is spearheaded by the Massachusetts STEM Advisory Council and the Massachusetts Technology Collaborative in partnership with the national nonprofit Project Lead The Way. It aims to teach about 1,600 students the fundamentals of machine learning as well as the “societal implications of AI.”

Most states have issued AI guidelines for schools, but only a handful, including Massachusetts, have established their own pilot programs. Sabrina Mansur, who heads up the state’s AI push as director of the Massachusetts Artificial Intelligence Hub, said the goal is to make the state a leader in “the responsible use of AI.”

When it comes to plagiarism, only 18 percent of teachers said they were getting help detecting whether students submitted AI-generated work, according to a nationwide survey by the digital rights nonprofit Center for Democracy and Technology in Washington, D.C. And only 15 percent said they knew how to respond if they suspected AI was being used “in ways that are not allowed.” Fewer than one-third of teachers said they received guidance on how to use AI tools effectively.

Students also lack direction on how to use AI in school. Many teenagers are hungry for more resources and more conversation around the proper uses of AI. Almost nine out of 10 students have already tried AI for schoolwork or personal uses, according to the survey.

“I think more in-depth discussion of it would be helpful because there are good ways and bad ways to use it,” said Noe Voskuil, a junior at Lexington High School. Using AI improperly in school can be “harmful” to developing “critical thinking skills,” she said.

Curious to learn more, Voskuil joined the Student Publication and Research Collaborative, a group of students who interview other kids on their views about AI and publish their findings in academic journals. The collaborative started in an English class after students wanted to go beyond an assigned research project on AI and keep the conversation going. The group’s first study found that student attitudes toward using ChatGPT tended to be colored by various factors, including whether they believed it could help learning or improve their grades.

“We haven’t had a bunch of actual training on how to use AI [in school],” Voskuil, 16, said. “Mostly it’s a few sentences at the beginning of the year [from teachers] either saying don’t use AI, or you can use AI for x, y, and z, and then that’s the last they talk about AI for most of the year.”

Lexington’s school department did not respond to a request for comment.

Teachers won’t be able to satisfy students’ curiosity and deal with the challenges of AI without more help, said Jessica Tang, president of the American Federation of Teachers in Massachusetts. “The use of AI in schools cannot be solved or monitored by educators alone,” she said.

When it comes to developing and implementing policies around AI in classrooms, “schools are playing catch up,” said Elizabeth Laird, director of equity and civic technology at the Center for Democracy and Technology. “The adoption of this technology, in many cases, is ahead of the policy.”

In Massachusetts, teachers in participating districts of the statewide AI pilot program received training on how to implement the new curriculum. The Department of Elementary and Secondary Education also created an optional AI Literacy for Educators course that certifies teachers have a basic understanding of artificial intelligence.
A bill introduced in the state Legislature earlier this year by Senator Jake Oliveira would establish a committee to study how AI is used in schools, advise on best practices, and suggest whether it could be “regulated better,” Oliveira said.

While some teachers who spoke with the Globe were wary of rigid state mandates, they said they craved more discussion among educators and school officials. For Erik Berg, president of the Boston Teachers Union, that means creating more “common sense guidelines” around the use of AI, “both for our students and by our students.”