N.J. professor: AI won’t replace learning if we prepare students to lead, not cheat


2025-11-10

Copyright NJ.com

By Maurice J. Elias

AI has the potential to short-circuit the learning process for many students by giving them a tool that will do much of their work for them. To avoid this, schools must adopt clear value statements about learning and academic integrity and ensure that students have the social-emotional competencies to use the technology in prosocial ways “even when no one is looking.” Neither the requisite value statements nor an emphasis on students’ social-emotional competencies is the norm in schools at present.

Putting the emphasis on students is not an example of “blaming the victim.” Most technological innovations — and certainly AI — should be conceptualized as “operator dependent.” A clear way to understand this is to consider a Stradivarius, certainly an innovation in the creation of violins. Stradivari are operator dependent — the sound they produce depends on the human being “operating” the instrument. Different humans will generate different sounds from the same instrument. It is therefore appropriate to consider the instrument and the player as a single unit of analysis. Evaluating the impact of either depends on the joint impact of both.

The same has been true throughout the history of technological innovation — writing, the printing press, the telegraph, radio, the telephone, and smartphones come quickly to mind — and it is true for AI. We must look not only at the innovation but also at the “operator.” Characteristics of the operator tend to be neglected in considering the impact of innovations, particularly those that are technological in nature. This cannot be the case with AI. Education systems are far from embracing the systematic teaching of the skills needed to prepare students to use AI constructively and ethically, which include empathy, problem solving, emotional self-control, the capability to work in many different kinds of groups, organization, focus, and emotional self-awareness.
Further, we cannot assume that children — or anyone — will direct their social-emotional skills toward prosocial ends. As Theodore Roosevelt said in a speech in Harrisburg, Pa., on Oct. 4, 1906: “To educate a person in mind and not in morals is to create a menace to society.” Martin Luther King Jr. updated this in 1947: “The function of education, therefore, is to teach one to think intensively and to think critically. But education which stops with efficiency may prove the greatest menace to society. The most dangerous criminal may be the person gifted with reason, but with no morals.”

We can anticipate much about the potential trajectories of AI use by examining cyberbullying. Cyberbullying involves the use of social media technology to spread harmful, often vindictive, degrading, insulting, and false information about other people. Often, those individuals are members of “protected classes” (such as LGBTQ+ people, people with disabilities, and racially minoritized students) who are reluctant to disclose what is happening to them. Estimates are that almost 30% of students in the United States have experienced cyberbullying at least once.

Responses to cyberbullying include monitoring children’s use of technology and social media, but what matters most is children’s moral compass and social-emotional problem-solving skills. Do they understand that what they are doing is harmful and wrong? Do they understand the short- and long-term consequences of their actions for others, as well as the risks to their own reputation, freedoms, and privileges? Do they perceive why they are acting toward others in these unkind and disrespectful ways? Do they know their own emotions and their own goals? Evidence suggests that the answer to these questions is most often “no.”

We also must ask where they got the idea that abusing others is a good and reasonable thing to do. Is bullying tolerated in their classroom and school environments?
Have they heard messages from influential adults in their lives that certain individuals “deserve” to be maltreated because they are somehow “less than” others?

Looking ahead, ensuring that AI is used with integrity and in ethical ways represents at least as great a challenge as implementing its technology. The human operators of AI-infused learning systems must learn to use them with academic integrity — to find information and arrive at answers themselves, not to have them generated by AI engines. This requires schools to invest in systematic efforts to improve their humane culture and climate, articulate core values around honesty and integrity, and build students’ social-emotional problem-solving skills and their application across academic subject areas.

Let’s equip all of our students to pick up the Stradivarius of AI and use it to make their own unique, beautiful music.