Generative AI is robbing students of critical thinking — and handing Big Tech more power over how knowledge is created.
That’s the warning from Kimberley Hardcastle, an assistant professor of business and marketing at Northumbria University.
In a piece for The Conversation, she argued that universities are distracted by plagiarism concerns while missing the deeper shift: students are outsourcing thought itself.
Students are skipping the hard part
Hardcastle pointed to data from Anthropic, the company behind Claude, which analyzed about one million anonymized conversations over an 18-day period in April.
After filtering to 574,740 education-related chats tied to verified university email accounts, the company found that 39.3% of student interactions involved creating or polishing educational content like essay drafts, practice questions, or study summaries. Another 33.5% asked Claude to solve assignments directly.
In doing so, Hardcastle said, students are delegating key parts of the learning process to machines.
“Students can produce sophisticated outputs without the cognitive journey traditionally required to create them,” she wrote.
She said the risk is that students are starting to validate ideas by how convincingly AI explains them rather than through their own independent analysis.
From learning to outsourcing to Big Tech
For centuries, education relied on teachers guiding students through messy reasoning and debate.
Generative AI is upending that, she said, by producing instant, authoritative-sounding answers that blur the line between original thought and machine-assisted shortcuts.
Hardcastle called it an “intellectual revolution,” one where traditional skills like weighing evidence or evaluating sources risk being sidelined when the “source” is a black-box algorithm trained on data no one can fully see.
And what makes this shift more alarming, she said, is who controls it: a handful of tech companies now own the pipelines of knowledge, with their biases, design choices, and commercial incentives shaping what students learn and how they learn it.
It's not the first time tech has shaped how we think, Hardcastle said: social media already exploited attention for profit. But this time the stakes are higher, she warned. It's no longer just about what distracts us, but about how we think.
“It risks granting power over how knowledge is created to the tech companies producing generative AI tools,” she said.
Universities can’t sit back
Most universities are still reacting to surface issues such as catching plagiarism, tweaking assessments, and teaching AI literacy, but that's not enough, Hardcastle said.
The real task, she said, is ensuring pedagogy, not profit, defines how AI is used.
She did highlight some progress, however, such as centers for responsible AI, including one at her own institution, that are working to put educators in the lead.
But without deliberate action, Big Tech could end up deciding what knowledge looks like for the next generation, she warned.
“Generative AI isn’t just a sophisticated calculator,” Hardcastle said, “it changes how we understand knowledge.”