As I round the corner toward a more serious and pragmatic consideration of retiring from teaching, I have begun to notice a disturbing pattern in the use and promotion of AI (Artificial Intelligence) in education, specifically the kinds of AI crafted by the likes of ChatGPT, Google’s Gemini, and Microsoft’s Copilot. These apps and AI agents have been “approved for use” in the school district where I teach, a district whose Board Office IT department has grown out of proportion to the day-to-day operational needs of the schools it purports to serve. From the district level down to my own administrator, teachers are increasingly urged to use these AI tools in the classroom and in their communications, organization, and scheduling. The nagging question at the back of many teachers’ minds is: just how much of what is taught will eventually end up as AI-generated content, and when will teachers be replaced by lower-paid software troubleshooters and “human connection” guides in the classroom? Perhaps a more distant future will see little or no need for human interaction in the traditional academic learning environment. That future may already be upon us: a profit-driven industry disguised as the natural evolution of education, available in full to anyone with a bank account and in limited form to everyone else, much like the MOOCs (Massive Open Online Courses) that preceded it.
While AI has the fascinating ability to automate much of the administrative work of teaching, such as building lessons, quizzes, rubrics, and exams, writing emails, and generating personalized feedback, assessment, and directions for further learning, what cannot be overstated is how AI undermines the actual process of developing a way and habit of learning unique to the individual. We should not fool ourselves into believing the hype about how much time AI will save teachers in marking, developing lessons, and individualizing assessment, or how much time it will free up for reflecting on pedagogy and practice. Without realizing it, a teacher who takes full advantage of AI, and takes it for granted, will succumb to the collective pedagogy and “best practice” that AI has scraped from the internet: a soup whose base is the same for everyone, served without guardrails to protect users from “flavours” designed specifically to maintain engagement and grow dependency. This is the extended proliferation of the AI slop we see in advertising banners and in what YouTubers and influencers dump onto the net to keep us engaged and the cash flowing. As a tool for profit, AI agents can invest for you, build a website, populate your feed with AI subscribers, create online workshops, podcasts, and seminars, and supply their own pool of AI influencers to “support your goals,” all with AI-generated characters that are quickly becoming indistinguishable from real people. It is operant conditioning on steroids.
You can wrap AI up in a very attractive package, but beyond the paywall are hordes of artificial creations with no lived experience that would otherwise ground them in the daily function of a classroom with living, breathing students, all with different approaches, ideas, distractions, and needs. The study Feedback from Generative AI (Jansen et al., 2025) shows that although AI offers time-saving options for teachers, the prior studies it reviews found that between 20% and 70% of students do not even bother to revise their work after receiving individualized AI-generated feedback. The study itself found that half of the 14,236 students it examined across grades 1–12 did not change a single character after receiving generative AI feedback. Much of the latest research focuses on generative AI feedback and how much students actually engage with it; one study found that 71% of students did not revise a single word when given feedback on their written work in English as a second language.
In responses to surveys conducted by the College Board between June 2024 and June 2025, three trends emerged in relation to generative AI: 69% of students reported using ChatGPT to help with school assignments and homework; students, parents, teachers, and administrators all agree that there are both benefits and risks in using generative AI tools in education; and only a minority of schools are developing official GenAI (Generative Artificial Intelligence) use policies. This speaks to the relative weight of issues that are not being resolved as fast as the companies are developing and selling these “tools” to education.
In a survey done by KPMG on AI use in Canadian schools, the emotional cost to students is revealed as a detrimental factor:
“65 per cent say they feel that they are cheating when they use generative AI
63 per cent worry they will get caught by their educator/educational institution for using generative AI tools without their knowledge.”
This may be accounted for by the early use of AI-detection tools by teachers and professors to ensure that students were not plagiarizing, but new GenAI tools allow students to have AI write essays in subtle styles that mimic their own previous original work, or to prompt the AI agent to sidestep the risk of plagiarism detection altogether. Meanwhile, GPT-4 has passed the bar exam with a score well above the human average. While this might show how well AI can perform on specific data-driven tasks, it also illuminates the need for strict use policies and guardrails governing when and where AI’s algorithms should be restricted.
Perhaps the quietest yet most damning effect GenAI has in the classroom is the looming proliferation of “learning slop” that simply does not get acted upon, or even read, by students. Add to that the cyclical nature of students using AI to produce work that is then assessed by AI in an effort to save educators time and encourage personalized feedback, and the result is a delivery-and-response system in education where neither the teacher nor the student has any incentive to engage in or develop a conscious process of learning. AI will imperceptibly homogenize delivery methods according to the programmers’ intentions and, with advanced LLMs (Large Language Models), GenAI may slow down or take over the evolution of human language. The functioning of society will change as AI begins to influence how we develop even soft skills such as problem-solving, interacting, and non-verbal communication. Intangible literacy, like reading someone’s body language, may already be a skill that our younger generations will need to be taught in a formal classroom setting. AI may create a paradigm shift in what schools teach and in what values will sustain humanity.
References
Jansen, T., Horbach, A., & Meyer, J. (2025). Feedback from generative AI: Correlates of student engagement in text revision from 655 classes from primary and secondary school. In Proceedings of the 15th International Learning Analytics and Knowledge Conference (LAK ’25) (pp. 831–836). Association for Computing Machinery. https://doi.org/10.1145/3706468.3706494
College Board. (2025, October 6). New research: Majority of high school students use generative AI for schoolwork. https://newsroom.collegeboard.org/new-research-majority-high-school-students-use-generative-ai-schoolwork
Katz, D. M., Bommarito, M. J., Gao, S., & Arredondo, P. (2024). GPT-4 passes the bar exam. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 382(2270), 20230254. https://doi.org/10.1098/rsta.2023.0254