There's a lot of noise right now about AI in education—new tools, new threats, new opportunities. For every headline touting transformation, there's another warning of collapse. Depending on who you ask, ChatGPT is either a miracle tutor or the death of thinking. As usual, the truth lives in the weeds, and one thing is becoming increasingly clear: AI is not just a supplement to education. It's a force that's pushing us to reconsider the structure, purpose, and values of school itself. That's exciting and terrifying and long overdue.
What We Built—and Why It's Breaking
School has always been a bundle—instruction, supervision, socialization, meals, emotional development, credentialing—all wrapped up into one overworked, underfunded institution. Teachers have been expected to deliver all of it, usually with shrinking support and rising demands. We've patched and propped this system for decades, but AI didn't break it. What AI is doing is revealing just how strained and unsustainable it already was.
Because suddenly, we have something new. Something that can explain, scaffold, assess, and personalize. Something that never gets tired or distracted or pulled into cafeteria duty. Something that can, in theory, offload a significant portion of what we used to call "teaching." So what do we do now?
AI as the Great Unbundler
AI enables disaggregation. Students can now access high-quality explanations, feedback, and examples without waiting for class or raising a hand. A motivated student with a laptop and curiosity can get more targeted help in an afternoon than some get in a semester. But this kind of access is not evenly distributed, and motivation is not magic, nor is it developmentally guaranteed. Recent studies from MIT and Stanford have begun documenting how AI tutoring systems can provide highly personalized instruction, but they've also revealed significant disparities in who benefits most from these tools.
AI's presence reveals how much of education is built not on instruction, but on structure—on showing up, being seen, being part of a rhythm. Unbundling instruction from everything else doesn't solve the problem; it shows us how entangled and fragile our solution really was.
The Human Role Isn't Gone—But It's Changing
If AI can teach the content, what's left for the human teacher? The answer isn't "nothing," but it's not the same "something" we've always assumed. Human educators will need to become facilitators of complexity, mentors in a sea of information, curators of dialogue, and builders of trust. That's harder than it sounds, and it also means teachers need time to do that work.
Right now, we're adding AI to an already overloaded role, without stripping away the tasks AI could and maybe should take on. That's not transformation—it's acceleration toward burnout. Educational researchers like Cathy Davidson at CUNY and Justin Reich at MIT have been documenting this pattern for years: new technologies getting layered onto existing structures rather than fundamentally reshaping them. To make this shift real, we need to make choices about what we want humans to do and what we can finally let go.
The Trouble with Transformation
This isn't just a technical shift—it's a philosophical one, and philosophy, as always, is messy. Do we want schools to optimize for mastery or meaning? Will students engage with AI to deepen thinking or to shortcut it? Is AI making learning more personalized or more isolated?
There's no guarantee this goes well. Plenty of incentives push toward efficiency over empathy, cost-cutting over care. If we're not careful, we may find ourselves with AI-powered education that's cheaper and faster but emptier. We should also be wary of narratives that sell AI as the savior. It is a tool, and like all tools, it reflects the values of its users and designers. Garbage in, garbage out—but now at scale.
Equity: The Deepest Risk
The divide is already growing. Wealthier families are incorporating AI tutors, generative tools, and custom learning pathways into their children's routines. Meanwhile, many schools still block the same tools out of fear or policy lag. Teachers worry—reasonably—about cheating. Administrators worry about liability. Parents are split. This pattern mirrors what sociologists call "digital redlining"—the same technologies that could democratize access instead amplify existing inequalities when implemented without intentional equity frameworks.
But fear won't protect the students most at risk. In fact, it may leave them further behind. If AI becomes the new private tutor—available to those who are taught how to use it and forbidden to those who aren't—we're not heading into a tech utopia. We're accelerating a caste system of cognitive capital.
Equity doesn't mean shielding kids from AI. It means teaching them to use it critically, creatively, and with care. Anything less is malpractice. As danah boyd has argued in her research on algorithmic literacy, students need to understand not just how to use these tools, but how these tools work—and how they might be working on us.
What Comes Next?
The future of education won't be written by algorithms. It will be written in the decisions we make about how to live alongside them. We can use AI to automate more worksheets and standardize more learning, or we can use it to free teachers to focus on the hardest, most human parts of their job: listening, questioning, mentoring, and inspiring.
That choice is still ours—for now. But it requires us to move past the hype and the hand-wringing, and start doing the real work: rethinking the purpose of school, the shape of teaching, and the kind of society we want education to serve. Because whether we like it or not, AI is in the classroom. The question is whether we are, too.