Here are my thoughts regarding the adoption of AI technologies into K-12 education. I’ll start by providing some background and context for my perspectives.
Context: I began teaching in 1986. For technology, we had overhead projectors, 16 mm film projectors, slide projectors, opaque projectors, and ditto machines. Copy machines were several years away, as were laserdiscs and microcomputers. The Internet and cell phones were science fiction. Amazingly, we were still able to teach kids.
Context: I served as an instructional technology coordinator at both the district and school levels, managing and supporting all school instructional technologies for 12 years. I have also designed and provided professional development experiences for educators on various educational technologies for several decades. My career in and around education spans 39 years.
My perspective: Use technology when it advances and improves the educational experience, whether for teachers, students, or both. If it doesn’t, don’t.
Disclosure: I have been using artificial intelligence for almost two years. I have used it in my design work creating school learning spaces. Over time, I have identified areas where the tool provides value, and other areas of my work where it doesn’t.
Disruption
I’m weary of the extensive handwringing over AI this past year. This is typical of any emerging technology in education, but it seems much worse with AI. It’s exhausting, and in many cases hyperbolic and counterproductive.
Since it's a disruptive technology, it presents a threat. Its use challenges the status quo of schools and what teachers have always done with their students. Its use challenges the traditional culture of schooling, just like the Internet did, just like 1:1 laptop programs did, and just like the technologies that come after AI will do.
Systemic thinking and planning
AI use in schools is a complex issue, involving a web of interconnected factors that confound adoption: the methods behind model development, training bias, the emergence of awful Nazi chatbots, the environmental impact of AI use, AI literacy, misuse, overuse, no use, cognitive offloading, over-dependency, and the list goes on. Addressing this complexity requires the development of a systemic strategic plan that includes a robust process for inclusive community engagement, prototyping, the development of policies and guidelines, budgeting, classroom application, professional learning, and the evaluation of the use and its implications.
However, it has been my experience that many schools struggle to develop strategic plans, especially for emerging technologies. While they are capable of managing complicated tasks like curriculum development or master scheduling, these are familiar processes with established routines. Applying AI to the educational experience of school, however, is an entirely different animal. This requires engaging with uncertainty and unfamiliar ethical considerations, along with an understanding of a technology that changes daily. That’s complex.
Making matters more difficult, most school planning still happens within traditional systems and constraints. These systems, and the processes embodied within them, weren’t designed to address disruption at the scale that AI use potentially represents. Meeting the challenge of AI adoption requires broader and more divergent thinking, more innovative and inclusive planning and design, and an unwavering willingness to operate outside the conventional and traditional boundaries of school.
And as if things weren’t already challenging enough, schools are operating in a political climate where they are staring down massive funding cuts while the person overseeing U.S. education seems more likely to confuse AI with a steak sauce than to offer any meaningful leadership. In that context, along with all the other factors I just mentioned, it's no surprise that successful AI adoption and use will be highly challenging for most schools.
In response to the complexity of AI adoption, many schools have opted for turn-key AI solutions where the thinking, planning, and decision-making have already been done for them. It’s an easy way to check the box: We have AI. But this approach often bypasses the deeper work of building capacity and shaping AI use in ways that align with a school’s unique community values, student needs, teacher engagement, and other essential contexts. It denies a school and its community the opportunity to grow with the technology, to try things and course correct, to improve in real time, and to create an environment of understanding, ownership, and strategic applicability. It contradicts the fundamental mission of schools as places of learning and wastes a crucial opportunity to cultivate a stronger, more inclusive school culture.
Use?
I don’t use AI to write blog posts or reports, but I may ask it to rework a paragraph I’m struggling with. In many cases, it returns a pretty good response that I can use, though I’m under no obligation to use anything it gives me. When I do use it, my process always starts with entering my own writing into ChatGPT and asking it to improve the passage. I don’t expect it to write for me from scratch, but I am comfortable with it reworking what I have given it. I think that is acceptable, and something kids should be given the opportunity to try with the guidance of professional educators.
After all my use, I’m surprisingly still capable of critical thinking, even though many authors have written about the death of my critical thinking due to AI use. I’ve had to develop new dimensions of that thinking so I can deal with AI deepfakes and hallucinations. Honestly, I’m most worried about over-dependence, which I consider a significant issue. It’s too easy to reach for AI for all sorts of purposes, and I have to be mindful of that. Over time, though, I have reached a point where I understand how and when to use AI, and my use reflects a healthy balance between AI and my own intellectual work. Shouldn’t students have the same opportunity to engage in this negotiation and learn from it?
Let’s be honest. Technology in schools has always been optional, which is just another way of saying, “Feel free to ignore it.” That said, good things can happen in classrooms where technology is never used. But does that serve the future of kids, especially when other teachers are using technology? Such uneven expectations create technology haves and have-nots among the student population, producing a school experience in which students are not prepared equally.
The current focus of AI implementation in schools centers on improving teacher productivity and developing AI tutoring systems for students. These efforts apply AI within existing practices and frameworks, essentially trying to fit new capabilities into old models. To me, that feels like simply finding a place for AI within the status quo rather than rethinking what’s possible. It fits the familiar educational mindset of technology integration, where the technology must find its place within the current practice of school.
Even more concerning about AI adoption is education’s continued reliance on outdated constructs, such as Bloom’s Taxonomy (which I first encountered in 1985) and models like TPACK and SAMR, to frame thinking about AI. These models had their place, but they weren’t built for the capabilities or challenges of AI. Yet instead of imagining what education could become, we continue to force new technologies into outdated thinking, clinging to familiar constructs rather than designing more contemporary and useful models.
This limited framing not only constrains potentially innovative approaches, but also leads to narrow assumptions about what AI should do in schools. The first and most common assumption about teacher use of AI is that it will save time on tasks such as lesson planning, grading, and other administrative responsibilities. In my opinion, that perspective reflects a simplistic and unimaginative view of technology's relationship to teaching and learning, especially when considering the potential of artificial intelligence. If you believe productivity is the primary benefit of AI, then tell me you haven’t really thought deeply about AI use in education without telling me you haven’t really thought deeply about AI use in education. And if you still think productivity is the big benefit, read this research.
And if that’s not enough, the next knee-jerk reaction is even more telling: “The kids will cheat.” Yeah, maybe they will. But let’s be honest. They’ve always cheated when given crappy assignments that feel like busywork, especially when the teacher's only interaction with the students concerning the assignment is when it is assigned and when it’s collected. If your first instinct is to worry about cheating, what you’re saying is that you don’t trust your students. And that’s a bigger problem than AI. A better starting point? Collaborate with your students. Help them understand how AI works, what it can do, and how it can support their thinking and school work. And design assignments that are actually interesting, relevant, and worth doing.
Imagine a school where students co-design AI policies, where assignments are built around inquiry and co-creation with AI, and where teachers are supported to develop new literacies alongside their students.
However, one of the most common excuses is that teachers aren’t ready for AI because they don’t understand it, and their school hasn’t provided training. Honestly, that’s hard to swallow from a profession that prides itself on being made up of lifelong learners. Since when did learning become something educators wait to be handed? You don’t need a PD day, Technology Tuesdays, or a Lunch and Learn to start figuring out AI. Open a laptop, ask a question, explore. Challenge yourself as a professional learner to figure it out. Work with others to do this. That’s what learning looks like, or at least, what it should look like.
Which leads me to this: where is the curiosity and imagination in education? Don’t you have questions about AI and classroom teaching and learning? Aren’t you excited about the potential of AI and sharing it with your kids? Don’t you want to understand the complexities and nuances of AI so that you can help your kids understand the benefits and challenges of the dominant technology of their lifetimes? What do you wonder about? What are you curious about? Couldn’t you figure this out with the help of your kids? What a staggering opportunity for educators and their students! But that opportunity appears lost when the focus is on cheating and on how AI can help teachers write better lesson plans.
Am I being unfair to schools? Maybe, but maybe not. I realize that there are significant issues associated with AI use in schools. But that’s been true of other technologies. Honestly, schools face a wide range of issues, from budgets and funding to teacher recruitment and retention, and lingering impacts of the pandemic, among others. Schools certainly don’t have it easy.
So, where is all of this headed? My guess is that AI use in schools will follow the traditional trajectory of other technologies. Use will be uneven and left up to the individual teacher: some teachers will use AI and others won’t. Some use will be effective and beneficial; most will be low-level use that supports current practice. There will be interesting work by some teachers, with serious and thoughtful use designed to better understand AI and how it supports teaching and learning. The student experience will be variable, with some students getting exposure to AI and others not, based on the luck of the scheduling draw and their teachers’ perspectives on classroom AI use. But for the most part, the textbook, assigning the odds in math class, and the five-paragraph essay will live on, continuing to threaten the relevance of schools in a world rapidly reshaped by AI.

Without a strategic, system-wide reimagining of what learning looks like in an AI-rich era, schools risk becoming more disconnected from the tools, skills, and ways of thinking that define the present and will undoubtedly determine the future. If schools continue to rely on outdated approaches and thinking, familiar curriculum and assignments, and fragmented implementation, they will not only miss an opportunity to evolve but also fail to prepare students for the complexities and demands of the world they are stepping into.