Adapted for secondary education in the United States from Oxford University's "Use of generative AI tools to support learning"
The Challenge We Face
AI tools like ChatGPT, Claude, and Grammarly are already in your students' hands—whether you've acknowledged it or not. Some students are using them thoughtfully to enhance their learning, others are using them to shortcut assignments, and many are simply confused about what's appropriate. Rather than trying to ban these tools entirely (an increasingly difficult proposition), educators need frameworks for helping students use AI responsibly while preserving the learning objectives that make education meaningful.
The key insight from Oxford's research is that AI use exists on a spectrum. Some applications clearly support learning, others clearly undermine it, and many fall into a gray area that requires careful consideration of your specific learning objectives. Your role as an educator is to help students navigate this spectrum thoughtfully, establishing clear expectations while teaching them to make good decisions about when and how to engage with AI tools.
Developing Your AI Policy Framework
Start with Learning Objectives
Before establishing AI policies for your classroom, clarify what you're actually trying to assess. Is this assignment meant to evaluate students' ability to organize information, their understanding of content, their writing skills, or their ability to think critically? Different learning objectives suggest different approaches to AI use.
For assignments focused on content mastery, some AI assistance with organization or clarity might be perfectly appropriate—the goal is demonstrating understanding, not perfect prose. But for assignments designed to develop writing skills or assess individual comprehension, AI use becomes more problematic. Being explicit about these distinctions helps students understand not just your rules, but the reasoning behind them.
Three Categories for Classroom Discussion
When introducing AI policies to your students, consider framing the conversation around three categories that you can adapt to your specific context:
Generally Appropriate Uses are those that enhance learning without replacing the intellectual work you want students to do. These might include using AI to brainstorm topics, explain difficult concepts, check grammar, or generate study questions. The key characteristic is that AI serves as a learning partner—helping students engage more deeply with material rather than avoiding that engagement.
Requires Permission covers the gray area where appropriateness depends on your specific assignment and objectives. This might include getting help with organization, using AI for translation assistance, or having AI walk through problem-solving steps. By establishing this middle category, you create space for nuanced conversations about AI use rather than forcing everything into binary allowed/forbidden categories.
Generally Inappropriate includes uses where AI replaces the work that's meant to demonstrate student learning—having AI write essays, complete homework assignments, or solve problems without student input. This category also includes any use during assessments unless explicitly permitted.
Supporting Academic Reading with AI
One of the most promising applications of AI in education is supporting students' development as readers of complex texts. However, there's a crucial difference between using AI to enhance reading comprehension and using it to avoid reading altogether.
The most effective approaches involve students doing their own thinking first, then using AI to check and expand their understanding. For example, after students read a challenging article, they might create their own list of key terms and main points, then ask AI to do the same and compare results. This process helps students see what they understood well and what they missed, while ensuring they've done the primary work of engaging with the text.
You might also have students use AI to generate discussion questions about assigned readings, which they then answer to deepen their understanding. These AI-generated questions often reveal connections and implications students might not consider on their own, but the value comes from students wrestling with the questions, not from AI providing answers.
For texts with particularly dense or technical language, AI can help by rephrasing complex passages in simpler terms. This is especially valuable when working with primary sources or academic papers. However, encourage students to check their understanding against the original text, as important nuances can get lost in the simplified version.
Consider having students write their own summaries of readings, then compare them with AI-generated summaries. This comparison helps students identify gaps in their comprehension while giving them practice in distilling complex information—a crucial academic skill.
Teaching Critical Evaluation Skills
Perhaps the most important lesson you can teach about AI is how to critically evaluate its outputs. AI systems can make mistakes, misinterpret complex texts, or fabricate plausible-sounding but inaccurate information. Teaching students to verify AI responses against original sources and reliable references isn't just about catching errors—it's about developing information literacy skills they'll need throughout their lives.
Make verification part of your explicit instruction. When students use AI for research or explanation, require them to fact-check key claims against multiple sources. When AI provides definitions or explanations, have students compare them with authoritative sources. This verification process itself becomes a valuable learning experience that develops critical thinking skills.
Practical Implementation Strategies
Start Small and Iterate
Rather than trying to develop comprehensive AI policies immediately, consider starting with one or two clear guidelines and expanding as you gain experience. You might begin by allowing AI for brainstorming and grammar checking while prohibiting it for final draft writing, then adjust based on how students respond and what you observe about the impact on learning.
Make Expectations Explicit
Students need clear guidance about when and how AI use is appropriate in your class. Consider providing specific examples: "You may use AI to help brainstorm essay topics and to check grammar, but the ideas, arguments, and writing should be your own." Ambiguity breeds anxiety and inconsistent application.
Require Documentation
When students use AI, ask them to document how they used it. This might be as simple as a note at the end of assignments: "I used ChatGPT to help brainstorm potential thesis statements and Grammarly to check for grammatical errors." This documentation serves multiple purposes—it promotes transparency, helps you understand how students are engaging with AI, and encourages students to be intentional about their AI use.
Address Equity Concerns
Not all students have equal access to AI tools or equal comfort using them. Consider how your AI policies might advantage some students over others. You might need to provide in-class time for students to experiment with AI tools, offer alternatives for students without access, or adjust expectations based on varying levels of AI literacy.
Managing Common Challenges
The Detection Question
Many teachers wonder whether they should use AI detection tools to catch unauthorized AI use. The research suggests these tools are unreliable and can produce false positives that unfairly penalize students. Instead of focusing on detection, consider designing assessments that are naturally resistant to inappropriate AI use—emphasizing process over product, requiring personal reflection, or incorporating in-class components.
When Students Cross Lines
When students use AI inappropriately, treat it as a learning opportunity rather than just a disciplinary issue. Help them understand how their AI use prevented them from achieving the learning objectives of the assignment. Often, students who misuse AI haven't considered the difference between getting a task done and actually learning from it.
Adapting Existing Assignments
Many traditional assignments can be modified to work effectively in an AI-enabled environment. Instead of eliminating essay assignments, you might require students to submit their brainstorming notes, outline drafts, and revision process along with final papers. Instead of avoiding research projects, you might explicitly teach students to use AI for initial exploration while requiring them to verify and synthesize information from primary sources.
Professional Development Considerations
Developing effective approaches to AI in education requires ongoing learning and adaptation. Consider collaborating with colleagues to share experiences and strategies. What works in English class might not work in math, and what's appropriate for ninth graders might not suit AP students.
Stay informed about developments in AI technology and educational research on AI use. The landscape is evolving rapidly, and policies that make sense today might need adjustment as tools become more sophisticated or as we learn more about their educational impact.
Most importantly, remember that your students are often experimenting with these tools whether you know it or not. Creating space for open dialogue about AI use—its benefits and limitations—helps students develop the critical thinking skills they need to navigate an AI-enabled world responsibly.
Moving Forward
The goal isn't to eliminate AI from your classroom—it's to harness its potential while preserving what's most valuable about education: the development of students' capacity to think critically, communicate effectively, and engage meaningfully with complex ideas. This requires thoughtful policies, clear communication, and ongoing dialogue with students about how to use these powerful tools responsibly.
By approaching AI as an opportunity to clarify learning objectives and teach critical evaluation skills, rather than just as a threat to academic integrity, you can help your students develop the skills they need to thrive in an AI-enabled world while maintaining the intellectual rigor that makes education worthwhile.
This guidance is adapted from Oxford University's "Use of generative AI tools to support learning." Individual schools and districts may have additional or different policies that take precedence over these general recommendations.