What If the Problem Isn’t the Tools—But Our Literacy Around Them?
A secondary English teacher reviews student essays. Three are suspiciously polished. Two feel oddly generic. She wants to trust her students, but something feels off—and she has no framework for the conversation she needs to have.
For many educators, the conversation about AI in schools has felt rushed and reactive. New tools appear weekly. Policies lag behind practice. Teachers make judgment calls in real time—often without shared language or frameworks to guide them.
In a recent conversation with Med Kharbach, educator and founder of Educators Technology, we explored a reframing: what if the real challenge isn’t AI itself, but our literacy around it?
From Web 2.0 to Today’s Classroom Reality
Med’s work with educators stretches back to the early days of classroom blogging and Web 2.0 tools. Educators Technology grew from a personal blog into a globally trusted resource because it focused on pedagogy over novelty.
That lens matters now. Today’s generative AI tools don’t just speed up tasks—they shape how students write, research, and express thinking. Unlike calculators or spell-checkers, AI can generate entire essays, create study guides, and mimic original thinking.
This shift requires more than approved app lists. It requires educators to develop AI literacy as core professional practice—just as we’ve done with digital literacy and media literacy.
What AI Literacy Really Means for Educators
AI literacy isn’t about coding or technical expertise. It’s about making informed, ethical, and pedagogically sound decisions. This means:
- Understanding what AI tools can and cannot do
- Knowing how prompts shape outputs
- Recognizing bias, limitations, and data concerns
- Making intentional choices aligned with learning goals
A Practical Starting Point
Rather than chasing dozens of tools, Med encourages educators to master one chatbot and use it well. Strong general-purpose options include:
- ChatGPT
- Gemini
- Claude
Example: A Secondary 4 history teacher might use one of these tools to generate primary source questions at different reading levels, create Socratic dialogues between historical figures, or brainstorm differentiation strategies—understanding how the tool works, when to use it, and where human judgment must override automated output.
A Simple Lens for Evaluating AI Tools
When considering specialized education tools, ask:
- Privacy & data: What’s collected? Where is it stored?
- Age appropriateness: Suitable for your students’ developmental stage?
- Transparency: Can users understand how outputs are generated?
- Workflow support: Does it genuinely save time or improve learning?
Often, thoughtful prompting in a general-purpose chatbot achieves the same outcomes as niche tools—with greater control and flexibility.
Assessment: Why Design Matters More Than Detection
AI detectors produce false positives and false negatives. They create adversarial relationships. Most importantly, they don’t address the real question: Did the student learn?
Med’s position is clear: detectors aren’t the solution—assessment design is.
Designing for Visible Thinking
Instead of proving whether AI was used, design tasks that make thinking visible:
- Oral explanations – “Walk me through your reasoning”
- In-class problem solving – Low-stakes practice with teacher observation
- Portfolios and journals – Track growth over time
- Process documentation – Show the messy middle of learning
- Reflective prompts – “What did you try first? Why this approach?”
Example Redesign:
Instead of: “Write an essay on the causes of WWI” (easily AI-generated)
Try: “Using our class sources, argue which cause was most significant. Tomorrow, defend your choice in a Socratic seminar.”
This supports a hybrid assessment model aligned with Quebec’s competency-based approach: students may use AI for formative learning and feedback, while teachers create moments to assess independent understanding. The goal isn’t restriction—it’s clarity about what matters and when.
Keeping the Human Core of Teaching Front and Center
Technology doesn’t replace teaching—it reshapes where teacher expertise matters most.
AI can: draft, summarize, scaffold, and generate examples.
But it can't: read a classroom's emotional climate, build trust, notice hesitation or curiosity, or make ethical judgment calls about what a specific student needs.
That’s the opportunity. When used thoughtfully, AI helps educators reclaim time for meaningful feedback, authentic relationships, and thoughtful learning design—the “What if you pushed this idea further?” conversations that truly matter.
Moving from Panic to Purpose: Getting Started
This moment calls for professional literacy, shared language, and intentional design.
Three Actions This Week:
- Experiment personally – Spend 30 minutes using one AI chatbot for a task you’d normally do (rubric creation, discussion questions). Notice what works and what doesn’t.
- Start a conversation – Ask a colleague: “Have you tried using AI for lesson planning?” Share this article with your team.
- Redesign one assessment – Pick an upcoming assignment and ask: “How could I make student thinking more visible?”
For School Boards and Administrators:
Professional development should focus on pedagogy first, tools second. In line with Quebec’s focus on competency development and the Référentiel de compétences professionnelles, create space for teachers to share experiments, develop shared norms around AI use, and collaborate on assessment redesign.
The question isn’t whether AI belongs in education.
It’s how thoughtfully we choose to use it.
By focusing on literacy over lockdown, and design over detection, educators can integrate new tools in ways that strengthen the work that matters most: helping students think, create, and grow.