Schools are under pressure to develop student AI literacy. The message from governments, international frameworks, and employers is consistent: young people need to understand AI, use it critically, and navigate a world in which it is everywhere. Most school leaders accept this. The question is how to actually bring it about.
Many schools have started with the students. They have introduced AI modules into the computing curriculum, added AI literacy to form time programmes, or simply told students to be careful about using AI tools for assessed work. These are reasonable instincts. They are also, largely, the wrong place to start.
The evidence from educational research, and the explicit position of every major international framework that addresses this question, is that student AI literacy cannot be effectively developed without teacher AI competency coming first. Not alongside. Not at the same time. First.
This article sets out why that sequencing matters, what teacher AI competency actually involves, what happens when schools get it the wrong way round, and how to build it properly.
Every major international framework reviewed for this article identifies the same root problem: schools are adopting AI tools faster than they are developing the structures to use them safely and well.
The case for starting with students
Before making the argument for teacher-first sequencing, it is worth taking the opposing position seriously. There are genuine reasons why schools gravitate towards starting with students.
Students are, in many cases, more fluent with AI tools than their teachers. They have been using large language models for personal tasks since well before most schools had formed a view on the matter. Starting with students means meeting them where they are, and there is an argument that young people's existing AI use habits, once acknowledged and discussed openly, are themselves a legitimate starting point for developing literacy.
There is also a practical argument: curriculum time for student-facing AI education is easier to identify than protected time for staff development. Slotting an AI literacy unit into a PSHE or computing scheme of work is straightforward. Creating the conditions for genuine teacher professional development is considerably harder.
These are real constraints, and they deserve acknowledgement. But neither of them changes the fundamental sequencing problem.
Why teacher competency must come first
The argument for teacher-first sequencing rests on three foundations: the nature of modelling in education, the limits of decontextualised AI literacy curricula, and the evidence on what effective professional development in technology looks like.
Modelling is how values transfer
Students learn dispositions, not just skills, from observing their teachers. A teacher who uses AI tools fluently, critically, and transparently in front of students is doing something far more powerful than any AI literacy module. They are demonstrating what a thoughtful relationship with AI looks like in practice.
The inverse is equally true. A teacher who is anxious about AI, who avoids mentioning it, or who treats it purely as a cheating risk communicates a relationship with AI that students absorb even when it is never stated explicitly. You cannot teach a critical relationship with AI if you do not have one yourself. This is not a criticism of teachers. It is a statement about how professional knowledge works.
UNESCO's AI Competency Framework for Teachers makes exactly this point. It argues that teacher AI competency is not merely a precondition for AI education, in the way that knowing a subject is a precondition for teaching it. It is more than that: the teacher's own relationship with AI is part of the curriculum.
AI literacy without context does not stick
Student AI literacy programmes that run independently of classroom practice have a well-documented limitation: students compartmentalise the learning. They understand AI literacy as a lesson, not as a lens they apply to their actual work across subjects.
Effective AI literacy is developed when students are asked to think critically about AI tools in the context of real learning tasks: when a history teacher draws attention to the limitations of AI-generated sources, when an English teacher discusses what AI writing lacks and why, when a science teacher interrogates the training data behind an AI prediction. These moments of embedded, subject-specific AI literacy are where genuine competency develops.
None of this happens without teachers who are confident enough in their own AI understanding to create those moments. A standalone AI literacy unit delivered by a form tutor who has not engaged with AI professionally is not worthless, but it is a pale substitute for the accumulated effect of AI-literate teachers across the curriculum.
The research on technology CPD is clear
Twenty years of research on technology integration in schools points to a consistent finding: technology adoption by students is only as deep as technology adoption by teachers. Schools that invested in student devices without investing equally in teacher professional development created a generation of students who were digitally fluent in consumption and digitally shallow in creation and critical evaluation.
The pattern is repeating with AI. Schools that introduce AI tools for students without preparing teachers are likely to produce students who are AI-fluent in the same shallow sense: able to use the tools, unprepared to evaluate them.
What teacher AI competency actually means
This is where many schools make a second mistake, even those that accept the teacher-first argument. They equate teacher AI competency with familiarity with AI tools: if teachers know how to use ChatGPT and Copilot, the thinking goes, they have the competency needed to develop student literacy.
They do not. Tool familiarity is the starting point, not the destination. Genuine competency, as frameworks such as UNESCO's describe it, also involves understanding how AI systems work and where they fail, the ethical and data-protection implications of using them with children, and the pedagogical judgement to recognise when AI supports learning and when it substitutes for it.

The distinction matters because this kind of competency is not acquired through a training day. It is developed over time, through practice, reflection, and professional dialogue. It requires a school to treat teacher AI competency as a genuine professional development priority rather than a compliance exercise.
What happens when schools get the sequence wrong
The consequences of student-first sequencing are not immediately visible, which is part of why it persists. They tend to appear in three ways.
Students develop tool fluency without critical literacy
When AI literacy is introduced to students by teachers who are not themselves AI competent, what gets transmitted is usually surface-level guidance: cite your sources, do not submit AI-generated work as your own, be aware that AI can get things wrong. These are useful messages, but they fall short of developing the evaluative capacity that genuine AI literacy requires. Students become better at following AI use rules, not better at thinking critically about AI.
Assessment integrity approaches become punitive rather than educational
Schools that have not developed teacher competency first tend to approach AI in assessment primarily as a detection and enforcement problem. The conversation in staff meetings becomes one about tools for identifying AI-generated submissions rather than one about redesigning assessment to develop and reward genuine thinking. This is understandable given the pressure schools face on academic integrity, but it treats the symptom rather than the cause.
A staff body with strong AI competency approaches assessment integrity differently. They design tasks that make AI assistance visible, discussable, and educationally productive, rather than tasks that attempt to exclude AI and then police the boundary.
The gap between confident and anxious AI users widens
Without structured professional development, schools tend to polarise. A minority of teachers, usually younger staff and those with a personal interest in technology, become confident AI users who integrate it thoughtfully into their practice. The majority remain uncomfortable, avoidant, or superficially compliant. The students of the confident minority receive incidental AI literacy through classroom exposure. The students of the anxious majority do not.
How to build teacher AI competency that actually transfers
The following principles are drawn from UNESCO's AI Competency Framework for Teachers, the ISTE Standards for Educators, and the DfE's responsible use guidance. They reflect what the evidence says about professional development that changes practice rather than just awareness.
Start with a baseline, not an assumption
Before designing a professional development programme, understand where your staff actually are. Not where you assume they are. A structured audit of teacher AI competency does not require a lengthy survey. It requires knowing which tools staff are using, how confident they feel using them for different purposes, and whether they can articulate the limits of those tools to students. The answers to those questions should shape everything that follows.
One practical way to establish that baseline is to audit your existing documentation: your CPD records, your staff handbook, your AI policy, and your schemes of work. Together, these reveal a great deal about where teacher AI competency currently sits, where it is absent entirely, and where the gap between formal policy and actual classroom practice is widest. The AI Literacy Audit Tool does exactly this analysis automatically, scoring your school's Teacher AI Competency dimension against international frameworks and surfacing the specific evidence gaps that a CPD programme needs to address. It takes under ten minutes and gives you a scored baseline you can share with governors before a single training session has been designed.
Differentiate by role, not just by comfort level
A head of department needs different AI competency from a classroom teacher. A form tutor needs different competency from a SENCO. A senior leader needs different competency from both. Professional development programmes that treat staff as a homogeneous group tend to pitch at the middle and leave both ends of the distribution underserved.
Build in practice and structured reflection
A training session that shows teachers how to use an AI tool is not professional development. Professional development involves using the tool in the context of real teaching tasks, reflecting on what worked and what did not, discussing that reflection with colleagues, and adjusting practice accordingly. This cycle, which is sometimes called deliberate practice, takes time that schools are reluctant to protect. It is also the only kind of professional development that reliably changes classroom behaviour.
Connect teacher development explicitly to student outcomes
The purpose of teacher AI competency is not to make staff feel more comfortable with technology. It is to improve student learning and protect student welfare. CPD programmes that keep this connection visible, that ask teachers to explicitly plan how their developing competency will change what students experience, tend to be more effective than those that treat teacher and student development as separate tracks.
Track progression, not participation
Logging that a member of staff attended an AI training session is not evidence of professional development. Tracking what has changed in their practice as a result is. Schools that build light-touch progression tracking into their AI CPD programmes, even something as simple as termly professional conversations about AI practice, are better placed to identify who needs what next, and to demonstrate to governors that professional development is having an impact.
The answer to the question
Teacher competency comes first. Not because students are less important, but because student AI literacy depends on teacher AI competency in a way that cannot be shortcut.
The good news is that the two are not in competition. A school that invests seriously in teacher AI competency will find, within one to two academic years, that student AI literacy begins to develop across the curriculum through the accumulated effect of confident, knowledgeable teachers creating moments of genuine AI-critical thinking in their subjects.
A school that invests in student AI literacy programmes without developing teacher competency first will find itself repeating those programmes annually, without the organic curriculum-wide development that actually builds the skills students need.
The sequencing is not complicated. What is complicated is creating the conditions for genuine teacher professional development in a sector where time is always scarce and where AI is moving faster than most CPD calendars can accommodate. That is the real challenge, and it starts with understanding where your staff are now.
Governors are already asking. Ofsted's research review on technology in schools is paying closer attention to staff development than to student outcomes, on the grounds that one produces the other. Schools that can demonstrate a structured, evidenced approach to teacher AI competency are in a considerably stronger position than those that can demonstrate only that they ran an INSET day. The time to build that evidence base is before the conversation is forced.
Ready to find out where your school's teacher AI competency stands?
The AI Literacy Audit Tool scores your school's Teacher AI Competency dimension alongside eight other dimensions of AI readiness, cross-referenced against 33 international frameworks. Upload your existing documents and receive a scored, evidenced report in under ten minutes. No surveys. No forms. Just your documents.
Run your free AI Literacy Audit at audit.deepeducationnetwork.com
About DEEP Education Network: DEEP Education Network is a professional development platform supporting over 1,000 educators across 50+ countries. We specialise in helping schools and school leaders navigate AI integration through courses, training, and practical frameworks grounded in education research. The AI Literacy Audit Tool was built from our analysis of 33 international AI frameworks and is designed to give school leaders a rigorous, evidenced picture of their AI readiness in minutes, not months. deepeducationnetwork.com
If this was useful, share it with a colleague. The conversation about AI in schools is too important to leave to chance.