AI Education · February 2026 · 8 min read

Something Big Is Happening — And Schools Aren't Ready

Matt Shumer's viral warning about AI has the tech world rattled. But what does it actually mean for the classroom? A plain-English breakdown for educators.


A few weeks ago, Matt Shumer — CEO of HyperWrite and one of the sharper voices in AI — published an article called "Something Big Is Happening." It spread fast. Tech founders shared it. Investors shared it. People who normally don't read articles about AI shared it.

And I think most teachers missed it entirely.

That's a problem. Not because teachers need to follow every tech trend — we've got enough on our plates. But because what Shumer describes isn't a tech trend. It's a shift that's heading straight for our classrooms, our curricula, and the futures we're supposed to be preparing children for.

So I sat down, read the whole thing, and translated it out of Silicon Valley and into something useful. Here's what's actually going on, where Shumer gets it right, and where we — as educators — see things differently.

• • •

What Shumer is actually saying

If you strip away the technical language, Shumer's argument comes down to four ideas. Each one matters for schools.

1. We're in the quiet bit before everything changes

Shumer compares right now to those strange few weeks in early 2020 — before lockdowns, before anyone really understood what was coming. Life felt normal. The data said otherwise.

He's making the same argument about AI. On the surface, things look steady. Schools are still debating whether to block ChatGPT. Meanwhile, behind the scenes, the technology is accelerating at a pace that's difficult to overstate.

For us, the takeaway is simple: don't confuse the slow pace of school policy with the actual speed of change. They're not the same thing.

2. AI is moving from "tool" to "colleague"

Until now, most of us have treated AI like a better search engine. You ask it something, it gives you an answer, you decide what to do with it. That's the "tool" phase, and it's already ending.

What's coming next is what Shumer calls "Agents" — AI that doesn't just respond to questions but takes action. It books things. It plans things. It completes entire workflows with minimal human oversight.

What this means in the classroom

We've been teaching students how to search for information. We need to start teaching them how to delegate — how to brief an AI clearly, check its work, and steer it toward better outcomes. The student of the future isn't the one doing the task. They're the one managing it.

3. AI is now building itself

This is the part that sounds like science fiction, except it's already happening. Shumer explains that the latest AI models were used to debug and improve themselves. The technology is, quite literally, helping to create the next version of itself.

The result? A speed of improvement that humans can't keep pace with. By the time we print a resource about AI, it's out of date. By the time a new policy goes through committee, the thing it was written about has already changed three times over.

4. AI is developing "judgement"

This is perhaps the most unsettling claim. Shumer argues that AI has started to demonstrate something like "taste" — the ability to make reasonable calls in ambiguous situations. Not just getting facts right, but choosing well when there's no clear right answer.

For educators, that lands differently. We've always said critical thinking and judgement are the things that make us irreplaceably human. If AI starts doing those things — even partially — we need to think carefully about what we're actually preparing students for.

• • •

Where we agree with Shumer

Reading his article as an educator, there are three points where I found myself nodding along.

Busy work is finished. Shumer points out that entry-level office work is being automated rapidly. For schools, this is the final nail in the coffin for worksheets, copy-and-answer tasks, and anything a student could hand to an AI and get back in three seconds. If the machine can do the assignment, the assignment isn't teaching anything worth learning.

Personalisation is finally possible — properly. If AI can build entire software applications on the fly, it can certainly build a learning pathway for one student. The "one lesson fits thirty kids" model was always a compromise. Now, for the first time, the technology exists to actually do something about it.

"Wait and see" is the worst strategy. Shumer is clear that hesitation is expensive. We agree. If we don't help students understand how to work alongside AI — how to delegate to it, oversee it, challenge it — we're sending them into a world they haven't been equipped for.

If an AI can do the assignment in three seconds, the assignment has no value. The worksheet era is over.

• • •

Where the classroom is different

This is where things get interesting — and where, respectfully, the tech world and the teaching world see things through different lenses.

Judgement in code isn't the same as judgement in a classroom

Shumer's AI has "taste" when it comes to writing software and making design decisions. Fair enough. But in a classroom, judgement isn't just about getting the right answer. It's about the right moment.

An AI might know the most effective way to explain fractions. But it doesn't know that the child in front of you is having a terrible day because their dog died that morning. Our judgement is rooted in empathy — in reading a room, sensing what's unspoken, knowing when to push and when to back off. That's not a data point. It's a human skill, and it's the core of what we do.

The "human in the loop" isn't a flaw — it's the whole point

In the tech world, needing a human to check the AI's work is treated as a temporary limitation. Something to be engineered out. In education, the human relationship is the mechanism of learning.

Children don't learn just from information. They learn from people. From feeling seen, challenged, supported, believed in. AI can provide content. It can't provide aspiration. A chatbot doesn't make a child want to become a scientist. A teacher who notices their curiosity and fans it — that's something else entirely.

We can't "move fast and break things" when the "things" are children

Shumer warns that the window to act is short. And for startups, he's probably right. But schools don't move at startup speed — and they shouldn't. Our pace is slower because the stakes are different.

We have to think about equity. Will AI tools widen the gap between well-funded schools and the rest? We have to think about privacy. What happens to student data inside these systems? We have to think about safeguarding. These aren't obstacles to progress — they're the reason progress has to be made carefully.

• • •

So what do we actually do?

If you're a teacher reading this — or a school leader, a parent, anyone involved in education — here's what I'd take away from Shumer's article and our response to it.

Move from prompting to partnering. Don't just ask AI to write a lesson plan and accept it. Give it your context — your students' gaps, your learning objectives, the misconceptions you keep seeing — and use it as a thinking partner. Not a replacement for your planning, but a sharpener of it.
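In practice, that might mean trading "write me a lesson on fractions" for something like: "My Year 5 class keeps confusing numerator and denominator. Draft three starter activities that surface that misconception." Same tool, very different result, because the brief carries your professional knowledge.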

Look for the "human residue." Take any lesson you teach and ask: what's left if I remove the AI? If the answer is "nothing," that lesson needs redesigning. Build around discussion, debate, physical making, ethical reasoning — the things that require a human in the room.

Drop the ego, keep the authority. Shumer says we need to stop pretending we're smarter than the machine. He's right — for raw information processing, we're not. But knowing more facts was never the job. The job is guiding young people through complexity. We're not the sage on the stage any more. But we are the guide on a ride that's about to get very fast indeed.

Something big is happening. The tech world is worried about replacing workers. Our job is to make sure it doesn't replace thinking.

• • •

This is what we explore at the DEEP Education Network — the intersection of AI, education, and the practical reality of classroom life. No jargon, no hype, just honest thinking from people who work with young people every day.

If this resonated, share it with a colleague. The conversation is too important to leave to the tech industry alone.


Alex Gray

Head of Sixth Form & BSME Network Lead for AI in Education. Alex explores how artificial intelligence is reshaping teaching, learning, and the future of work — with honesty, clarity, and a focus on what matters most for educators and students.

