
What Happens When a School Commits to 24 Months of AI Training

Last April, I stood in front of 400 educators at the Hong Kong College of Technology and watched skepticism turn into experimentation in 75 minutes. I wrote about that session at the time — the three-part framework, the energy shift, the questions that came after. What I didn't write about was what happened next.

HKCT came back. Not for another workshop. For a 24-month AI Learning Community — monthly sessions, a WhatsApp support channel, and a mandate I wasn't expecting: from September 2026, NotebookLM will be mandatory for all students and staff.

That's the jump. From "can you show us what AI does" to "we're making this part of how the school operates." And the gap between those two decisions is where the interesting problems live.

The Knowledge Gap in the Room

The first session of the new program ran on March 27. About 60 attendees — in person at the main campus, with staff from other locations joining online. The topic was NotebookLM: not features, but how it changes the way teachers prepare materials, evaluate student work, and manage the information overload that comes with running courses across multiple campuses.

What struck me was the range. In the same room: a staff member who had already built an AI-powered chatbot for student queries, and a colleague who didn't know what Google Gemini was. That's not unusual — I see it in every corporate engagement too. But in a school setting, the stakes feel different. These aren't sales reps missing a productivity tool. These are teachers whose students will be expected to use AI fluently in six months, whether or not their teacher can demonstrate it.

I spent the first ten minutes on what I always spend the first ten minutes on — safety. What data you can put into AI tools, what you can't. Student records: off limits. Assignment briefs: fine. Exam content: absolutely not. The traffic light framework works in every industry, but in education the yellow zone is wider and the consequences of getting red wrong are more personal.

What a 24-Month Commitment Actually Looks Like

The first session was a deep dive. Session two will focus on Copilot's advanced features for the staff who are ready for it. After that, the plan is hybrid: three months of fixed topics building shared fundamentals, then three months of flexible sessions driven by whatever the staff are actually struggling with.

The WhatsApp community launched alongside the first session. The idea is simple — between monthly sessions, staff can ask questions, share what they tried, flag what didn't work. I've seen this model work in corporate settings. Continuous engagement produces different behavior than a one-off session does. People stop asking "how do I use this" and start asking "can AI do this thing I'm stuck on." That shift usually happens around month three.

But I don't know if it works the same way in education. Corporate participants have repetitive tasks they can immediately test AI on — weekly reports, email drafts, data summaries. Teachers have a different rhythm. Their repetitive work is seasonal (grading, course planning, admin reporting), and the most impactful use case — changing how they teach — requires curriculum redesign, not just tool adoption. That's a slower loop.

The Part That Surprised Me

In April, the program expanded beyond staff training. Joanne, my contact at HKCT, sent over a poster for a student workshop and asked me to review it. I wasn't expecting it. The original plan was 24 months of staff development. But the Student Affairs office had been watching, and they wanted in.

This is the pattern I've been describing in corporate settings for two years — train the pioneers, and demand spreads organically. I just hadn't seen it happen in a school before. The staff program is two sessions old, and the institution is already expanding scope beyond the original brief. Whether that's a sign of healthy adoption or scope creep I need to manage, I honestly don't know yet.

What I do know: the mandatory September rollout means this stops being optional in five months. Every staff member who hasn't engaged by then will need to catch up in a compressed timeline. The 24-month runway was designed for gradual adoption. The mandatory date compresses it. I'm building the sessions as if everyone has until month 12 to be comfortable. In reality, the ones who haven't started by September will feel it.

Why This Matters Beyond HKCT

Education is where the AI adoption gap hurts most. When the Hong Kong government allocated HK$2 billion for AI in schools, the fund explicitly excluded teacher training. Schools can buy AI systems but can't use the money to train the teachers who operate them. That's the procurement-training gap I've written about in corporate settings, now embedded in government policy.

HKCT decided to solve it themselves. Twenty-four months. Monthly sessions. A WhatsApp community. And a mandatory deadline. It's the most structurally committed approach to institutional AI adoption I've seen from any school in Hong Kong. Whether the execution matches the ambition — I'll be reporting from inside the program for the next two years.


I write about AI adoption as it actually happens — in training rooms, WhatsApp groups, and the gap between policy and practice. More at hyperfocusam.com/blog or connect on LinkedIn.
