I Rebuilt My Entire SEO in One Day. Here's What Actually Moved the Needle.

My website had been live for months. It had 30+ blog posts, case studies across banking, retail, education, and engineering. Decent traffic. Good content.

It also had zero internal links. No schema markup of any kind. A meta description that called me a "private tutor." And a sitemap that mangled apostrophes in every title.

I'm an AI adoption trainer who tells companies to audit their workflows before buying tools. My own website was the cobbler's barefoot child.

The Audit That Started Everything

I ran a full SEO and GEO audit on hyperfocusam.com. Not just traditional SEO -- I looked at GEO: Generative Engine Optimization. That's the practice of structuring your content so AI search engines (Perplexity, ChatGPT search, Google AI Overviews) can find, understand, and cite your work. If you're only optimizing for Google in 2026, you're optimizing for half the discovery surface.

The results were grim. Google had indexed most of my pages, but without structured data, my content was ineligible for rich results. Without internal links, Google couldn't see the topical relationships among my 30+ articles. Without hreflang tags, my bilingual content (English and Traditional Chinese) was competing against itself. And without schema markup, AI engines had no structured way to reference my training programs, methodologies, or case studies.

Everything needed fixing. All of it.

Claude Code as Co-Pilot

Here's where it gets interesting. I sat down with Claude Code at 1 AM and didn't stop until 8:30 AM.

Claude Code didn't decide my SEO strategy. I chose which keywords to target, which case studies to feature on the pillar page, which internal linking structure made sense for my content architecture. The strategy was mine.

What Claude Code did was execute at a scale I couldn't match alone. When I said "add internal links to all 26 blog posts pointing to the two pillar pages," Claude Code read every post, identified natural insertion points in each article's existing narrative, and made 26 edits in minutes. What would have taken me an afternoon took three minutes.

When I asked it to add Course structured data to every blog post that mentioned a training program, it processed 26 files, identified which posts referenced workshops versus methodology articles, and inserted the correct schema blocks. It caught an RSS feed bug I would have missed entirely -- an apostrophe escaping issue in generate-sitemap.js that was truncating titles like "Why AI Training Doesn't Stick" to "Why AI Training Doesn."
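The fix for that class of bug is standard: escape XML's special characters before interpolating a title into the feed or sitemap, rather than letting a raw apostrophe terminate the value early. The real `generate-sitemap.js` isn't reproduced here; this is a minimal sketch of the escaping helper:

```javascript
// Minimal sketch of the fix: escape the five XML special characters
// before writing a title into sitemap/RSS markup. The helper name
// and usage are illustrative, not the site's exact code.
function escapeXml(text) {
  return text
    .replace(/&/g, "&amp;")   // must run first, or we'd double-escape the others
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

const title = "Why AI Training Doesn't Stick";
const item = `<title>${escapeXml(title)}</title>`;
// item is now "<title>Why AI Training Doesn&apos;t Stick</title>"
```

The ordering matters: ampersands are escaped first so the entities produced by the later replacements aren't themselves re-escaped.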

The dynamic was clear: I directed, Claude Code executed. Strategy stayed human. Implementation went from days to hours.

Structured Data: From Invisible to Citable

Before: zero schema markup across the entire site.

After: Course schema on 26 blog posts covering three training programs (Workshop, Pioneer Program, Train-the-Trainer). Service schema on both pillar pages. Organization schema with proper geographic targeting.

Why this matters beyond Google: AI search engines rely heavily on structured data to generate accurate citations. When Perplexity or ChatGPT search encounters a page with Course schema, it can reference specific program names, durations, and descriptions rather than guessing from paragraph text. Structured data is the difference between being cited accurately and being paraphrased badly.
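To make that concrete, here's a sketch of the kind of Course JSON-LD block involved. The program name, description, and provider fields are illustrative placeholders, not the exact markup deployed on hyperfocusam.com:

```javascript
// Sketch: build a schema.org Course JSON-LD tag for a blog post.
// "HyperfocusAM" as the provider name is an assumption for illustration.
function courseJsonLd({ name, description, url }) {
  const data = {
    "@context": "https://schema.org",
    "@type": "Course",
    name,
    description,
    url,
    provider: {
      "@type": "Organization",
      name: "HyperfocusAM",
      sameAs: "https://hyperfocusam.com",
    },
  };
  // The <script> tag that would be injected into each post's <head>.
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Because the block is machine-readable JSON rather than prose, an AI engine can lift the program name and description verbatim instead of inferring them from paragraph text.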

Internal Linking: Building the Web

Before: 30+ blog posts, each an island. Zero cross-links. No topical structure.

After: 26 posts now link to two pillar pages -- Services and Corporate AI Training Hong Kong. The pillar pages link back to supporting case studies. Every article about a specific client engagement connects to the methodology articles that explain the framework behind it.

This creates what SEOs call topical silos. Instead of 30 disconnected pages, Google now sees two content clusters with clear hierarchies. The pillar pages signal authority. The supporting posts signal depth. Together, they tell Google -- and AI engines -- "this person doesn't just mention AI training. He has 30 articles proving he actually does it."

Bilingual Optimization: Serving Two Audiences Properly

I publish in English and Traditional Chinese. Before today, both versions existed but neither told search engines about the other.

After: proper hreflang tags in the sitemap for every bilingual pair. English pages declare their Chinese counterpart. Chinese pages declare their English counterpart. The x-default points to English.
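In sitemap terms, each bilingual pair becomes a `<url>` entry carrying `xhtml:link` alternates, following Google's convention for localized versions. A sketch, with illustrative URLs (the Chinese page gets a mirror entry listing the same three alternates):

```javascript
// Sketch: emit one sitemap <url> entry for the English half of a
// bilingual pair. Each entry lists all alternates, including itself.
function bilingualUrlEntry(enUrl, zhUrl) {
  const alternates = [
    ["en", enUrl],
    ["zh-Hant", zhUrl],   // Traditional Chinese counterpart
    ["x-default", enUrl], // x-default points to English, as described above
  ]
    .map(
      ([lang, href]) =>
        `  <xhtml:link rel="alternate" hreflang="${lang}" href="${href}"/>`
    )
    .join("\n");
  return `<url>\n  <loc>${enUrl}</loc>\n${alternates}\n</url>`;
}
```

The enclosing `<urlset>` must also declare `xmlns:xhtml="http://www.w3.org/1999/xhtml"` for the alternate links to validate.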

For the Hong Kong market, this matters. Someone searching in Cantonese should find my Chinese content. Someone searching in English should find my English content. Without hreflang, Google was making that decision randomly. Now it's explicit.

For GEO, bilingual optimization has an additional benefit: AI engines serving multilingual markets (common in Hong Kong) can now correctly identify which version to cite based on the user's language context.

The Pillar Page Rebuild

The corporate training page went from a basic service description to a comprehensive resource. I added four featured case studies with specific metrics and internal links. Four proprietary methodology sections (Traffic Light Protocol, AI Pioneer Model, the 70/30 Split, AI Maturity Model). A government funding section covering BUD Fund, Upskill HK, and RTTP -- directly relevant to Hong Kong companies evaluating AI training.

This is content depth as an SEO signal. A thin page says "I offer AI training." A deep page with named methodologies, specific client outcomes, and government funding pathways says "I'm the person you should talk to about AI training in Hong Kong."

The Scoreboard

One session. 11 commits. 3 new blog posts targeting unclaimed Hong Kong keywords. 26 existing posts updated with internal links and structured data. Course schema deployed site-wide. GA4 migrated to the business property. RSS feed fixed. Google Search Console verified and submitted.

All in about seven hours.

What I Learned

The technical work wasn't hard. Schema markup, hreflang tags, internal linking -- these are well-documented practices. What was hard was doing all of it at once across 30+ files without introducing errors. That's where AI-assisted execution changes the equation.

I didn't use Claude Code because I couldn't do this work manually. I used it because the gap between knowing what to do and actually doing it across an entire site is where most solo practitioners get stuck. You know you should add internal links. You know you should have structured data. But touching 26 files for a "should" task always loses to client work and deadlines.

AI didn't do this for me. It made the "should" tasks take hours instead of weeks. That's the difference.

The best time to fix your SEO was six months ago. The second best time is one focused night with a good co-pilot.


I help companies adopt AI through workshops, coaching, and pioneer programs. If your team is navigating AI adoption, I'd welcome a conversation.