How to use Claude Code for SEO: a complete workflow guide

This guide covers how to build and optimize a website for search engine visibility using Claude Code, without requiring any coding knowledge. The approach combines AI-assisted site generation with keyword research, on-page optimization, and technical SEO auditing. What follows is a structured breakdown of the full workflow, from site setup to Google indexing.

TL;DR

  • Problem: Building an SEO-optimized website from scratch requires technical skills and significant time investment across multiple disciplines.
  • Solution: Claude Code automates site generation, content creation, on-page SEO, and technical optimization through a single AI-assisted workflow.
  • Outcome: A static, Google-indexable website with optimized blog posts, service pages, and a Lighthouse score close to 100 across all categories.

What is the Claude Code SEO workflow and who is it for?

Claude Code is an AI coding assistant that operates inside a local development environment. In an SEO context, it functions as a site and content builder: it generates pages, writes blog posts, applies optimization rules, and handles technical configuration based on prompts you provide.

The workflow described here is designed for business owners, freelancers, and service providers who want search-visible websites without hiring a development or SEO team. No coding experience is required. If you want a broader overview of what Claude Code can do beyond SEO, the Claude Code review on AI Start Me Up covers its core capabilities in more detail.

How do you set up a website with Claude Code?

Every Claude Code project lives in a local folder on your computer. The key file is called CLAUDE.md. This document acts as a training file for Claude: it specifies how the site should be built, what rules to follow, and what the output requirements are. A pre-written version of this file is typically shared alongside tutorials of this type and handles the core configuration automatically.
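The exact contents vary by project, but a minimal CLAUDE.md might look like the following. The rules and file paths below are illustrative, not the tutorial's actual file:

```markdown
# CLAUDE.md — project rules (illustrative example)

## Build requirements
- Use static site generation (SSG) for every page; no client-side rendering.
- Generate a sitemap.xml and robots.txt at the site root.

## Content rules
- Exactly one H1 per page.
- Place the primary keyword in the first 100 words of each page.
- Write in the brand voice defined in the style reference files.
```

Claude reads this file at the start of every session, so rules placed here apply to all pages it generates without being repeated in each prompt.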

From there, a single prompt instructs Claude to generate the initial site structure: a homepage, a blog index page, and a service page index. Attaching a design reference screenshot from a site like Dribbble gives Claude a visual target, which significantly improves the quality of the first output compared to a text-only prompt.

Why does the site generation method matter for SEO?

There are three ways websites can render their pages: static site generation, server-side rendering, and client-side rendering. For SEO, static site generation is the only reliable option. When Google crawls a statically generated site, the pages are available immediately with no processing delay.

With server-side rendering, each page is built on the server at request time, so crawl throughput depends on server response times. With client-side rendering, content is assembled in the browser by JavaScript, and Google must run a second rendering pass before it can see the page – content may be indexed late or missed entirely. A site that cannot be crawled cannot rank, regardless of content quality. The CLAUDE.md file should include a hard requirement for static site generation to ensure this is handled correctly from the start.

How do you find the right keywords to target?

Keyword selection is where most AI-assisted SEO workflows fall short. Asking Claude directly for keyword suggestions produces a general list with no volume data, no difficulty scoring, and no intent filtering. A dedicated SEO tool is required for this step.

SEMrush is one widely used option. Its keyword magic tool lets you enter a root keyword and filter results by three criteria: keyword difficulty of 30 or below, monthly search volume of at least 100, and informational intent. Applying these filters reduces a raw list of hundreds of thousands of keywords to a workable set of several hundred. You then export the filtered list as a CSV file and load it directly into your Claude Code project. For a complementary approach to keyword research using AI tools, the 10-minute SEO hack guide on this site shows a faster method for identifying quick-win ranking opportunities.
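As a sketch of what that filtering step does, here is the same logic applied in Python to a tiny inline CSV. The column names mirror a typical SEMrush export but may differ from the real file:

```python
import csv
import io

# Hypothetical SEMrush export; real column names and values may differ.
raw = io.StringIO(
    "Keyword,Volume,Keyword Difficulty,Intent\n"
    "how much does a plumber cost,1300,22,informational\n"
    "plumber toronto,2900,45,commercial\n"
    "fix leaky faucet diy,880,18,informational\n"
)

# Apply the three filters from the text: KD <= 30, volume >= 100, informational intent.
shortlist = [
    row for row in csv.DictReader(raw)
    if int(row["Volume"]) >= 100
    and int(row["Keyword Difficulty"]) <= 30
    and row["Intent"] == "informational"
]

for row in shortlist:
    print(row["Keyword"])
```

In practice you apply these filters inside the SEMrush interface and export the result; the point of the sketch is that the filtered CSV Claude receives is just rows matching these three conditions.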

Additional keyword sources worth exploring: the questions tab inside keyword tools surfaces direct customer queries, adjacent topic searches capture prospects earlier in the decision process, and competitor keyword analysis shows what similar sites already rank for.

What is the difference between blog posts and service pages?

The two content types serve distinct functions and should be developed in parallel. Blog posts target informational keywords. Someone searching “how much does a plumber cost” is looking for information, not ready to book. Blog content builds topical authority and signals to Google that the site covers a subject in depth. Over time, this lifts the ranking potential of all pages on the domain.

Service pages target commercial keywords: searches where the intent is to hire or purchase. “Plumber Toronto” or “drain cleaning Vancouver” are commercial searches. These pages are built around a service-plus-location structure. If you offer three services across four cities, that produces twelve service pages. The goal on these pages is conversion: the visitor fills out a form or makes contact. The relationship between the two is cumulative – blog posts raise domain authority, which improves the ranking position of service pages, which generates more leads.
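The service-times-location math can be sketched in a few lines of Python. The service names, cities, and URL pattern are illustrative:

```python
services = ["plumbing", "drain cleaning", "water heater repair"]
cities = ["toronto", "vancouver", "calgary", "ottawa"]

# One page per service-location combination: 3 services x 4 cities = 12 pages.
pages = [
    f"/{service.replace(' ', '-')}-{city}"
    for service in services
    for city in cities
]

print(len(pages))   # 12
print(pages[0])     # /plumbing-toronto
```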

How do you avoid generic AI-generated content?

The default output from a content generation prompt is flat and generic. It reads like a brochure and performs poorly on the behavioral signals Google uses to assess content quality: time on page, scroll depth, and bounce rate.

The solution is to train Claude on a writing style before generating content. Feed reference material into the project – LinkedIn posts, email newsletters, transcripts, or any written samples that reflect how the business communicates. Claude builds reference files covering voice, tone, opinions, and relevant statistics, and draws on these when writing. A second step involves analyzing the top-ranking pages for a given keyword before writing. Claude retrieves the top three results, extracts the average word count, heading structure, and topic coverage, and uses that as a structural baseline. The output then matches the format that is already working in search results, with your positioning applied on top.
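A rough sketch of the structural-analysis step, using Python's built-in HTML parser to pull heading structure and word count from a page. Real competitor pages would be fetched over the network first; the HTML here is a stub:

```python
from html.parser import HTMLParser


class PageStats(HTMLParser):
    """Collects heading structure and a rough word count from raw HTML."""

    def __init__(self):
        super().__init__()
        self.headings = []
        self.words = 0
        self._in_heading = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = tag

    def handle_endtag(self, tag):
        if tag == self._in_heading:
            self._in_heading = None

    def handle_data(self, data):
        self.words += len(data.split())
        if self._in_heading:
            self.headings.append((self._in_heading, data.strip()))


# Stub standing in for a fetched top-ranking page.
html = (
    "<h1>Plumber Costs</h1><p>Average rates vary by region.</p>"
    "<h2>Hourly Rates</h2><p>Expect 80 to 150 per hour.</p>"
)
stats = PageStats()
stats.feed(html)
print(stats.words, stats.headings)
```

Averaging these numbers across the top three results gives the word-count and heading baseline the text describes.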

What does on-page SEO optimization involve?

On-page SEO refers to everything within the page itself that affects how Google reads and ranks it. There are over 80 individual signals involved, and running through them manually for every page is not practical. The approach here is to compile the full checklist into a single prompt and instruct Claude to apply it without rewriting the voice or tone.

Key items on that checklist:

  • Primary keyword appearing in the first 100 words
  • Exactly one H1 heading per page
  • Two to three external links to authoritative sources, such as Google's own SEO documentation
  • Three to five internal links to other pages on the same site
  • Four to eight FAQ-style questions within the content
  • A meta title and meta description optimized for the target keyword

Claude works through the list systematically and returns the updated page. The same prompt applies to both blog posts and service pages.
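Some of these checks are mechanical enough to sketch in Python. The function below is an illustrative validator for a Markdown draft, not part of Claude Code itself:

```python
import re


def check_on_page(markdown: str, keyword: str) -> dict:
    """Spot-check a few of the on-page rules from the checklist above."""
    first_100_words = " ".join(markdown.split()[:100]).lower()
    # A Markdown H1 is a line starting with a single "# ".
    h1_count = len(re.findall(r"^# .+", markdown, flags=re.MULTILINE))
    # Internal links: Markdown links whose target starts with "/".
    internal_links = re.findall(r"\]\((/[^)]*)\)", markdown)
    return {
        "keyword_in_first_100_words": keyword.lower() in first_100_words,
        "exactly_one_h1": h1_count == 1,
        "internal_link_count": len(internal_links),
    }


page = (
    "# Plumber Cost Guide\n\n"
    "How much does a plumber cost? See our [rates](/rates) and [services](/services)."
)
print(check_on_page(page, "plumber cost"))
```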

What does technical SEO cover and how does Claude Code handle it?

Technical SEO addresses how the site is built rather than what it says. Three elements are required for a site to perform well technically.

A sitemap.xml file lists all pages on the site and is submitted to Google via Search Console. Without it, Google may not discover or index all pages. A robots.txt file tells Google which pages it is allowed to crawl. Admin pages or login screens should typically be excluded; everything else should be accessible.
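A minimal robots.txt for this setup might look like the following. The domain and excluded path are placeholders:

```
# robots.txt — allow everything except the admin area
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

With static site generation, Claude can regenerate both files automatically on every build so they stay in sync with the page list.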

The third element is a strong Google Lighthouse score. Lighthouse is a built-in browser tool that scores your site across four categories: performance, accessibility, best practices, and SEO. The target is 100 across all four. To fix a low score, copy the full Lighthouse report and paste it into Claude with a prompt to resolve all flagged issues. Claude identifies the specific problems – render-blocking scripts, unoptimized images, legacy JavaScript – and applies the fixes. Rerunning the audit after changes shows the updated score. For teams also working with automated file and data workflows alongside their site, the n8n file collection workflow covers how to connect external data sources to your content pipeline.
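The Lighthouse report is JSON under the hood, so the paste-into-Claude step can be previewed by pulling out the failing audits yourself. This Python sketch uses a heavily trimmed report structure; real reports contain far more fields:

```python
import json

# Trimmed sketch of a Lighthouse JSON report (real reports are much larger).
report = json.loads("""
{
  "categories": {"performance": {"score": 0.82}, "seo": {"score": 1.0}},
  "audits": {
    "render-blocking-resources": {"score": 0, "title": "Eliminate render-blocking resources"},
    "uses-optimized-images": {"score": 1, "title": "Efficiently encode images"}
  }
}
""")

# Category scores are 0-1 floats; Lighthouse displays them as 0-100.
for category, data in report["categories"].items():
    print(category, round(data["score"] * 100))

# Any audit scoring below 1 is a candidate fix to hand to Claude.
failing = [
    audit["title"]
    for audit in report["audits"].values()
    if audit["score"] is not None and audit["score"] < 1
]
print("Fix next:", failing)
```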

How do you scale content production with a Claude Code skill?

Once the individual steps are working, they can be packaged into a single reusable instruction set called a Claude Code skill. Typing a short trigger command into a new chat session runs the full workflow automatically: it pulls a keyword from the CSV, creates a keyword cluster, retrieves images from a stock API, applies the writing style references, runs the on-page SEO checklist, and uses the pre-optimized page template.
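Claude Code skills are defined in a SKILL.md file with a short frontmatter block followed by instructions. A hypothetical skill for this workflow might look like this – the name and steps are illustrative, not the tutorial's actual skill:

```markdown
---
name: publish-seo-post
description: Generate and publish one SEO-optimized blog post from the keyword CSV.
---

1. Pull the next unused keyword from the exported keyword CSV and build a keyword cluster.
2. Fetch relevant images from the stock image API.
3. Draft the post using the voice and tone reference files.
4. Run the full on-page SEO checklist without changing the voice.
5. Render the post with the pre-optimized page template.
```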

This process is also schedulable. The skill can run at a fixed time each day, generating and publishing content without manual input. The practical constraint is publishing cadence: releasing too many pages too quickly creates an unnatural spike that Google may flag. A gradual ramp-up starting with one page per day and increasing slowly over several weeks is the more sustainable approach.
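One way to sketch the ramp-up as a schedule. The starting rate, weekly increment, and cap below are arbitrary assumptions, not a published Google threshold:

```python
def pages_for_day(day: int, cap: int = 5) -> int:
    """Start at 1 page/day and add one more per day each week, up to a cap."""
    week = day // 7
    return min(1 + week, cap)


# Publishing plan for the first four weeks.
schedule = [pages_for_day(d) for d in range(28)]
print(schedule)
```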

What are the steps to deploy and index the site?

Once the site is ready locally, it needs to be deployed to a public server. The standard approach uses two free tools: GitHub for code storage and Vercel for hosting. The process involves uploading the local project to a GitHub repository, connecting that repository to Vercel, and deploying with one click. The application preset in Vercel should be set to Next.js. A custom domain can be added via Vercel's domain settings or imported from an external registrar.

After deployment, four steps complete the setup:

  • Create a Google Business Profile listing to capture local search clicks
  • Add the site to Google Search Console and submit the sitemap.xml
  • Use Search Console's URL inspection tool to request indexing for individual pages – this can get new content indexed within one day rather than waiting weeks
  • Connect Google Analytics to track visitor behavior and measure which pages drive engagement

FAQ

Does Claude Code replace the need for an SEO specialist?

Claude Code automates a significant portion of the technical and on-page SEO work that previously required dedicated specialists. It does not replace strategic judgment, particularly around keyword selection, content positioning, and long-term link acquisition. A tool like SEMrush is still necessary for keyword research, and decisions about which pages to prioritize require human input. What Claude Code changes is the execution speed and the technical barrier to entry.

Is AI-generated SEO content penalized by Google?

Google's publicly stated position is that it evaluates content based on quality and relevance, not based on how it was produced. Content that is thin, repetitive, or provides no genuine value to the reader is more likely to underperform, regardless of whether a human or an AI wrote it. The workflow described here addresses this directly by training Claude on a writing voice and anchoring content structure to what is already ranking for a given keyword.

How many service pages is too many?

There is no fixed threshold, and Google has not published a specific limit. The concern with large numbers of service pages is content similarity: if dozens of pages are near-identical except for the city name, Google may treat them as duplicate or low-quality content. A selective approach – covering the most commercially relevant service-location combinations rather than generating every possible pairing – reduces that risk. Monitoring performance in Search Console after publishing provides data to guide further decisions.

Some links may be affiliate links. This helps support the site at no additional cost and does not influence the content or reviews.

