
Claude Code for SEO: Automated Site Analysis via Google Search Console and Analytics

Tags: ai, claude, seo, productivity

Any website you're even slightly trying to promote and optimize eventually develops a ritual: once a week, open Google Search Console, check positions, switch to Google Analytics, review traffic, jot down the numbers in a spreadsheet, compare with last week, and think about what to do next. Done manually, it's half an hour of clicking through dashboards and switching between tabs. Skip it, and you quickly lose track of what's happening with your site.

I lived with the manual approach for a long time, then gave up and built a slash command /seo-refresh for Claude Code. Now the entire ritual is a single line in the chat. Claude pulls data from GSC and GA on its own, updates the metrics log, shows a comparison with the previous measurement, and highlights where to look first.

Here's how it works, why I built it, and why it's a command rather than a skill.

What happens when you run it

I type /seo-refresh in the chat. Claude:

  1. Pulls data from Google Search Console for the last 3 months — total impressions, clicks, average position, positions for key queries and for each blog article.
  2. Pulls data from Google Analytics for the last 28 days — unique users and a breakdown by source (Google, Facebook, Direct, Instagram).
  3. Opens the file marketing/metrics-log.md and adds a new row to each of the three tables.
  4. Shows me a short comparison: what went up, what went down, which positions shifted by more than 5 places, which traffic sources dropped.
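
The comparison step can be sketched in a few lines. This is a conceptual illustration, not the command's actual mechanism — the metric names, data shapes, and the 5-place threshold are assumptions for the example:

```python
# Sketch of step 4: compare two measurements and flag what changed.
# prev/curr shapes and the threshold are hypothetical, for illustration only.
def compare(prev: dict, curr: dict, position_threshold: float = 5.0) -> list[str]:
    """Return human-readable notes on what moved between two measurements."""
    notes = []
    for metric in ("impressions", "clicks", "users"):
        delta = curr[metric] - prev[metric]
        if delta:
            notes.append(f"{metric}: {delta:+d}")
    for query, pos in curr["positions"].items():
        shift = prev["positions"].get(query, pos) - pos  # positive = moved up
        if abs(shift) > position_threshold:
            notes.append(f"'{query}' shifted {shift:+.0f} places")
    return notes

prev = {"impressions": 12937, "clicks": 187, "users": 217,
        "positions": {"example query": 14.0}}
curr = {"impressions": 13217, "clicks": 192, "users": 221,
        "positions": {"example query": 7.5}}
print(compare(prev, curr))
```

The point is that the "analysis" layer is trivial once the data is in one place; the hard part was always collecting it.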

The whole process takes about 30 seconds. And the key part — I always have a historical log where I can look at trends over any period, not just "how are things today."

Why a command, not a skill

Claude Code has two things that look similar on the surface: skills and slash commands. I've already written about skills as "cheat sheets that load into context when needed." A command works differently. A command is a named prompt that you trigger explicitly. No auto-activation magic: you type /seo-refresh — the command runs. You don't type it — it doesn't run.

For my use case, this is perfect. I don't want Claude to ever decide on its own to check Search Console "just to clarify." I want to press a button that says "build me a report," get the report, and close the tab. Explicit trigger — explicit result.

Plus, the command lives inside the project (.claude/commands/seo-refresh.md), not in system-level skills. It's tied to a specific site, knows its Search Console ID, GA property_id, the list of articles I track, and the queries I care about. Different site — different command with different IDs. No mixing.

Where Claude gets the data

The whole thing runs on two MCP servers:

  • gsc MCP — provides access to the Google Search Console API. Through it, Claude makes a search_analytics request with the right site, date window, and grouping by pages or queries.
  • analytics-mcp — provides access to the Google Analytics Data API. Through it, Claude fetches unique users broken down by sessionSource.
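
For a sense of what those GSC requests look like, here is a sketch of the three payloads from step 1. The field names (`startDate`, `endDate`, `dimensions`, `rowLimit`) follow the Search Console API's `searchanalytics.query` request body; the exact parameter names the gsc MCP server accepts may differ:

```python
from datetime import date, timedelta

# Assumed request shapes for the three parallel GSC queries (step 1).
end = date.today()
start = end - timedelta(days=90)  # roughly the 3-month window

base = {"siteUrl": "sc-domain:hior.ru",
        "startDate": start.isoformat(),
        "endDate": end.isoformat()}

requests = [
    {**base},                                           # overall totals
    {**base, "dimensions": ["page"]},                   # per-page positions
    {**base, "dimensions": ["query"], "rowLimit": 20},  # top 20 queries
]
```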

Set up both servers once — and Claude can work with them from any session going forward. Setup is straightforward: claude mcp add for each server, OAuth authorization in the browser, done. This is actually a nice part of the story on its own — previously, this kind of automation would require writing a Python script, dealing with service accounts, JSON keys, and quotas. Here you just give Claude access once, and it behaves like a regular person with GSC open in a browser.

What the command file looks like

The command itself is a Markdown file with a YAML header, located at .claude/commands/seo-refresh.md:

```markdown
---
description: Pull fresh SEO & traffic metrics from GSC + GA and update metrics-log.md
---

# SEO Metrics Refresh

Pull fresh data from Google Search Console and Google Analytics, then update
`marketing/metrics-log.md` with a new row.

## Data Sources & IDs

- **Google Search Console**: site = `sc-domain:hior.ru`
- **Google Analytics**: property_id = `520460441`
- **Metrics file**: `marketing/metrics-log.md`

## Step 1: Pull GSC data (3-month window)

Use `mcp__gsc__search_analytics` with `siteUrl: "sc-domain:hior.ru"`.
Start date = 3 months ago, end date = today.

Run these 3 queries in parallel:

1. Overall totals (no dimensions) — impressions, clicks, avg position
2. By page (dimensions: page) — indexed page count and article positions
3. By query (dimensions: query, rowLimit: 20) — top query positions

...
```

The remaining steps 2, 3, and 4 follow the same pattern — pulling GA data, updating the log file, outputting the comparison. Nothing clever: a pure recipe that Claude follows top to bottom. The more specific the steps, the less chance it'll go off on a tangent like "let me also check this other thing."

A key trick — telling Claude to run three GSC queries in parallel rather than sequentially. It saves about 10 seconds, but small things like this add up to the feeling of "fast."
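
Why parallelism helps is easy to see with a toy model: three calls that each take about a second finish in about a second total, not three. `fetch` here is a stand-in for the real MCP call, not its actual interface:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(dimensions):
    """Stand-in for one GSC query; sleep simulates network latency."""
    time.sleep(1)
    return {"dimensions": dimensions, "rows": []}

start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    results = list(pool.map(fetch, [None, ["page"], ["query"]]))
elapsed = time.perf_counter() - start
print(f"3 queries in {elapsed:.1f}s")  # ~1s instead of ~3s
```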

What you get as output

After the command finishes, metrics-log.md has a new row in each of the three tables. The first one — overall traffic and indexing:

| Date | Indexed | GSC Impr | GSC Clicks | Avg Pos | GA Users | Google | FB | Direct |
|------|---------|----------|------------|---------|----------|--------|----|--------|
| 2026-04-10 | 27 | 12937 | 187 | 8.1 | 217 | 115 | 2 | 96 |
| 2026-04-11 | 27 | 13217 | 192 | 8.1 | 221 | 114 | 2 | 101 |
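
Because the log is a plain Markdown table, it's also trivially machine-readable. A hypothetical helper like the one below (a sketch, not part of the command) can parse the last two rows and recover the same deltas Claude reports:

```python
# Parse the first metrics table and print day-over-day deltas.
# The column layout matches the table above; the parsing is illustrative.
table = """\
| Date | Indexed | GSC Impr | GSC Clicks | Avg Pos | GA Users | Google | FB | Direct |
| 2026-04-10 | 27 | 12937 | 187 | 8.1 | 217 | 115 | 2 | 96 |
| 2026-04-11 | 27 | 13217 | 192 | 8.1 | 221 | 114 | 2 | 101 |
"""

rows = [[cell.strip() for cell in line.strip("|").split("|")]
        for line in table.strip().splitlines()]
header, prev, curr = rows[0], rows[-2], rows[-1]

for name, a, b in zip(header[1:], prev[1:], curr[1:]):
    delta = float(b) - float(a)
    if delta:
        print(f"{name}: {delta:+g}")  # e.g. "GSC Impr: +280"
```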

The second — blog article positions in the format position (N impressions). The third — key query positions. On top of that, Claude outputs a human-readable summary in the chat: "Impressions +280, clicks +5, positions stable, Google traffic dipped slightly, but Direct +5 — worth investigating where that's coming from." It doesn't just update the file — it thinks out loud about the numbers.

Bonus: chaining into content

The most interesting part starts after the command finishes. The metrics log is already in the session context. You can immediately write: "look at which queries have high impressions but average position 10-15, and suggest how to update the corresponding pages to improve rankings." Claude will go through the data it just collected, open the relevant pages in the repository, look at the meta tags, and suggest edits.

And at that point it stops being an "SEO report" and becomes "thought about SEO, made fixes, deployed." The entire chain — in one session, without switching between browser, editor, and terminal.

When to build a command like this

The rule is roughly this: if you notice yourself repeatedly asking Claude for the same task with the same details (which IDs, which files, which specific queries), that's a candidate for a command. If the task is different every time and only the general direction is the same, that's more of a skill.

For SEO, this is an ideal case because all the parameters are stable: one site, one property_id, the query list changes maybe twice a year, and the log file is always the same. One command covers everything.

Bottom line

/seo-refresh isn't complex automation — it's just a written-down recipe of ten steps that I used to perform by clicking through dashboards. All the infrastructure is already there: the GSC and GA MCP servers handle the most annoying part (authentication and API access), while slash commands in Claude Code handle the rest — explicit triggering and repeatability.

If you have a ritual you perform once a week following the same pattern, turn it into a command. Twenty minutes of setup pays for itself on the very first run.

© 2026 Ivan Bezdenezhnykh. All rights reserved.