v0.3 - public preview

Ask Claude about your Google Search Console.

Sign in with Google. Copy a token. Paste it into Claude, Cursor, or Claude Code. Ask anything about your sites - top queries, decaying pages, ranking trends. GSC PAP is a hosted bridge that exposes your Google Search Console data to AI assistants over the Model Context Protocol. No GCP project, no Python, no JSON files.

OAuth via Google · Read-only Search Console scope · Revoke anytime · Privacy Policy

48
Tool calls served (live)
30
MCP tools available, covering the full GSC API surface
16+
Months of search history: we archive beyond Google's 16-month limit
Any MCP
AI clients supported: Claude Desktop, Cursor, Claude Code

What GSC PAP does

A hosted web service that connects your Google Search Console account to AI assistants over the Model Context Protocol. Read our Privacy Policy for the full data & storage details.

Hosted bridge

Sign in with Google once. We store an encrypted refresh token and call the Search Console API on your behalf when your AI assistant asks a question. No Python process to run, no JSON files to manage.

Read-only data access

We request the webmasters.readonly scope - the same data you see in your own Search Console UI: sites, search analytics, sitemaps, URL inspection. We never modify, delete, or write anything in Search Console.

Model Context Protocol

Claude Desktop, Cursor, and Claude Code call external tools through MCP. We expose your Search Console as a single MCP endpoint with one bearer token you paste into your AI client config.
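In practice, that paste usually amounts to one entry in your client's MCP config file. A sketch of what that entry might look like - the endpoint path (`/mcp`), server name, and exact key names here are illustrative assumptions, since each client documents its own config shape:

```json
{
  "mcpServers": {
    "gsc-pap": {
      "url": "https://gscpap.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_GSC_PAP_TOKEN"
      }
    }
  }
}
```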

Encrypted + revocable

Refresh tokens are AES-256-GCM encrypted at rest. Bearer tokens stored as SHA-256 hashes only. Revoke any token in one click; account deletion is real and final within 30 days.
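Hash-only bearer token storage means we can verify a presented token without ever being able to recover it from the database. A minimal sketch of the pattern (function names are illustrative, not our actual code):

```python
import hashlib
import hmac

def hash_token(token: str) -> str:
    """Reduce a bearer token to its SHA-256 hex digest; only this is stored."""
    return hashlib.sha256(token.encode("utf-8")).hexdigest()

def verify_token(presented: str, stored_hash: str) -> bool:
    """Re-hash the presented token and compare in constant time."""
    return hmac.compare_digest(hash_token(presented), stored_hash)
```

A leaked database row yields only the digest, which cannot be replayed as a token.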

30 tools, one MCP server

Every Search Console + GA4 + PageSpeed capability your AI assistant might need. Click a category to expand. Linking to #tool_name opens the right one automatically.

Browse the full 30-tool catalog

Spot quick SEO wins

2 tools

Surface low-hanging fruit your team can ship this week. Pages ranking just outside the top 10, queries cannibalizing each other.

find_quick_wins

Identify pages ranking positions 11 to 20 with high impressions: the easiest promote-to-top-10 candidates in your account.
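The underlying logic is a simple filter over Search Analytics rows. A hedged sketch (the row shape and impression threshold are illustrative assumptions):

```python
def find_quick_wins(rows, min_impressions=500):
    """Filter Search Analytics rows to pages ranked 11-20 with high impressions."""
    wins = [r for r in rows
            if 11 <= r["position"] <= 20 and r["impressions"] >= min_impressions]
    # Highest-impression candidates first: the biggest upside if promoted to page 1.
    return sorted(wins, key=lambda r: r["impressions"], reverse=True)
```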

find_cannibalization

Detect keyword cannibalization where multiple URLs compete for the same query, splitting click-through and confusing Google.
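Conceptually this is a group-by on query: any query where two or more distinct URLs earn impressions is a cannibalization candidate. A minimal sketch under that assumption:

```python
from collections import defaultdict

def find_cannibalization(rows, min_pages=2):
    """Group rows by query; flag queries where 2+ URLs compete."""
    pages_by_query = defaultdict(set)
    for r in rows:
        pages_by_query[r["query"]].add(r["page"])
    return {q: sorted(p) for q, p in pages_by_query.items() if len(p) >= min_pages}
```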

Find what's losing traffic

3 tools

Catch decaying pages and ranking drops before they tank your monthly traffic numbers. Compare any two periods, track positions over time.

find_content_decay

Identify pages losing organic traffic over time so you can refresh the content before rankings collapse.
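One way to frame decay detection is comparing per-page clicks across two equal windows and flagging large drops. A sketch with an assumed 25% threshold (the real tool's heuristics may differ):

```python
def find_content_decay(recent, prior, threshold=0.25):
    """Flag pages whose clicks dropped by `threshold` (25%) or more.

    `recent` and `prior` map page URL -> clicks for two equal-length windows.
    """
    decaying = []
    for page, old_clicks in prior.items():
        if old_clicks == 0:
            continue
        drop = (old_clicks - recent.get(page, 0)) / old_clicks
        if drop >= threshold:
            decaying.append((page, round(drop, 2)))
    # Worst decay first, so the refresh queue is already prioritized.
    return sorted(decaying, key=lambda t: t[1], reverse=True)
```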

compare_periods

Diff any two date ranges (e.g. last 28 days vs prior 28) to spot wins and losses across queries, pages, or countries.
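The diff itself reduces to absolute and percentage deltas per metric. A minimal sketch (metric names and dict shape are illustrative):

```python
def compare_periods(current: dict, previous: dict) -> dict:
    """Absolute and percentage delta for each metric in the current window."""
    out = {}
    for metric, now in current.items():
        before = previous.get(metric, 0)
        delta = now - before
        # Percentage is undefined when the prior window had zero.
        pct = (delta / before * 100) if before else None
        out[metric] = {"delta": delta, "pct": pct}
    return out
```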

Beat the 16-month limit

2 tools

Google Search Console only retains 16 months of search data. GSC PAP runs a per-account archive so you can query years of history through Claude or Cursor.

search_analytics_archive

Query your full GSC history beyond the 16-month limit. Same dimensions and filters as the live API, but unlocked for long-term trend analysis.

archive_status

Check archive coverage per site: earliest date, latest date, row count, last successful run.

Diagnose performance & Core Web Vitals

2 tools

Run a Lighthouse audit or pull real-user (CrUX) Core Web Vitals for any public URL, right from your AI client. Lab data plus actual visitor metrics, with the top optimization opportunities ranked by potential time savings.

pagespeed_insights

Run a Google PageSpeed Insights (Lighthouse) audit. Returns performance / accessibility / best-practices / SEO scores, Core Web Vitals lab values (LCP, FCP, CLS, TBT), and the top opportunities ranked by potential savings.

pagespeed_field_data

Pull real-user Core Web Vitals from the Chrome User Experience Report (CrUX). LCP, INP, CLS at the 75th percentile, plus FAST/AVERAGE/SLOW rating, for both URL and origin level.
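The FAST/AVERAGE/SLOW rating follows from comparing each p75 value against Google's published Core Web Vitals thresholds (good / needs-improvement / poor: LCP 2.5s / 4s, INP 200ms / 500ms, CLS 0.1 / 0.25). A sketch of that classification; the label names mirror this page, not necessarily the raw API response:

```python
# Google's published Core Web Vitals thresholds: (good_max, poor_min).
THRESHOLDS = {"lcp_ms": (2500, 4000), "inp_ms": (200, 500), "cls": (0.1, 0.25)}

def rate_metric(name: str, p75: float) -> str:
    """Classify a 75th-percentile value against the public CWV thresholds."""
    good, poor = THRESHOLDS[name]
    if p75 <= good:
        return "FAST"
    if p75 <= poor:
        return "AVERAGE"
    return "SLOW"
```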

Diagnose indexing issues

3 tools

Find out exactly why pages aren't indexed. URL Inspection API in bulk, full coverage summary, single-URL deep dives without tab-switching the GSC UI.

get_index_coverage_summary

Get an account-wide breakdown of indexing states (indexed, crawled-not-indexed, discovered-not-indexed) so you know where to focus.

inspect_url

Single-URL deep dive: indexability, last crawl, canonical, mobile usability, structured data. The full URL Inspection API result.

batch_inspect_urls

Bulk URL Inspection up to 2,000 URLs at once. Diagnose why an entire content cluster or sitemap isn't getting indexed.

Pull performance data

3 tools

All the Search Analytics API capabilities, exposed natively to AI assistants. Skip the Google Cloud project setup, the OAuth dance, and the JSON parsing.

get_performance_overview

High-level performance snapshot for a site: clicks, impressions, CTR, position over the last N days.

search_analytics_query

Run any Search Analytics query with full dimension and filter support, identical to what you'd hit the GSC API for, but inline in chat.
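For reference, the request body this maps onto is the standard Search Console `searchAnalytics.query` shape (`startDate`, `endDate`, `dimensions`, `rowLimit`, optional `dimensionFilterGroups`). A sketch of building such a body; the helper name and defaults are illustrative:

```python
def build_search_analytics_body(start, end, dimensions=("query",),
                                row_limit=1000, filters=None):
    """Request body in the shape of Search Console's searchAnalytics.query."""
    body = {
        "startDate": start,   # "YYYY-MM-DD"
        "endDate": end,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }
    if filters:
        # e.g. [{"dimension": "country", "operator": "equals", "expression": "usa"}]
        body["dimensionFilterGroups"] = [{"filters": filters}]
    return body
```

With GSC PAP you never send this body yourself; the assistant does, and you just ask the question.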

get_search_by_page_query

Drill into a specific page to see which queries drive its traffic, impressions, and average position.

Connect search to engagement & revenue (GA4)

10 tools

Pair GSC discovery with GA4 conversion data: sessions, real-user engagement, top landing pages, conversions and revenue by source. The killer cross-tool, correlate_gsc_to_ga4, joins GSC clicks per page to GA4 sessions, conversions, and revenue so you can see which top-ranking pages actually convert.

list_ga4_properties

List every Google Analytics 4 account and property the user can access. Always call first to discover the property_id needed for the other GA4 tools.

ga4_overview

High-level GA4 snapshot for a property: active users, sessions, new users, page views, bounce rate, average session duration, engagement rate, plus a per-day breakdown.

ga4_top_pages

Top landing pages by sessions over the last N days. One row per pagePath with sessions, active users, page views, bounce rate, average session duration.

ga4_traffic_sources

Acquisition breakdown by channel, source, medium, or campaign. Rows with sessions, active users, conversions per bucket. Choose the dimension to slice by.

ga4_realtime

Active users in the last 30 minutes broken down by country, device, city, current page, or minutes-ago. Plus the total active count.

ga4_conversions_by_source

Conversion attribution by source × medium with sessions, conversions, total revenue, and purchase revenue. Sorted by conversions to surface the channels driving outcomes.

ga4_user_demographics

Audience snapshot: top countries, device categories (desktop/mobile/tablet), and browsers. Useful for segmenting reports or identifying mobile-vs-desktop performance gaps.

ga4_engagement

Per-page engagement metrics. Engaged sessions, engagement rate, average session duration, bounce rate, event count. Spot pages that bring traffic but lose users.

ga4_compare_periods

Diff any two windows for any GA4 metric (sessions, conversions, revenue, etc.). Week-over-week, month-over-month, with optional dimension slice. Returns absolute and percentage delta.

correlate_gsc_to_ga4

Cross-tool killer: join GSC clicks per page with GA4 sessions, conversions, and revenue per page. Answer 'which top-ranking organic pages actually convert?' in one tool call instead of two manual exports.
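At its core this is a join on page path between the two datasets. A minimal sketch (field names are illustrative, not the tool's exact output schema):

```python
def correlate_gsc_to_ga4(gsc_rows, ga4_rows):
    """Join GSC clicks with GA4 sessions/conversions/revenue on page path."""
    ga4 = {r["pagePath"]: r for r in ga4_rows}
    joined = []
    for r in gsc_rows:
        g = ga4.get(r["page"], {})  # pages with no GA4 row get zeros
        joined.append({
            "page": r["page"],
            "clicks": r["clicks"],
            "sessions": g.get("sessions", 0),
            "conversions": g.get("conversions", 0),
            "revenue": g.get("revenue", 0.0),
        })
    # Surface the pages that actually convert, not just the ones that rank.
    return sorted(joined, key=lambda x: x["conversions"], reverse=True)
```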

Manage sites & sitemaps

5 tools

List your verified properties, audit sitemap health, find duplicate property registrations that split your reporting.

list_sites

List every Search Console property you have access to, with verification state and permission level.

get_site_details

Permission level, verification state, and metadata for a single site URL.

list_sitemaps

List all sitemaps submitted for a property along with last download time and status.

get_sitemap_details

Per-sitemap detail: index status, errors, warnings, total URLs submitted vs indexed.

detect_duplicate_properties

Find duplicate Search Console properties (e.g. http vs https, www vs apex) that split your reporting and waste verification slots.
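The detection amounts to normalizing each property URL (drop the scheme, strip a leading `www.`) and grouping. A sketch of that normalization for URL-prefix properties; domain (`sc-domain:`) properties are ignored here for simplicity:

```python
from collections import defaultdict
from urllib.parse import urlparse

def detect_duplicate_properties(site_urls):
    """Group property URLs that differ only in scheme or a www. prefix."""
    groups = defaultdict(list)
    for url in site_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        groups[host].append(url)
    return {h: urls for h, urls in groups.items() if len(urls) > 1}
```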

Who uses GSC PAP?

Anyone whose work touches Google Search Console and who already uses an AI assistant. The product makes the second tool answer questions about the first one.

SEO consultants

Audit client sites in Claude Desktop. Run quick-wins, find cannibalization, spot decaying pages without opening a separate dashboard for each client.

In-house SEO + content teams

Ask "what's broken this week" in your team chat without anyone touching the Search Console UI. Pipe answers into Slack via your AI client.

Solo developers

Connect your personal site. Get AI-driven SEO insights inline while you write content or fix code in Cursor / Claude Code.

Agencies

Run cross-client SEO reports through one Claude session. Each user keeps their own scoped GSC PAP token; no shared dashboards or seat licenses.

How it works

Three steps, sixty seconds. No GCP project setup, no client_id JSON, no Python venv.

01

Sign in with Google

Standard OAuth, read-only Search Console scope. Your refresh token is encrypted at rest with AES-256-GCM.

02

Get your token

We mint a long-lived bearer token, shown once. Copy it into your MCP client config - Claude Desktop, Cursor, or anything else.

03

Ask anything

"What are my decaying queries?" "Find cannibalization on this domain." 30 tools, real data, real time.

Built for technical users who care about access

We built this because we were tired of the OAuth client setup dance, and because read access to your search data deserves real guardrails.

Your data, yours alone

Your Search Console archive is visible only to you. No other GSC PAP user can see it - ever.

Revoke anytime

One click revokes your token at Google + in our DB. Account deletion is real and permanent within 30 days.

Encrypted at rest

Refresh tokens are AES-256-GCM encrypted with envelope keys. Bearer tokens stored as SHA-256 hashes only.

Read-only by design

We only hold the read-only Google scope. Even if our service were compromised, no one could edit or delete anything in your Search Console.

vs self-hosting

Self-host is great for ops teams. For everyone else, the GCP setup kills momentum before the first query.

Self-host

Run gsc-pap-server on your own infra

  • Set up a Google Cloud project
  • Configure an OAuth web client
  • Manage a Python venv + dependencies
  • Edit JSON config files for each AI client
  • Maintain Postgres + Redis + Cloudflare Tunnel
Time to first query: ~45 minutes
Recommended

GSC PAP (hosted)

Free, plug & play, no infra

  • No Google Cloud project
  • No OAuth client config
  • No Python or dependencies to manage
  • No JSON files (one paste in your AI client)
  • No Postgres, no Redis, no infra ops
Time to first query: 60 seconds

Common questions

Quick answers. The full FAQ covers setup, privacy, supported clients, limits, and more.

Do I need a Google Cloud project to use GSC PAP?

No. The whole reason GSC PAP exists is to remove the Google Cloud setup step. When you sign in with Google through gscpap.com, you grant permission to our OAuth client (which we registered with Google one time). We then call the Search Console API on your behalf using your refresh token. You never touch the Google Cloud Console.

What Search Console data can GSC PAP access?

GSC PAP requests the webmasters.readonly OAuth scope - the same read-only level you would grant to any third-party Search Console viewer. Specifically we can read your verified site list, search analytics (clicks, impressions, CTR, position by query, page, country, device, date), sitemap submission status, and URL inspection results. This is exactly the data you see in your own Search Console UI.

Is GSC PAP free? What are the limits?

Yes, GSC PAP is completely free during the public preview (v0.x). There is no credit card required, no trial expiration, and no premium feature gate. The only limits are technical ones to protect the service: 60 MCP tool calls per minute and 1,000 per hour per user. These limits are generous - typical AI assistant usage is 5-20 calls per question, so you can ask hundreds of questions per hour without hitting them.

How far back does my data go?

Google Search Console itself only retains the most recent 16 months of data. Anything older drops off Google's side and cannot be recovered, even with API access - this is a hard limit Google enforces on every Search Console user, including direct users of the official Google Search Console UI.

Ready in 60 seconds.

Sign in with Google, copy your token, paste it into Claude.

Connect Google account