Sign in with Google. Copy a token. Paste it into Claude, Cursor, or Claude Code. Ask anything about your sites - top queries, decaying pages, ranking trends. GSC PAP is a hosted bridge that exposes your Google Search Console data to AI assistants over the Model Context Protocol. No GCP project, no Python, no JSON files.
OAuth via Google · Read-only Search Console scope · Revoke anytime · Privacy Policy
A hosted web service that connects your Google Search Console account to AI assistants over the Model Context Protocol. Read our Privacy Policy for the full data & storage details.
Sign in with Google once. We store an encrypted refresh token and call the Search Console API on your behalf when your AI assistant asks a question. No Python process to run, no JSON files to manage.
We request the webmasters.readonly scope - the same data you see in your own Search Console UI: sites, search analytics, sitemaps, URL inspection. We never modify, delete, or write anything in Search Console.
Claude Desktop, Cursor, and Claude Code call external tools through MCP. We expose your Search Console as a single MCP endpoint with one bearer token you paste into your AI client config.
Refresh tokens are AES-256-GCM encrypted at rest. Bearer tokens stored as SHA-256 hashes only. Revoke any token in one click; account deletion is real and final within 30 days.
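The bearer-token half of this can be sketched in a few lines. Everything below is illustrative, not the service's actual code: `_DB` stands in for whatever store is really used, and the function names are invented for the example.

```python
import hashlib
import hmac
import secrets

_DB: set[str] = set()  # stand-in for the real token store


def mint_token() -> str:
    """Generate a bearer token; persist only its SHA-256 hash."""
    token = secrets.token_urlsafe(32)  # shown to the user once
    _DB.add(hashlib.sha256(token.encode()).hexdigest())
    return token


def verify_token(token: str) -> bool:
    """Check a presented token against stored hashes; plaintext is never stored."""
    digest = hashlib.sha256(token.encode()).hexdigest()
    return any(hmac.compare_digest(digest, stored) for stored in _DB)
```

Because only the hash is stored, a database leak does not yield usable tokens.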
Every Search Console + GA4 + PageSpeed capability your AI assistant might need. Click a category to expand. Linking to #tool_name opens the right one automatically.
Surface low-hanging fruit your team can ship this week. Pages ranking just outside the top 10, queries cannibalizing each other.
Identify pages ranking positions 11 to 20 with high impressions: the easiest promote-to-top-10 candidates in your account.
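The selection logic behind this is simple to sketch. Assume rows shaped like Search Analytics API output with `page`, `position`, and `impressions` keys; the function and threshold below are illustrative, not the tool's actual implementation.

```python
def find_quick_wins(rows, min_impressions=100):
    """Pages ranking 11-20 with enough impressions to be worth promoting."""
    wins = [r for r in rows
            if 11 <= r["position"] <= 20 and r["impressions"] >= min_impressions]
    # Highest-impression candidates first: most upside per position gained
    return sorted(wins, key=lambda r: r["impressions"], reverse=True)
```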
Detect keyword cannibalization where multiple URLs compete for the same query, splitting click-through and confusing Google.
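Detection boils down to grouping Search Analytics rows by query and flagging queries served by more than one URL. A minimal sketch, again with an assumed row shape rather than the tool's real code:

```python
from collections import defaultdict


def find_cannibalization(rows, min_pages=2):
    """Flag queries where multiple URLs compete, splitting clicks between them."""
    by_query = defaultdict(set)
    for r in rows:
        by_query[r["query"]].add(r["page"])
    return {q: sorted(pages) for q, pages in by_query.items()
            if len(pages) >= min_pages}
```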
Catch decaying pages and ranking drops before they tank your monthly traffic numbers. Compare any two periods, track positions over time.
Identify pages losing organic traffic over time so you can refresh the content before rankings collapse.
Plot a query's position over the last N days to see whether you're trending up, down, or volatile.
Diff any two date ranges (e.g. last 28 days vs prior 28) to spot wins and losses across queries, pages, or countries.
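The diff itself is a per-key comparison of two metric maps (e.g. clicks per query for each window). A hedged sketch of that shape, not the tool's actual output format:

```python
def diff_periods(current, prior):
    """Compare one metric dict across two windows; report absolute and % delta."""
    deltas = {}
    for key in set(current) | set(prior):
        before, after = prior.get(key, 0), current.get(key, 0)
        pct = (after - before) / before * 100 if before else None  # None: new entry
        deltas[key] = {"prior": before, "current": after,
                       "delta": after - before, "pct": pct}
    return deltas
```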
Google Search Console only retains 16 months of search data. GSC PAP runs a per-account archive so you can query years of history through Claude or Cursor.
Query your full GSC history beyond the 16-month limit. Same dimensions and filters as the live API, but unlocked for long-term trend analysis.
Check archive coverage per site: earliest date, latest date, row count, last successful run.
Run a Lighthouse audit or pull real-user (CrUX) Core Web Vitals for any public URL, right from your AI client. Lab data plus actual visitor metrics, with the top optimization opportunities ranked by potential time savings.
Run a Google PageSpeed Insights (Lighthouse) audit. Returns performance / accessibility / best-practices / SEO scores, Core Web Vitals lab values (LCP, FCP, CLS, TBT), and the top opportunities ranked by potential savings.
Pull real-user Core Web Vitals from the Chrome User Experience Report (CrUX). LCP, INP, CLS at the 75th percentile, plus FAST/AVERAGE/SLOW rating, for both URL and origin level.
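The rating buckets follow the published Core Web Vitals thresholds (good/poor boundaries of 2.5s/4s for LCP, 200ms/500ms for INP, 0.1/0.25 for CLS). The function below is an illustration of that classification, not CrUX's exact implementation:

```python
# Good / poor boundaries per Core Web Vitals guidance
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "INP": (200, 500),     # milliseconds
    "CLS": (0.10, 0.25),   # unitless layout-shift score
}


def rate(metric, p75):
    """Classify a 75th-percentile value as FAST / AVERAGE / SLOW."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "FAST"
    return "AVERAGE" if p75 <= poor else "SLOW"
```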
Find out exactly why pages aren't indexed. URL Inspection API in bulk, full coverage summary, single-URL deep dives without tab-switching the GSC UI.
Get an account-wide breakdown of indexing states (indexed, crawled-not-indexed, discovered-not-indexed) so you know where to focus.
Single-URL deep dive: indexability, last crawl, canonical, mobile usability, structured data. The full URL Inspection API result.
Bulk URL Inspection up to 2,000 URLs at once. Diagnose why an entire content cluster or sitemap isn't getting indexed.
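If you have more than 2,000 URLs, split them into batches under the cap before calling the tool. A tiny helper (illustrative, not part of the service):

```python
def chunk_urls(urls, batch_size=2000):
    """Split a URL list into batches no larger than the bulk-inspection cap."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]
```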
All the Search Analytics API capabilities, exposed natively to AI assistants. Skip the Google Cloud project setup, the OAuth dance, and the JSON parsing.
High-level performance snapshot for a site: clicks, impressions, CTR, position over the last N days.
Run any Search Analytics query with full dimension and filter support, identical to what you'd hit the GSC API for, but inline in chat.
Drill into a specific page to see which queries drive its traffic, impressions, and average position.
Pair GSC discovery with GA4 conversion data. Sessions, real-user engagement, top landing pages, conversions and revenue by source. The killer cross-tool join links GSC clicks per page to GA4 sessions, conversions, and revenue so you can see which top-ranking pages actually convert.
List every Google Analytics 4 account and property the user can access. Always call first to discover the property_id needed for the other GA4 tools.
High-level GA4 snapshot for a property: active users, sessions, new users, page views, bounce rate, average session duration, engagement rate, plus a per-day breakdown.
Top landing pages by sessions over the last N days. One row per pagePath with sessions, active users, page views, bounce rate, average session duration.
Acquisition breakdown by channel, source, medium, or campaign. Rows with sessions, active users, conversions per bucket. Choose the dimension to slice by.
Active users in the last 30 minutes broken down by country, device, city, current page, or minutes-ago. Plus the total active count.
Conversion attribution by source × medium with sessions, conversions, total revenue, and purchase revenue. Sorted by conversions to surface the channels driving outcomes.
Audience snapshot: top countries, device categories (desktop/mobile/tablet), and browsers. Useful for segmenting reports or identifying mobile-vs-desktop performance gaps.
Per-page engagement metrics. Engaged sessions, engagement rate, average session duration, bounce rate, event count. Spot pages that bring traffic but lose users.
Diff any two windows for any GA4 metric (sessions, conversions, revenue, etc.). Week-over-week, month-over-month, with optional dimension slice. Returns absolute and percentage delta.
Cross-tool killer: join GSC clicks per page with GA4 sessions, conversions, and revenue per page. Answer 'which top-ranking organic pages actually convert?' in one tool call instead of two manual exports.
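Conceptually this is a left join on page path: GSC rows keyed by `page`, GA4 rows keyed by `pagePath`. The sketch below shows the shape of that join under assumed row formats; it is not the tool's actual implementation.

```python
def join_gsc_ga4(gsc_rows, ga4_rows):
    """Join GSC clicks to GA4 sessions/conversions/revenue on page path."""
    ga4_by_page = {r["pagePath"]: r for r in ga4_rows}
    joined = []
    for r in gsc_rows:
        ga4 = ga4_by_page.get(r["page"], {})
        joined.append({
            "page": r["page"],
            "clicks": r["clicks"],
            "position": r["position"],
            "sessions": ga4.get("sessions", 0),
            "conversions": ga4.get("conversions", 0),
            "revenue": ga4.get("revenue", 0.0),
        })
    # Most-clicked pages first: surfaces high-traffic pages that fail to convert
    return sorted(joined, key=lambda r: r["clicks"], reverse=True)
```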
List your verified properties, audit sitemap health, find duplicate property registrations that split your reporting.
List every Search Console property you have access to, with verification state and permission level.
Permission level, verification state, and metadata for a single site URL.
List all sitemaps submitted for a property along with last download time and status.
Per-sitemap detail: index status, errors, warnings, total URLs submitted vs indexed.
Find duplicate Search Console properties (e.g. http vs https, www vs apex) that split your reporting and waste verification slots.
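Duplicates fall out of normalizing each URL-prefix property to its bare host. A sketch of that normalization (illustrative; domain properties like `sc-domain:example.com` already cover all scheme/www variants and are skipped here):

```python
from collections import defaultdict
from urllib.parse import urlparse


def find_duplicate_properties(site_urls):
    """Group URL-prefix properties that differ only by scheme or www."""
    groups = defaultdict(list)
    for url in site_urls:
        if not url.startswith(("http://", "https://")):
            continue  # skip domain properties (sc-domain:...)
        host = urlparse(url).netloc.removeprefix("www.")
        groups[host].append(url)
    return {host: urls for host, urls in groups.items() if len(urls) > 1}
```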
Anyone whose work touches Google Search Console and who already uses an AI assistant. GSC PAP lets the assistant answer questions about your Search Console data directly.
Audit client sites in Claude Desktop. Run quick-wins, find cannibalization, spot decaying pages without opening a separate dashboard for each client.
Ask "what's broken this week" in your team chat without anyone touching the Search Console UI. Pipe answers into Slack via your AI client.
Connect your personal site. Get AI-driven SEO insights inline while you write content or fix code in Cursor / Claude Code.
Run cross-client SEO reports through one Claude session. Each user keeps their own scoped GSC PAP token; no shared dashboards or seat licenses.
Three steps, sixty seconds. No GCP project setup, no client_id JSON, no Python venv.
Standard OAuth, read-only Search Console scope. Your refresh token is encrypted at rest with AES-256-GCM.
We mint a long-lived bearer token, shown once. Copy it into your MCP client config - Claude Desktop, Cursor, or anything else.
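A remote-server entry in a client config typically looks something like this; the server name, endpoint path, and token below are placeholders, and the exact shape varies by client, so check your client's MCP documentation:

```json
{
  "mcpServers": {
    "gsc-pap": {
      "url": "https://gscpap.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN_HERE"
      }
    }
  }
}
```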
"What are my decaying queries?" "Find cannibalization on this domain." 15 GSC tools, real data, real time.
We built this because we were tired of the OAuth client setup dance.
Your Search Console archive is visible only to you. No other GSC PAP user can see it - ever.
One click revokes your token at Google + in our DB. Account deletion is real and permanent within 30 days.
Refresh tokens are AES-256-GCM encrypted with envelope keys. Bearer tokens stored as SHA-256 hashes only.
We only hold the read-only Google scope. Even if our service were compromised, an attacker could not edit or delete anything in your Search Console.
Self-host is great for ops teams. For everyone else, the GCP setup kills momentum before the first query.
Self-hosted: run gsc-pap-server on your own infra.
Hosted (GSC PAP): free, plug & play, no infra.
Quick answers. The full FAQ covers setup, privacy, supported clients, limits, and more.
No. The whole reason GSC PAP exists is to remove the Google Cloud setup step. When you sign in with Google through gscpap.com, you grant permission to our OAuth client (which we registered with Google one time). We then call the Search Console API on your behalf using your refresh token. You never touch the Google Cloud Console.
GSC PAP requests the webmasters.readonly OAuth scope - the same read-only level you would grant to any third-party Search Console viewer. Specifically we can read your verified site list, search analytics (clicks, impressions, CTR, position by query, page, country, device, date), sitemap submission status, and URL inspection results. This is exactly the data you see in your own Search Console UI.
Yes, GSC PAP is completely free during the public preview (v0.x). There is no credit card required, no trial expiration, and no premium feature gate. The only limits are technical ones to protect the service: 60 MCP tool calls per minute and 1,000 per hour per user. These limits are generous - typical AI assistant usage is 5-20 calls per question, so you can ask hundreds of questions per hour without hitting them.
Google Search Console itself only retains the most recent 16 months of data. Anything older drops off Google's side and cannot be recovered, even with API access - this is a hard limit Google enforces on every Search Console user, including direct users of the official Search Console UI. GSC PAP's per-account archive works around it: the archive runs regularly, so your history keeps accumulating beyond the 16-month window.
Sign in with Google, copy your token, paste it into Claude.
Connect Google account