Diagnose indexing issues with bulk URL Inspection
Google's URL Inspection API supports programmatic querying, with a quota of 2,000 inspections per property per day, but the Search Console UI only ever inspects one URL at a time. GSC PAP's `batch_inspect_urls` tool unlocks that full quota in a single MCP call.
Combined with `get_index_coverage_summary` (account-wide indexing-state breakdown) and single-URL `inspect_url` for deep dives, you get a complete indexing-diagnostic toolkit in chat.
The scenario
Half your new blog posts aren't showing up on Google. Search Console's URL Inspection works on one URL at a time, manually. With 200+ posts in a content cluster, that's 200 clicks, 200 wait-for-results, 200 copy-pastes. You want a one-shot bulk audit that tells you which specific URLs are crawled-not-indexed, which are discovered-not-indexed, and which fail Mobile Usability.
The prompt
Use the gsc-pap MCP. For https://example.com, first call get_index_coverage_summary to see the high-level distribution of indexing states. Then list the top 50 most-recently-published URLs in /blog/* (you can ask me to provide these or pull them from a sitemap). Run batch_inspect_urls on those URLs. Group results by `indexingResult.coverageState`: how many are INDEXED vs CRAWLED_NOT_INDEXED vs DISCOVERED_NOT_INDEXED vs other? For any not indexed, give me the URL + the specific verdict + the most likely fix.
What happens
1. AI gets the account-wide picture first
`get_index_coverage_summary` returns counts of pages in each indexing state across all your verified properties. This sets the baseline: if you have 10k pages and 200 are unindexed, that's normal; if you have 500 and 200 are unindexed, that's a content-engineering problem.
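The baseline arithmetic above is trivial but worth making explicit. A minimal sketch, assuming a summary response shaped as state-name → page-count (the field names here are illustrative, not the tool's exact schema):

```python
# Hypothetical get_index_coverage_summary counts for a 500-page site;
# the real MCP tool's response schema may differ.
summary = {
    "INDEXED": 300,
    "CRAWLED_NOT_INDEXED": 150,
    "DISCOVERED_NOT_INDEXED": 50,
}

total = sum(summary.values())          # 500 pages overall
unindexed = total - summary["INDEXED"]  # 200 pages not indexed
share = unindexed / total

print(f"{unindexed}/{total} pages unindexed ({share:.0%})")
```

At 40% unindexed on a 500-page site, this is the "content-engineering problem" case; the same 200 unindexed pages on a 10k-page site would be under 2% and likely background noise.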
2. AI runs batch_inspect_urls on the candidate set
You provide the URL list (from a sitemap, a recent publish list, or a content cluster). The tool batches up to 2,000 URLs in a single call. Results include lastCrawlTime, indexingState, mobile usability, and structured-data validity, all per URL.
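The grouping step the prompt asks for is a straightforward bucket-and-count. A sketch under the assumption that each per-URL result carries a `coverageState` field matching the enum-style values used in the prompt (real `batch_inspect_urls` field names may differ):

```python
from collections import Counter, defaultdict

# Hypothetical per-URL results in the shape the prompt asks for.
results = [
    {"url": "https://example.com/blog/a", "coverageState": "INDEXED"},
    {"url": "https://example.com/blog/b", "coverageState": "CRAWLED_NOT_INDEXED"},
    {"url": "https://example.com/blog/c", "coverageState": "CRAWLED_NOT_INDEXED"},
    {"url": "https://example.com/blog/d", "coverageState": "DISCOVERED_NOT_INDEXED"},
]

# Bucket URLs by coverage state, and count the size of each bucket.
buckets = defaultdict(list)
for r in results:
    buckets[r["coverageState"]].append(r["url"])

counts = Counter(r["coverageState"] for r in results)
```

This is exactly the shape the AI produces in chat: one bucket per coverage state, with the member URLs listed under each.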
3. AI clusters and prescribes
Output is the bucketed list with a one-line fix per pattern: discovered-not-indexed → likely thin content or weak internal linking; crawled-not-indexed → likely a quality signal; mobile-usability fail → a specific viewport, tap-target, or font-size issue. The AI prioritises by URL importance (you can tell it 'these are revenue pages').
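The pattern-to-fix mapping above can be sketched as a simple lookup. The keys mirror the coverage states used in the prompt and the fixes are heuristic rules of thumb, not definitive diagnoses:

```python
# Heuristic first-fix lookup; keys and remedies are illustrative.
LIKELY_FIX = {
    "DISCOVERED_NOT_INDEXED": "thin content or weak internal linking: add links from indexed hub pages",
    "CRAWLED_NOT_INDEXED": "likely a quality signal: deepen or consolidate the page",
    "MOBILE_USABILITY_FAIL": "check the viewport meta tag, tap-target spacing, and font sizes",
}

def prescribe(state: str) -> str:
    """Return the most common first fix for a coverage state."""
    return LIKELY_FIX.get(state, "inspect individually with inspect_url")
```

Anything outside the known patterns falls through to a single-URL `inspect_url` deep dive, which matches the toolkit described at the top of this guide.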
Outcome
Most content sites discover one repeating pattern (e.g. 80% of unindexed posts share the same low-link, thin-intro template). Fix the template, re-request indexing, then re-run the bulk inspection a week later and watch the bucket empty. What used to be a 'we'll get to it eventually' problem becomes a 30-minute weekend project.
Where to take it next
- Pair with `list_sitemaps` + `get_sitemap_details` to confirm the URLs are at least submitted via sitemap.
- After fixes ship, re-run batch_inspect_urls to confirm the indexing state moved.
- Build a Claude Code script that re-runs this audit every two weeks and emails you only the diff.
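The diff step of that recurring audit is the only non-obvious part. A sketch, assuming each audit run is saved as a URL → coverage-state mapping (the state values and URL paths here are made up for illustration):

```python
def diff_runs(prev: dict, curr: dict) -> dict:
    """Return URLs whose coverage state changed between two audit runs,
    mapped to (old_state, new_state). New URLs show old_state as None."""
    return {u: (prev.get(u), s) for u, s in curr.items() if prev.get(u) != s}

# Example: one post got indexed since the last run, one new post appeared.
prev = {"/blog/a": "CRAWLED_NOT_INDEXED", "/blog/b": "INDEXED"}
curr = {
    "/blog/a": "INDEXED",
    "/blog/b": "INDEXED",
    "/blog/c": "DISCOVERED_NOT_INDEXED",
}
```

Emailing only `diff_runs(prev, curr)` keeps the biweekly report empty when nothing changed, which is exactly the signal-over-noise property you want from a scheduled audit.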
Other guides
Find your fastest SEO quick wins with Claude
Spot pages stuck on page 2 of Google with high impressions but low clicks. The fastest organic traffic gains are usually a single sentence, title tag, or meta description away.
Detect keyword cannibalization across your site
Two of your pages are competing for the same query - splitting clicks, dragging both rankings down. Find them in one MCP call and decide which to merge, redirect, or differentiate.
Correlate organic search clicks with GA4 conversions
Find the pages that bring search clicks but don't convert, and the ones that do. One tool call joins Search Console clicks per page with GA4 sessions, conversions, and revenue per page.
Audit Core Web Vitals across your site with AI
Run Lighthouse audits and pull real-user (CrUX) Core Web Vitals data for any URL through Claude or Cursor. Surface LCP, INP, CLS regressions and the top opportunities to fix them.
Query Search Console data older than Google's 16-month limit
Google retains Search Console data for only 16 months. GSC PAP archives every authorised property continuously so you can query years of history through Claude or Cursor, same dimensions, no truncation.
Build a weekly SEO health report with Claude
One prompt that pulls performance, quick wins, decaying pages, ranking shifts, GA4 conversions, and CWV regressions into a single ready-to-share report. Run it every Monday.
Track keyword ranking trends across years (not months)
Plot how a single query's average position has moved across the last 24-36 months. Spot the algorithm-update spike, the AI Overview-driven decline, the slow consolidation toward your canonical page.