Why Is Google Search Console Showing Errors? Every Error and Every Fix (2026)

Red numbers in Search Console. Here's exactly what every error means — and how to fix each one.

You opened Google Search Console expecting good news. Instead, red numbers are everywhere. Coverage errors. Core Web Vitals failures. Mobile usability issues. Crawl anomalies.

Most bloggers see these errors and do one of two things. They panic and start changing everything at once — making the problem worse. Or they ignore the errors completely — assuming they're not important enough to affect anything.

Both responses are wrong.

Search Console errors are Google talking directly to you — telling you specifically what's preventing your blog from performing at its best. Each error has a specific meaning. Each meaning has a specific fix. And fixing the right errors in the right order produces measurable improvements in rankings, traffic, and indexing within weeks.

This guide explains every common Search Console error — in plain English — and gives you the exact fix for each one.

Context: Search Console errors are one of ten specific problems that stall a blog's growth. See our complete blog diagnosis guide to identify every issue affecting your blog simultaneously.

First — Navigate Search Console Correctly

Before diagnosing specific errors, understand where to find them.

Search Console Dashboard → Key Sections:

  • Performance: clicks, impressions, CTR, position. Check weekly.
  • URL Inspection: individual page indexing status. Check when troubleshooting.
  • Coverage: indexed vs excluded vs error pages. Check weekly.
  • Sitemaps: sitemap submission and processing. Check monthly.
  • Core Web Vitals: real-world page speed data. Check monthly.
  • Mobile Usability: mobile display problems. Check monthly.
  • Manual Actions: Google penalties. Check immediately if traffic drops.
  • Security Issues: hacking or malware. Check immediately if flagged.

Start with Coverage. It's the most critical section for diagnosing indexing and crawling problems — the foundation of everything else.

Coverage Errors — The Most Important Section

Error 1 — "Submitted URL Not Found (404)"

What it means: You submitted a URL in your sitemap that returns a 404 — page not found — error when Google tries to crawl it.

This happens when you delete an article but leave its URL in your sitemap, or when you change a URL after publishing without setting up a redirect. Google is trying to index a page that no longer exists.

How to confirm: Go to Coverage → Error → click "Submitted URL not found (404)" → see the list of affected URLs. Visit each URL in your browser — confirm it returns 404.

The Fix:

Option 1 — Restore the page if you deleted it accidentally or want it back.

Option 2 — Set up a 301 redirect from the deleted URL to the most relevant existing page. For Blogger: Settings → Errors and Redirects → Custom Redirects → add old URL → new URL.

Option 3 — Remove from sitemap if the page is intentionally deleted and has no natural redirect destination. Update your sitemap to exclude the deleted URLs.

After fixing — go to URL Inspection → paste the URL → click Request Indexing. This tells Google to recheck the URL and update its records.

Related: 404 errors from URL changes are one of the causes of traffic drops. See → Why Is My Website Traffic Dropping? — specifically the URL structure change section.

Error 2 — "Crawled — Currently Not Indexed"

What it means: Google crawled your page — read its content — and decided not to include it in search results. This is different from a technical error. Google made an editorial decision that your page doesn't meet its quality threshold for inclusion.

This is one of the most common and most misunderstood Search Console statuses. Many bloggers assume it means a technical problem — it usually means a content quality problem.

Common reasons Google doesn't index crawled pages:

  • Content is too thin — under 300 words with little value
  • Content is near-duplicate of another page on your blog or elsewhere online
  • Page has very low E-E-A-T signals — no author, no credentials, no expertise demonstrated
  • Content appears auto-generated or templated
  • Page has no internal links pointing to it — Google considers it unimportant

The Fix:

For thin content — expand the article substantially. Add real examples, a comparison table, an FAQ section, and step-by-step instructions. Target 800+ words of genuine value.
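If you want a quick, automated sense of which posts fall below that threshold, you can strip the HTML and count words. A minimal sketch — the HTML string here is a hypothetical stand-in for a real post body:

```python
import re

# Hypothetical post body -- in practice, load or export the article HTML.
html = "<h1>Title</h1><p>" + "word " * 250 + "</p>"

# Strip tags crudely and count the remaining words.
text = re.sub(r"<[^>]+>", " ", html)
words = len(text.split())

print(words)         # 251
print(words >= 800)  # False -- below the 800-word target
```

Run this against each article flagged as "Crawled — currently not indexed" and expand the ones that come up short.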

For duplicate content, rewrite completely from a different angle. Add original research, personal experience, or a unique format that doesn't exist elsewhere.

For orphaned pages — add internal links from your other articles pointing to this page. Google needs to see that your own blog considers this page important enough to reference.

After improving, use URL Inspection → Request Indexing to ask Google to recrawl and reconsider.

Related: "Crawled — Currently Not Indexed" on multiple pages is often connected to the broader content quality issues that cause deindexing. See → Why Did Google Deindex My Blog?

Error 3 — "Discovered — Currently Not Indexed"

What it means: Google found your URL — through a sitemap or a link — but hasn't crawled it yet. It's in the queue but hasn't been processed.

This is not an error in the traditional sense — it's a queue status. But if important articles remain in this status for weeks, it indicates a crawl budget or crawl frequency problem.

The Fix:

For new articles — wait 1–2 weeks. New content on established blogs gets crawled within days. New blogs with low authority may take weeks.

For persistent "Discovered" status after 3+ weeks — use URL Inspection → Request Indexing to move your URL to the front of the crawl queue.

For systematic "Discovered" across many articles, you have a crawl budget issue. Too many low-value pages are consuming Google's crawl budget before it reaches your important content. Block non-content pages (tag pages, archive pages) from crawling via robots.txt. See → Why Did Google Deindex My Blog? for the crawl budget section.

Error 4 — "Duplicate, Google Chose Different Canonical Than User"

What it means: You specified a canonical URL for your page, but Google decided a different URL is the canonical (primary) version. Google is indexing a different version of your content than the one you intended.

This typically happens when:

  • You have HTTP and HTTPS versions of the same page
  • Your blog is accessible at both www and non-www versions
  • Blogger generates multiple URLs for the same content (label pages, archive pages)

The Fix:

Ensure consistent URL structure — all internal links should point to your preferred URL format. If you want https://www.panstag.com as canonical — every internal link should use exactly that format — not http://panstag.com or https://panstag.com without www.
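You can spot-check this automatically by scanning your post HTML for internal links that deviate from the preferred format. A sketch using the example domain above — the HTML snippet is hypothetical:

```python
import re

preferred = "https://www.panstag.com"  # your canonical URL format

# Hypothetical page HTML: one inconsistent link, one correct link.
html = ('<a href="http://panstag.com/a">old format</a> '
        '<a href="https://www.panstag.com/b">correct format</a>')

links = re.findall(r'href="([^"]+)"', html)
# Flag internal links that don't start with the preferred format.
inconsistent = [u for u in links if "panstag.com" in u and not u.startswith(preferred)]

print(inconsistent)  # ['http://panstag.com/a']
```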

Add explicit canonical tags — in your Blogger template, ensure canonical tags match your preferred URL exactly. Check your theme's <head> section for:

<link rel="canonical" href="YOUR-PREFERRED-URL"/>

Resolve HTTP vs HTTPS — ensure your blog enforces HTTPS and all HTTP requests redirect to HTTPS automatically. For Blogger with custom domains — check Settings → HTTPS → enable HTTPS redirect.

Error 5 — "Page With Redirect"

What it means: A URL in your sitemap redirects to another URL. Google is pointing out that your sitemap contains URLs that don't exist at their listed address — they redirect somewhere else.

The Fix:

Update your sitemap to contain the final destination URLs — not the redirecting ones. If you changed article URLs and set up redirects, update your sitemap to reflect the new canonical URLs directly. This saves Google a redirect hop and ensures the correct URLs are indexed.
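The logic is simple enough to script: follow each redirect chain to its end and list only the final URLs. A sketch with hypothetical paths — a real check would issue HTTP requests and follow 301 responses rather than use a hard-coded map:

```python
# Hypothetical redirect map: old path -> new path.
redirects = {
    "/old-a.html": "/new-a.html",
    "/new-a.html": "/final-a.html",  # a two-hop chain
}

def final_url(path):
    """Follow the redirect chain to its final destination (loop-safe)."""
    seen = set()
    while path in redirects and path not in seen:
        seen.add(path)
        path = redirects[path]
    return path

sitemap = ["/old-a.html", "/post-b.html"]
print([final_url(p) for p in sitemap])  # ['/final-a.html', '/post-b.html']
```

Listing `/final-a.html` directly saves Google two redirect hops on every crawl.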

Error 6 — "Blocked by robots.txt"

What it means: Your robots.txt file is telling Google not to crawl specific pages — or in serious cases — your entire blog.

How to confirm: Visit yourblog.com/robots.txt in your browser. Look for:

Disallow: /

This blocks everything. Or look for Disallow rules blocking your important content directories.

The Fix:

For Blogger — your robots.txt is partially auto-generated. Check Settings → Crawlers and indexing → Custom robots.txt. Remove any Disallow rules that are blocking important content.

The correct robots.txt for most Blogger blogs allows Google to crawl everything, with optional blocking of tag and archive pages that add no unique value:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblog.com/sitemap.xml
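You can verify these rules behave as intended with Python's standard-library robots.txt parser — paste in your own rules and test the URLs that matter (the URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The recommended Blogger rules from above.
rules = [
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Article pages should be crawlable; /search label pages should not.
print(rp.can_fetch("Googlebot", "https://yourblog.com/2024/01/my-article.html"))  # True
print(rp.can_fetch("Googlebot", "https://yourblog.com/search/label/seo"))         # False
```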

Core Web Vitals Errors

Core Web Vitals are Google's specific page experience metrics. Poor scores here directly impact rankings.

Error 7 — LCP (Largest Contentful Paint) — Poor

What it means: The largest visible element on your page — usually your featured image or main heading — takes too long to load. Google's threshold for "Good" LCP is under 2.5 seconds. "Poor" is above 4 seconds.

Impact: LCP is a direct ranking signal. Poor LCP means Google penalises your ranking compared to faster competitors.

The Fix:

For image LCP — your featured image is almost certainly your LCP element. Compress it aggressively — target under 50KB. Convert to WebP format. Add fetchpriority="high" and loading="eager" to your featured image specifically.
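Put together, the featured image tag looks something like this — the filename and dimensions are placeholders for your own image:

```html
<img src="featured-image.webp"
     width="1200" height="630"
     fetchpriority="high"
     loading="eager"
     alt="Article featured image"/>
```

Apply fetchpriority="high" and loading="eager" only to the above-the-fold featured image — keep loading="lazy" on images further down the page.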

For text LCP — your main heading is loading slowly because fonts are blocking the render. Fix Google Fonts loading with font-display: swap. See → Why Is My Blogger Page Speed So Low? for the complete font and image speed fix guide.

Error 8 — CLS (Cumulative Layout Shift) — Poor

What it means: Elements on your page move around after the initial load — causing the page to shift visibly. Google's threshold for "Good" CLS is under 0.1.

Layout shifts happen when:

  • Images load without explicit width and height dimensions
  • AdSense ad units load and push content down
  • Fonts swap from system font to custom font, causing text reflow
  • Late-loading widgets push existing content downward

Impact: High CLS frustrates users and signals poor page experience to Google — resulting in ranking penalties.

The Fix:

For images — add explicit width and height to every image tag. This reserves space before the image loads — preventing content shift.

For AdSense — reserve space for ad containers in CSS before ads load:

.adsbygoogle {
  min-height: 100px;
  display: block;
}

For fonts — add font-display: swap and use size-adjust, ascent-override, and descent-override properties to match your fallback font's metrics to your custom font — minimising the visual shift when fonts swap.
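A sketch of the fallback-metrics approach — the font names and override percentages are placeholders you'd tune to your specific font pairing:

```css
/* Fallback face whose metrics approximate the custom font. */
@font-face {
  font-family: "BodyFallback";
  src: local("Arial");
  size-adjust: 105%;      /* placeholder values -- measure against your font */
  ascent-override: 90%;
  descent-override: 22%;
}

body {
  font-family: "CustomFont", "BodyFallback", sans-serif;
}
```

With matched metrics, the swap from fallback to custom font causes little or no visible reflow — which is exactly what CLS measures.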


Error 9 — INP (Interaction to Next Paint) — Poor

What it means: Your blog responds slowly to user interactions — clicks, taps, keyboard input. Google's threshold for "Good" INP is under 200 milliseconds. Poor is above 500 milliseconds.

INP replaced FID (First Input Delay) as a Core Web Vitals metric in 2024. It measures responsiveness throughout the entire page session — not just the first interaction.

Common causes of poor INP on Blogger:

  • Heavy JavaScript executing on the main thread blocks interactions
  • Large event listeners are attached to many elements simultaneously
  • AdSense scripts are blocking the interaction response
  • Heavy animations running continuously

The Fix:

Use passive event listeners for scroll and touch events:

window.addEventListener('scroll', handler, { passive: true });

Defer heavy JavaScript — ensure non-critical scripts load with defer attribute so they don't compete with interaction responses.

Limit continuous animations — CSS animations running continuously consume main thread resources. Limit to essential animations only.
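The defer attribute looks like this in your template — the script path is a placeholder:

```html
<!-- Non-critical script: downloads in parallel, executes only after HTML parsing. -->
<script src="/scripts/widgets.js" defer></script>
```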

Mobile Usability Errors

Error 10 — "Text Too Small to Read"

What it means: Your blog's font size is too small for comfortable reading on mobile — below Google's recommended 16px minimum for body text.

The Fix: Set your body font size to a minimum of 16px in your theme CSS:

body {
  font-size: 16px;
  line-height: 1.6;
}

Error 11 — "Clickable Elements Too Close Together"

What it means: Links, buttons, or navigation elements are positioned so close together that mobile users can't reliably tap the one they intend to tap.

The Fix: Ensure all tappable elements have a minimum 48×48px tap target size and a minimum 8px spacing between adjacent tappable elements:

a, button {
  min-height: 48px;
  min-width: 48px;
  padding: 8px;
}

Error 12 — "Content Wider Than Screen"

What it means: Some element on your page extends beyond the screen width — causing horizontal scrolling on mobile. This is the mobile slide-left problem.

Common causes:

  • Images without max-width constraints
  • Code blocks or tables extending beyond the container width
  • Fixed-width elements wider than the mobile viewport
  • Third-party widgets with fixed pixel widths

The Fix: Add to your theme CSS:

html, body {
  overflow-x: hidden !important;
  max-width: 100%;
}

img, table, pre, code, iframe {
  max-width: 100% !important;
}

Related: Content wider than screen is a CLS and mobile usability problem that also causes the layout shifts described in the Core Web Vitals section above.

Sitemap Errors

Error 13 — "Sitemap Could Not Be Read"

What it means: Google tried to fetch your sitemap and either couldn't access it or found it malformed.

The Fix:

Visit your sitemap URL directly in your browser — yourblog.com/sitemap.xml. If it doesn't load, your sitemap has a problem.

For Blogger, the sitemap is auto-generated. If it's not accessible — check that your blog is set to public in Settings → Privacy → Visible to search engines.

Resubmit your sitemap in Search Console → Sitemaps → delete the existing submission → resubmit.
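A sitemap Google "could not read" usually fails basic XML parsing. You can check this yourself by parsing the file and listing its URLs — a sketch with an inline two-entry sitemap (in practice, fetch your live sitemap.xml instead):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap body -- substitute the contents of your live sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourblog.com/2024/01/post-one.html</loc></url>
  <url><loc>https://yourblog.com/2024/02/post-two.html</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)  # raises ParseError if the XML is malformed
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

print(len(urls))  # 2
```

If parsing raises an error, the sitemap is malformed and that's what Google is choking on.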

Error 14 — "Sitemap Contains URLs Not Allowed"

What it means: Your sitemap includes URLs that are blocked by robots.txt — meaning Google can't crawl them even though your sitemap says they should be indexed.

The Fix: Align your sitemap and robots.txt. Remove URLs from your sitemap that are blocked in robots.txt — or remove the robots.txt blocking for URLs that should be indexed.
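This alignment check is easy to automate: run every sitemap URL through your robots.txt rules and flag any that are blocked. A sketch with hypothetical rules and URLs:

```python
from urllib.robotparser import RobotFileParser

rules = ["User-agent: *", "Disallow: /search", "Allow: /"]
sitemap_urls = [
    "https://yourblog.com/2024/01/post-one.html",
    "https://yourblog.com/search/label/seo",  # blocked -- shouldn't be in the sitemap
]

rp = RobotFileParser()
rp.parse(rules)

# Any URL Googlebot can't fetch doesn't belong in the sitemap.
blocked = [u for u in sitemap_urls if not rp.can_fetch("Googlebot", u)]
print(blocked)  # ['https://yourblog.com/search/label/seo']
```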

Manual Action Errors

Error 15 — Any Manual Action Listed

What it means: A Google employee manually reviewed your blog and found a specific policy violation. This is separate from algorithmic issues.

This is the most serious Search Console error — it requires direct human intervention from both Google's side (applying it) and yours (fixing it and requesting reconsideration).

The Fix: See the complete manual penalty guide in → Why Did Google Deindex My Blog? — specifically the Manual Penalty section, which covers every penalty type and the exact reconsideration request process.

Security Issues

Error 16 — "Hacked Content Detected"

What it means: Google found content on your blog that was injected by attackers — spam articles, hidden links, or malware scripts.

The Fix:

Clean all hacked content immediately. Change all passwords. Enable MFA on your Blogger and Google accounts — see our complete security guide → Cloud Security Tips for Beginners.

After cleaning, go to Search Console → Security Issues → Request Review. Google re-crawls your blog and removes the security warning once it confirms the hack is resolved.

The Search Console Error Priority Order

Not all errors need immediate attention. Fix them in this order:

  • Immediate: Manual Actions (severe ranking penalty). Fix before anything else.
  • Immediate: Security Issues (full deindexing risk). Fix same day.
  • High: Blocked by robots.txt (prevents indexing). Fix within 24 hours.
  • High: 404 errors in sitemap (crawl waste). Fix within 1 week.
  • Medium: "Crawled — currently not indexed" (content quality signal). Fix within 2 weeks.
  • Medium: Core Web Vitals Poor (ranking penalty). Fix within 1 month.
  • Medium: Mobile Usability errors (mobile ranking impact). Fix within 1 month.
  • Lower: canonical mismatches (minor indexing inefficiency). Fix when convenient.
  • Lower: sitemap errors (crawling inefficiency). Fix within 1 month.
  • Lower: "Discovered — currently not indexed" (queue position). Monitor and fix if persistent.

The Search Console Weekly Routine

Make this a 10-minute weekly habit:

Monday morning — 10 minutes:

  1. Open Search Console → Performance → check clicks and impressions vs last week
  2. Open Coverage → check if error count increased since last week
  3. Open Core Web Vitals → check if any pages moved to Poor status
  4. Open Manual Actions → confirm still clear
  5. Open Security Issues → confirm still clear

If anything changed negatively, diagnose and fix before the week progresses. Catching errors early prevents small problems from becoming major traffic losses.

Related: Search Console monitoring is the foundation of preventing the traffic drops and ranking losses covered in → Why Is My Website Traffic Dropping?

The Search Console Error Fix Checklist

Coverage

  • Zero 404 errors from submitted URLs — all pages redirect or exist
  • "Crawled — not indexed" pages improved or noindexed
  • "Discovered — not indexed" pages requested for indexing
  • Canonical tags match preferred URLs across all pages
  • Robots.txt is not blocking important content

Core Web Vitals

  • LCP score Good (under 2.5s) on mobile — images optimised
  • CLS score Good (under 0.1) — images have dimensions, ad space reserved
  • INP score Good (under 200ms) — passive event listeners, deferred scripts

Mobile Usability

  • No "text too small to read" errors — minimum 16px body font
  • No "clickable elements too close" errors — minimum 48px tap targets
  • No "content wider than screen" errors — max-width applied to all elements

Sitemaps

  • Sitemap accessible at yourblog.com/sitemap.xml
  • No "Sitemap could not be read" errors
  • No URLs in sitemap blocked by robots.txt

Actions and Security

  • Manual Actions section — empty
  • Security Issues section — empty
  • If either has entries — fixed and reconsideration requested

Frequently Asked Questions

Q1. How often should I check Google Search Console? 

Check Performance weekly — 5 minutes every Monday. Check Coverage, Core Web Vitals, and Mobile Usability monthly. Check Manual Actions and Security Issues immediately if you notice a sudden traffic drop. The weekly routine catches problems before they compound into major traffic losses.

Q2. Does fixing Search Console errors directly improve rankings? 

Yes — for technical errors. Fixing 404 errors improves crawl efficiency. Fixing Core Web Vitals improves page experience ranking signals. Fixing mobile usability errors improves mobile ranking. "Crawled — not indexed" fixes improve your overall indexing rate. Each fix removes a specific barrier between your content and Google's rankings.

Q3. I have hundreds of "Crawled — Currently Not Indexed" pages — should I fix all of them? 

Prioritise by traffic potential. Focus on articles targeting keywords with real search volume. For pages with no keyword targeting and no traffic potential, add noindex rather than trying to improve them. This actually helps your overall blog health by focusing Google's attention on your best content.
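For pages you decide not to improve, the noindex signal is a single meta tag in the page's <head>:

```html
<meta name="robots" content="noindex"/>
```

On Blogger, the per-post Custom Robots Tags option (enabled under Settings → Crawlers and indexing) lets you set noindex without editing the template.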

Q4. Why are my Core Web Vitals good in PageSpeed Insights but poor in Search Console? 

PageSpeed Insights shows lab data — a simulated test under controlled conditions. Search Console Core Web Vitals shows field data — real measurements from actual visitors on real devices and connections. Field data is slower than lab data because real-world connections, devices, and browser caches vary significantly. Focus on fixing field data (Search Console) — it's what Google uses for ranking decisions.

Q5. My sitemap shows errors but my blog seems to be working fine — should I fix it?

Yes — always fix sitemap errors. A sitemap with errors reduces Google's crawling efficiency and can cause new articles to take significantly longer to be discovered and indexed. Sitemap errors don't always cause visible symptoms immediately — but they compound over time into indexing delays and coverage gaps.

Q6. Search Console shows my blog has good Core Web Vitals, but my rankings are still low — why? 

Core Web Vitals is one ranking factor among hundreds. Good page speed doesn't overcome weak content, poor keyword targeting, or lack of backlinks. If your technical signals are strong but your rankings are still low, the problem is in your content strategy or authority. See → Why Is My Blog Not Ranking on Google? for the complete ranking diagnosis.

Search Console Is Google Talking to You — Listen

Most bloggers ignore Search Console errors for months.

The bloggers who check it weekly, fix errors promptly, and monitor trends consistently are the ones whose blogs grow predictably — because they catch and fix problems before those problems compound into major traffic losses.

Your action plan — do it today:

  1. Open Search Console → Coverage — write down your current error count
  2. Open Manual Actions — confirm it's empty — fix immediately if not
  3. Open Security Issues — confirm it's empty — fix immediately if not
  4. Fix your highest-priority Coverage error — start with 404s
  5. Open Core Web Vitals — identify any Poor scores
  6. Fix LCP first — image compression is the highest-impact fix
  7. Set a weekly Monday reminder — 10 minutes every week
  8. Come back in 4 weeks — compare error counts with today's baseline

Search Console errors are not optional reading.

They're Google's direct instructions for improving your blog.

📌 Quick Summary: Most important Search Console errors — 404 submitted URLs (fix with redirects or sitemap update), Crawled Not Indexed (fix with content quality improvement), Discovered Not Indexed (request indexing), Core Web Vitals LCP (compress images, fix fonts), CLS (add image dimensions, reserve ad space), INP (passive listeners, defer scripts), Mobile Usability (font size, tap targets, max-width), Manual Actions (fix immediately and request reconsideration), Security Issues (clean hack, enable MFA). Fix priority: Manual Actions and Security Issues first — everything else second. Check weekly for 10 minutes every Monday.


Hardeep Singh

Hardeep Singh is a tech and money-blogging enthusiast, sharing guides on earning apps, affiliate programs, online business tips, AI tools, SEO, and blogging tutorials.
