It's frustrating when your Blogger pages aren't showing up in Google Search Console! There are several reasons this can happen, and thankfully, many common issues have straightforward solutions. Here's a breakdown of what to check and how to fix it:
1. Verify Your Blogger Blog in Google Search Console:
- Ensure it's Added: First, make sure your Blogger blog is properly added and verified as a property in Google Search Console. If you have a custom domain (e.g., yourblog.com), make sure you've added and verified all versions (http, https, www, non-www). Google treats them as separate properties. (The sketch after this list shows where each variant redirects, so you know which properties matter.)
- Property Type: It's best to verify using the "Domain property" option if you have a custom domain, as it covers all subdomains and protocols automatically. If you're using a blogspot.com address, verify the specific URL prefix.
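If you're not sure which variants of your domain actually resolve, a quick check like the minimal sketch below can help. It uses only the Python standard library; yourblog.com is the placeholder domain from above, so substitute your own:

```python
# Minimal sketch: see where each protocol/host variant of a domain ends up,
# since Search Console treats them as separate URL-prefix properties.
# yourblog.com is a placeholder; substitute your own domain.
from urllib.request import Request, urlopen

VARIANTS = [
    "http://yourblog.com/",
    "https://yourblog.com/",
    "http://www.yourblog.com/",
    "https://www.yourblog.com/",
]

for variant in VARIANTS:
    req = Request(variant, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req) as resp:  # urlopen follows redirects automatically
        print(f"{variant} -> {resp.geturl()} (HTTP {resp.status})")
```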
2. Check for "Noindex" Directives:
This is one of the most common reasons pages don't get indexed.
- Blogger Settings:
  - Go to your Blogger dashboard.
  - Navigate to Settings and scroll down to the "Crawlers and indexing" section.
  - Ensure "Enable custom robots.txt" is OFF (unless you know exactly what you're doing with a custom robots.txt file, the default Blogger one is usually fine).
  - Under "Custom robots header tags," ensure "Noindex" is NOT checked for "Home page," "Archive and search pages," and "Post and page pages" unless you specifically want those pages excluded from search. For regular blog posts, you want them indexed, so these should be unchecked.
- Inspect Individual Pages:
  - In Google Search Console, use the URL Inspection Tool. Enter the URL of a specific Blogger page that isn't indexed.
  - After the inspection, look at the "Indexing" section. It will tell you why the page isn't indexed. If it says "Excluded by 'noindex' tag," this is your problem. (You can also spot stray noindex directives yourself; see the sketch after this list.)
  - You might also see "Page with redirect" or "Alternate page with proper canonical tag" for mobile versions (e.g., ?m=1). This is normal for Blogger, as the desktop version is usually the canonical one. Googlebot will still try to crawl and index the desktop version.
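Outside Search Console, you can spot a stray noindex yourself. Here's a minimal sketch, assuming a hypothetical post URL, that checks both the X-Robots-Tag response header and any robots meta tags in the HTML:

```python
# Minimal sketch (standard library only): check one page for noindex signals.
# The post URL is a hypothetical placeholder; substitute your own.
import re
from urllib.request import Request, urlopen

url = "https://yourblogname.blogspot.com/2024/01/example-post.html"  # hypothetical

req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urlopen(req) as resp:
    # A noindex directive can arrive as an HTTP response header...
    header = resp.headers.get("X-Robots-Tag", "") or ""
    # ...or as a <meta name="robots"> tag in the HTML itself.
    html = resp.read().decode("utf-8", errors="replace")

metas = re.findall(r"<meta[^>]*>", html, flags=re.IGNORECASE)
robots_metas = [m for m in metas if re.search(r'name=["\']robots["\']', m, re.IGNORECASE)]

print("X-Robots-Tag header:", header or "(none)")
print("robots meta tags:", robots_metas or ["(none)"])
if "noindex" in header.lower() or any("noindex" in m.lower() for m in robots_metas):
    print("-> A noindex directive is present; this page will not be indexed.")
```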
3. Check Your Robots.txt File:
While Blogger usually handles this well, a custom robots.txt can sometimes inadvertently block Googlebot.
- How to Check: Go to yourblogname.blogspot.com/robots.txt (or yourcustomdomain.com/robots.txt).
- What to Look For:
  - If you have a custom robots.txt enabled, ensure there isn't a Disallow: / line that blocks everything.
  - Blogger's default robots.txt typically disallows /search paths (archive pages, label pages), which is generally fine. The important thing is that your individual post URLs are not disallowed. (The sketch after this list tests exactly that.)
- Fix: If you find issues and have custom robots.txt enabled, you can disable it in Blogger settings or carefully edit it.
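To test crawlability mechanically, the standard library's robots.txt parser can answer the same question Googlebot asks. A minimal sketch, with placeholder URLs:

```python
# Minimal sketch: test whether a given post URL is crawlable under your
# blog's live robots.txt. Both URLs are placeholders; substitute your own.
from urllib.robotparser import RobotFileParser

robots_url = "https://yourblogname.blogspot.com/robots.txt"
post_url = "https://yourblogname.blogspot.com/2024/01/example-post.html"  # hypothetical

rp = RobotFileParser(robots_url)
rp.read()  # fetches and parses the live robots.txt

for agent in ("Googlebot", "*"):
    verdict = "allowed" if rp.can_fetch(agent, post_url) else "BLOCKED"
    print(f"{agent}: {verdict} -> {post_url}")
```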
4. Submit/Resubmit Your Sitemap:
A sitemap helps Google discover all your pages.
- Blogger's Default Sitemap: Blogger automatically generates a sitemap. For a blogspot.com blog, it's usually yourblogname.blogspot.com/sitemap.xml or yourblogname.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500. For custom domains, it's yourcustomdomain.com/sitemap.xml.
- Submit in Search Console:
  - Go to Google Search Console.
  - Select your blog property.
  - In the left sidebar, go to Indexing > Sitemaps.
  - Enter your sitemap URL (e.g., sitemap.xml or atom.xml?redirect=false&start-index=1&max-results=500) and click "Submit."
  - Check its status. It should say "Success." (To see what the sitemap actually lists, see the sketch after this list.)
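If you want to confirm what the sitemap contains before (or after) submitting it, a short sketch like this one prints its entries. It assumes the default Blogger sitemap URL from above; on larger blogs, sitemap.xml is typically a sitemap index whose entries are themselves sitemaps of post URLs:

```python
# Minimal sketch: fetch the default Blogger sitemap and list its <loc>
# entries. The sitemap URL is a placeholder; use your own domain.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

sitemap_url = "https://yourblogname.blogspot.com/sitemap.xml"

with urlopen(sitemap_url) as resp:
    root = ET.fromstring(resp.read())

# <loc> elements live in the standard sitemap XML namespace.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
locs = [el.text for el in root.iter(NS + "loc")]

print(f"{len(locs)} <loc> entries found; first few:")
for loc in locs[:5]:
    print(" ", loc)
```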
5. Request Indexing for Specific Pages:
If a specific page isn't indexed after a reasonable time (a few days to a week), you can manually request indexing.
- URL Inspection Tool:
  - In Google Search Console, go to the URL Inspection Tool.
  - Enter the full URL of the problematic Blogger post.
  - Click "Request Indexing."
- Note: Don't abuse this feature. Use it for new or updated important pages, not for every single page on your site.
6. Content Quality and Uniqueness:
Google prioritizes helpful, reliable, and unique content.
- Thin Content: Pages with very little text or just images might be considered "thin" and not indexed.
- Duplicate Content: If you have very similar content on multiple pages (even within your own blog), Google might choose to index only one version or none at all. Use canonical tags if you have intentionally similar pages and want to signal the preferred version. (A quick way to inspect a page's canonical tag is sketched after this list.)
- Low Quality: If your content is poorly written, not helpful, or appears spammy, Google might de-prioritize it for indexing.
- Solution: Focus on creating high-quality, original, and valuable content that truly helps your readers.
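To verify which version a duplicate or mobile page points at, you can read its canonical tag directly. A minimal sketch, using a hypothetical ?m=1 mobile URL as the example:

```python
# Minimal sketch: print a page's rel="canonical" link so you can confirm
# duplicates point at the version you want indexed. URL is a placeholder.
import re
from urllib.request import Request, urlopen

url = "https://yourblogname.blogspot.com/2024/01/example-post.html?m=1"  # hypothetical

req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

links = re.findall(r"<link[^>]*>", html, flags=re.IGNORECASE)
canonical = [l for l in links if re.search(r'rel=["\']canonical["\']', l, re.IGNORECASE)]
print(canonical[0] if canonical else "No canonical link found.")
```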
7. Site Speed and Mobile-Friendliness:
While Blogger generally handles these well, significant issues can hinder crawling and indexing.
- Check in Search Console:
  - Core Web Vitals: See if there are any issues in the "Core Web Vitals" report.
  - Mobile Usability: Check the "Mobile Usability" report for errors.
- Solutions: Optimize images, reduce unnecessary widgets or scripts, and ensure your Blogger theme is responsive. (You can also pull a quick score from the PageSpeed Insights API; see the sketch after this list.)
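For a quick speed reading without leaving the terminal, the public PageSpeed Insights v5 API returns the same Lighthouse data the web tool shows. A minimal sketch; the response field path reflects the v5 JSON shape, so treat it as an assumption to verify against an actual response:

```python
# Minimal sketch: query the PageSpeed Insights v5 API for a mobile
# performance score. Blog URL is a placeholder; light ad-hoc use
# generally works without an API key.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

page = "https://yourblogname.blogspot.com/"
api = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
       + urlencode({"url": page, "strategy": "mobile"}))

with urlopen(api) as resp:
    data = json.load(resp)

# Lighthouse reports category scores in the 0-1 range.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```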
8. Internal Linking:
Google discovers pages by following links.
- Solution: Ensure your blog posts link to other relevant posts within your blog. This helps Googlebot discover your content more effectively. Make sure your navigation (menus, categories, labels) is clear and comprehensive. (The sketch below gives a rough count of a post's internal links.)
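As a rough sanity check on internal linking, you can count how many links on a post stay within your own blog. A minimal sketch with a placeholder post URL:

```python
# Minimal sketch: count distinct links on a post that point back into the
# same blog. The post URL is a hypothetical placeholder.
import re
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

post = "https://yourblogname.blogspot.com/2024/01/example-post.html"  # hypothetical
host = urlparse(post).netloc

req = Request(post, headers={"User-Agent": "Mozilla/5.0"})
with urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Resolve each href against the page URL, then keep same-host links only.
hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, flags=re.IGNORECASE)
internal = {urljoin(post, h) for h in hrefs if urlparse(urljoin(post, h)).netloc == host}
print(f"{len(internal)} distinct internal links found on this post.")
```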
9. New Blog/Posts:
If your blog is very new, or your posts are very recent, it simply takes time for Google to crawl and index them. Patience is key.
10. Manual Actions/Penalties:
In rare cases, if your blog has violated Google's Webmaster Guidelines, it might receive a manual action, which can lead to de-indexing.
- Check in Search Console: Go to Security & Manual Actions > Manual actions. If you see a manual action, follow Google's instructions to resolve it.
Steps to Take in Summary:
- Check Blogger Settings: Ensure "Enable custom robots.txt" is OFF and "Custom robots header tags" are not set to "noindex" for posts/pages you want indexed.
- Use URL Inspection Tool: Input a problematic URL, check its status, and identify the reason for non-indexing.
- Submit/Resubmit Sitemap: Go to Search Console > Sitemaps and ensure your Blogger sitemap is submitted and shows "Success."
- Request Indexing: For important, non-indexed pages, use the "Request Indexing" feature in the URL Inspection Tool.
- Review Content Quality: Ensure your posts are unique, comprehensive, and valuable.
- Patience: Give Google time to crawl and index your content, especially for new blogs or posts.
By systematically working through these steps, you should be able to identify and resolve most issues preventing your Blogger pages from being indexed by Google and reported in Google Search Console.