How to Fix Crawled – Currently Not Indexed

Author: Stephan
Introduction


When a URL shows the index status “Crawled – currently not indexed” in the Page indexing report in Google Search Console, it means that while Google’s bots (Googlebot) have visited the page, it is not part of the search results displayed to users when they search. In other words, Google has found the page, but it has not indexed it.

You should note that this error is different from “Discovered – currently not indexed”. The latter indicates an issue with your crawl budget. Read here how to fix “Discovered – currently not indexed”.

Common Reasons for Indexing Issues

There are multiple reasons why Google has crawled but not indexed a page:

  • Poor content or thin content
  • Too much competition
  • Content is not readable by Googlebot (technical issues)
  • (Absence of) Internal Linking

Poor content: Pages deemed low quality or not aligned with user intent may not be indexed. Unedited AI-generated content is unlikely to be indexed. If you have improved the quality, you can request indexing manually or using SEO Copilot. It may take a few days for Google to re-crawl and index the page.

Too much competition: If other websites already provide great content, why would Google show your result? Websites that are older than yours, have a stronger backlink profile, or have a lot of authority in your niche will outrank you (even if your content is better).

You can always check the websites that are ranking for the same search term as you. If the keyword difficulty is very high and/or the pages ranking for the keyword have high page authority and many referring domains, it will be very hard to outrank them. Not all backlinks count equally; you need to match or exceed the strong backlinks (e.g. do-follow links from major news publications) that point to your competitors.

Technical SEO issues: Pages built with JavaScript can have indexing issues because the content stays hidden until the JavaScript is executed. Since Google “reads” pages in two steps, first the raw HTML and later the rendered content (after JavaScript execution), it can take a long time (if ever) until Google renders the page and reads the content.

Make sure that the text of your website is still visible when JavaScript is turned off.
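As a quick sanity check, you can look at the raw HTML (what a crawler sees before any JavaScript runs) and verify that your key text is already in it. Below is a minimal sketch using only the Python standard library; the two sample HTML snippets and the phrase being checked are made up for illustration:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of <script> and <style> tags."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(raw_html: str) -> str:
    """Return the text a crawler can read without executing JavaScript."""
    parser = TextExtractor()
    parser.feed(raw_html)
    return " ".join(parser.chunks)

# Hypothetical raw HTML, as served before JavaScript runs:
client_rendered = '<html><body><div id="app"></div><script>renderApp()</script></body></html>'
server_rendered = '<html><body><div id="app"><h1>How to fix indexing</h1></div></body></html>'

print("How to fix indexing" in visible_text(client_rendered))  # False: the content needs JS
print("How to fix indexing" in visible_text(server_rendered))  # True: the content is in the HTML
```

If the check fails for your pages, consider server-side rendering or pre-rendering so that the main content is present in the initial HTML response.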

Internal linking: The “Crawled – currently not indexed” error rarely happens when many internal links point to the page. Internal links ensure that Googlebot visits the page often and understands its importance. In particular, if your homepage links to the page, it is very unlikely that the page will face the “Crawled – currently not indexed” error.

On the other hand, “Crawled – currently not indexed” happens quite often when website owners publish thousands of AI-generated content pages per day or week.

Improving Site Indexability

To enhance the likelihood of pages being indexed by search engines like Google, focus on optimizing content for SEO, refining site structure and navigation, and applying technical SEO best practices.

Optimizing Content for SEO

Creating high-quality content is fundamental. 

Signs of high quality content are:

  • You answer the question fast and expand on your answer later. Don’t bore the reader; nobody has time for that. Ideally, you provide new insight that other pages do not offer. From Google’s perspective, if a user searches for something, clicks on your page, stays there, and then does NOT visit any other search results, you have answered the question.
  • You aim for 1 keyword/topic = 1 page. Don’t confuse the reader by being all over the place. Clarity is king.
  • Ideally, you have backlinks. High-quality content does not automatically attract backlinks, however. You can always share your content on social media to make it known. If you’re lucky, people might even link to it.
  • If you have tens or hundreds of pages with the same content, you have a duplicate content issue. Google will not index duplicate pages.
  • Don’t use unedited AI content. At the time of writing (spring 2024), AI content is very bland and merely repeats what other people have already written. Spammers publish thousands of pages like this, but the content quality is so poor that they do not rank.

Enhancing Site Structure and Navigation

A cohesive site structure aids both users and search engine crawlers. Implement a logical internal link structure to connect quality pages and address orphan pages. Navigation should be intuitive, guiding visitors to relevant sections and signaling site quality to search engines.

What does this mean in practice?

Your home page should link to all pages that are of great importance to you.
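One way to verify this is to parse your homepage’s HTML and confirm that the important URLs appear among its internal links. Here is a rough sketch with Python’s standard library; the homepage HTML and all URLs below are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects the href target of every <a> tag."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical homepage HTML; in practice you would fetch it over HTTP.
homepage_html = """<html><body>
<a href="/blog/fix-crawled-not-indexed/">Fix indexing</a>
<a href="/pricing/">Pricing</a>
<a href="https://twitter.com/example">Twitter</a>
</body></html>"""

base = "https://www.example.com/"
collector = LinkCollector()
collector.feed(homepage_html)

# Resolve relative links against the base URL and keep only internal ones.
internal = {urljoin(base, h) for h in collector.links
            if urlparse(urljoin(base, h)).netloc == urlparse(base).netloc}

print("https://www.example.com/blog/fix-crawled-not-indexed/" in internal)  # True
```

If a page you care about is missing from this set, add a link to it from the homepage or another prominent page.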

Properly configured canonical tags help prevent issues with duplicate content across URLs.
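For example, a page reachable under several URLs (tracking parameters, uppercase/lowercase variants) should declare a single canonical URL in its head. The sketch below extracts that tag with Python’s standard-library HTML parser; the sample page and URL are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Finds the href of the first <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical page: however it was reached, it points Google at one clean URL.
html_page = """<html><head>
<link rel="canonical" href="https://www.example.com/blog/fix-crawled-not-indexed/">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(html_page)
print(finder.canonical)  # https://www.example.com/blog/fix-crawled-not-indexed/
```

A quick audit like this across your templates catches missing or self-contradicting canonical tags before Google does.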

Leveraging Technical SEO Best Practices

Technical SEO encompasses a spectrum of strategies to boost site indexability. Ensure your robots.txt file permits crawling of important pages and that your sitemap.xml is current, aiding faster discovery of URLs. Employ internal linking wisely to distribute authority and facilitate re-indexing. Monitor Google Search Console for “Crawled – currently not indexed” pages and follow the suggested actions to resolve exclusions.
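You can test robots.txt rules offline with Python’s built-in `urllib.robotparser` before deploying them. A small sketch, assuming a hypothetical robots.txt; note that Google resolves Allow/Disallow conflicts by the most specific (longest) rule, while Python’s parser uses rule order, so listing the more specific Allow rule first keeps both interpretations in agreement:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the blog section is blocked except for one post.
robots_txt = """
User-agent: *
Allow: /blog/important-post/
Disallow: /blog/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://www.example.com/blog/some-post/"))       # False
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/important-post/"))  # True
```

If a page you want indexed comes back `False` here, fix the robots.txt rule before requesting indexing in Search Console.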

Analyzing and Monitoring Site Performance

To effectively resolve issues like “Crawled – currently not indexed,” one must meticulously analyze and monitor the site’s performance. This process involves utilizing powerful tools like Google Search Console and conducting regular SEO audits to uncover any underlying problems affecting indexing and search rankings.

Using Google Search Console

Google Search Console is a fundamental tool for webmasters to track site performance in the SERPs. Through the Page indexing report, one can identify pages that have been crawled but not indexed. It’s critical to inspect these URLs using the URL Inspection tool, checking for misconfigurations in the robots.txt file that could block indexing. This report also offers insights into overall site quality, including EAT (Expertise, Authoritativeness, and Trustworthiness) factors that can influence search rankings.

Performing Regular SEO Audits

Regular SEO audits are instrumental in maintaining a website’s health. An audit should review the site’s adherence to SEO best practices, verify that the robots.txt file is correctly guiding search engine crawlers, and ensure that content is aligned with the criteria for EAT. In addition, audits can help detect issues affecting site quality and assess the site’s architecture and internal linking structure for any elements that could impede proper indexing.

Frequently Asked Questions

When web pages are crawled but not indexed, it signifies that search engines see the content but choose not to include it in their index. This section aims to clarify the reasons behind this and the steps one can take to resolve such issues across different website platforms.

Why might a page be crawled yet not indexed by search engines?

A page may be crawled but not indexed for several reasons, including poor content quality, duplication, or if it doesn’t align with the search engine’s guidelines. Understanding these specific issues is crucial to taking the proper corrective actions.

What steps should I take to address a ‘crawled but not indexed’ issue on my WordPress site?

For WordPress site owners dealing with a ‘crawled but not indexed’ problem, verifying that each page provides unique value and using plugins like Instant Indexing can often prompt search engines to re-evaluate and index the pages.

How can I fix ‘discovered but not indexed’ status for my pages in Google Search Console?

To resolve a ‘discovered but not indexed’ status in Google Search Console, ensuring that the content meets user intent is key. Additionally, constructing a temporary sitemap can help search engines better understand recent changes or redirects.

What are common reasons for ‘crawled – currently not indexed’ errors on Shopify stores?

Shopify store owners might encounter ‘crawled – currently not indexed’ issues due to duplicated content, low domain authority, or misconfigured noindex directives. This might not always be a problem, since you do not need every product variant to appear on Google. Correcting these mistakes typically improves the chances of pages being indexed.

How do I resolve ‘crawled – currently not indexed’ issues for a Wix website?

For a Wix website experiencing ‘crawled – currently not indexed’ challenges, it’s important to regularly monitor through SEO tools and ensure the site adheres to SEO best practices, such as using correct canonical tags and avoiding duplicate content.

What does a ‘validation failed’ message mean for not indexed URLs, and how is it corrected?

A ‘validation failed’ message implies that an attempt to index URLs was unsuccessful. One should recheck the URLs for errors or signals preventing indexing, like poor content or crawl errors, and address these issues before requesting a re-crawl.

Is this you?

💸 You have been spending thousands of dollars on buying backlinks in the last months. Your rankings are only growing slowly.

❌ You have been writing more and more blog posts, but traffic is not really growing.

😱 You are stuck. Something is wrong with your website, but you don’t know what.

Let the SEO Copilot give you the clicks you deserve.