Fix Robots.txt Blocked Issue


Mastering Google Search Console: A Comprehensive Guide to Fixing Robots.txt Issues

Google Search Console is an essential tool for website owners, providing valuable insight into how Google views their site. However, it can sometimes surface issues that are not immediately clear, such as robots.txt problems. This blog post provides a step-by-step guide to identifying and fixing these issues so your site is fully accessible to Google's crawlers. From understanding the coverage report to modifying your robots.txt file, this guide covers it all.

Key Takeaways

  • Understanding the Google Search Console coverage report
  • Identifying pages blocked by robots.txt
  • Modifying your robots.txt file to fix issues
  • Using external links to give Google access to blocked pages


Understanding the Coverage Report

The first step in fixing robots.txt issues is understanding the Google Search Console coverage report. This report provides a snapshot of your site's status, showing any errors or warnings that may be affecting your site's visibility on Google. Ideally, you should have zero errors and zero warnings. However, some pages may be excluded due to various reasons, including being blocked by robots.txt.

Identifying Pages Blocked by Robots.txt

Within the coverage report, you can identify pages that are blocked by robots.txt. These are pages that Google has discovered but cannot crawl because a rule in your robots.txt file disallows them. By clicking on the 'blocked by robots.txt' entry, you can view the specific pages affected and investigate the issue further.
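Before changing anything, it can help to confirm locally which URLs a given robots.txt rule actually blocks. Python's standard-library `robotparser` applies the same basic Disallow matching that crawlers use. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib import robotparser

# Parse a hypothetical robots.txt with a rule that blocks the /private/ directory.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check whether Googlebot may fetch specific (made-up) URLs.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False: blocked
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True: allowed
```

Running a quick check like this against the pages listed in the coverage report tells you exactly which Disallow line is responsible before you edit the file.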

Modifying Your Robots.txt File

Once you've identified the pages blocked by robots.txt, the next step is to modify your robots.txt file. This file lives in the root directory of your site and can usually be edited through your hosting provider's file manager, FTP, or your CMS. The process involves finding the specific Disallow rule that is causing the block and deleting it. After saving the changes, the issue should be resolved the next time Google recrawls the file.
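As a minimal sketch, suppose the coverage report shows your blog pages are blocked and your robots.txt contains a rule like the hypothetical one below (`#` lines are comments, which robots.txt supports):

```txt
# Before: this Disallow rule blocks crawlers from the entire /blog/ directory
User-agent: *
Disallow: /blog/

# After: with the Disallow line deleted, the pages can be crawled again
User-agent: *
```

After saving, you can use the URL Inspection tool in Google Search Console to request reindexing of the affected pages rather than waiting for the next scheduled crawl.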

Another strategy for overcoming robots.txt issues is to use external links. If an external site links to a blocked page, Google may still index that page even though it cannot crawl its content. This can be a useful workaround if modifying the robots.txt file is not possible or practical, though editing the file remains the more reliable fix.


Fixing robots.txt issues in Google Search Console can seem daunting, but with a clear understanding of the coverage report and the right steps, it can be a straightforward process. Whether you're modifying your robots.txt file or using external links, these strategies can help ensure your site is fully accessible to Google's crawlers, improving your site's visibility and SEO performance.

Is this you?

💸 You have been spending thousands of dollars on buying backlinks over the last few months. Your rankings are only growing slowly.

❌ You have been writing more and more blog posts, but traffic is not really growing.

😱 You are stuck. Something is wrong with your website, but you don't know what.

Let the SEO Copilot give you the clicks you deserve.