Robots.txt for Blog SEO: List of Generator and Checker Tools

A robots.txt file is simply a few lines of code that tell crawlers which parts of your site to index and which to skip. As a blogger you should know at least the basics of robots.txt. It sounds technical, but it is almost as simple as publishing a post.

In this tutorial I will share tools that help you easily create a robots.txt file, and tools to check whether the file you created is correct. A single wrong line can block your entire site from being indexed by Google, which ultimately means no traffic.
You don't need to go into the technical meaning of every line, and I am not going to teach all of it here; instead I will explain the basics and share tools that make it easy to handle.

Advantages of using robots.txt

As I said, with a robots.txt file you can tell Google not to index certain URLs or parts of your website, which you can take advantage of in several ways:
  • For example, there are many parts of a blog, such as archive, label, and search pages, that we usually don't want to show in search results. To a crawler, all these pages contain duplicate content, which causes duplicate-content issues and, in turn, lower rankings.
  • You will get notifications about this in Webmaster Tools. I have already shared ways to fix it:
Fix Duplicate Meta Description Error in Webmaster tool
How to Remove 404 Broken Links Errors
  • Simply blocking all these types of pages in your robots.txt file saves you from such issues.
  • Similarly, on WordPress you can block the wp-admin area (and other directories) from robots.
  • A well-maintained robots.txt file makes your blog easier to manage in terms of SEO, and Google's crawler likes that too.
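Putting the advice above together, a sketch of such a robots.txt might look like the fragment below. The paths are assumptions: /search matches Blogger's search and label pages, while /wp-admin/ applies only to WordPress, so keep just the lines that fit your platform and verify them with a testing tool:

```
User-agent: *
Disallow: /search
Disallow: /wp-admin/
```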

Disadvantages of robots.txt

There are no disadvantages as such, but wrong use of a robots.txt file can cause serious problems for your blog.
  • For example, you could accidentally add a rule that tells crawlers not to index your blog at all.
  • Similarly, you might end up blocking important pages.
  • When I was a newbie I blocked my sitemap from Google's crawler, which stopped my newer posts from being indexed. I spotted it in Webmaster Tools and immediately fixed the problem.

You don't need to worry, though, if you use the tools listed below: they let you check whether you have placed the rules correctly.

According to Google:
A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access pages of a site, they check to see if a robots.txt file exists that prevents them from accessing certain pages.

How to see your robots.txt file

  • Enter the following URL in the address bar:

    yoursiteurl/robots.txt

  • Replace yoursiteurl with your own blog URL.
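As a quick sketch, the same address can be built in Python; yourblog.example is a placeholder domain:

```python
# A robots.txt file always lives at the root of the domain.
from urllib.parse import urljoin

blog_url = "https://yourblog.example/"         # placeholder - use your own blog URL
robots_url = urljoin(blog_url, "/robots.txt")  # "/robots.txt" resolves against the root
print(robots_url)  # https://yourblog.example/robots.txt
```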

The simplest robots.txt file

  • User-agent: the robot the following rule applies to
  • Disallow: the URL you want to block
User-Agent: Googlebot
Disallow: /folder2/
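As an illustration (example.com is a placeholder), the two-line file above can be checked with Python's standard-library robots.txt parser:

```python
# Check the two-line rule above with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

rules = [
    "User-Agent: Googlebot",
    "Disallow: /folder2/",
]
parser = RobotFileParser()
parser.parse(rules)

# Googlebot may not fetch anything under /folder2/ ...
print(parser.can_fetch("Googlebot", "https://example.com/folder2/page.html"))  # False
# ... but the rest of the site stays open to it,
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
# ... and bots not named in the file are unaffected.
print(parser.can_fetch("Bingbot", "https://example.com/folder2/page.html"))    # True
```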

Tools to create a robots.txt file:

These tools are simple to use and take the hassle out of understanding robots.txt files.

1. SEObook tool:

  • You just need to add the URLs you want to block and hit enter. The tool automatically adds them to the file. When you are done, copy the code and paste it into your robots.txt file.
  • Finally, use the testing tools below to check its correctness.
Alternate tool

Add robots.txt to a Blogger blog:

  • In Blogger you can easily add custom robots.txt rules by going to Settings > Search preferences.
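For reference, a commonly used custom robots.txt for Blogger looks like the sketch below. The sitemap URL is a placeholder, and the Mediapartners-Google group (AdSense's crawler) is optional, so test every line before saving:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.example/sitemap.xml
```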

Tools to check/test a robots.txt file:

I strongly recommend using these tools after you add the rules to your blog, because otherwise, if something goes wrong, you may not notice the problem.

1. Using Google Webmaster Tools

  • Webmaster Tools is best for testing purposes; go to the Crawl section of Webmaster Tools and then Blocked URLs.
  • Copy the code from your blog's robots.txt file and paste it into the box provided.

  • In the lower box, enter the URL you want to check and click Test.

  • You will see a status message showing whether the URL is blocked by robots.txt or not. It also shows the exact line that allows or blocks it.

Remember, adding code here in Webmaster Tools is only for testing; it does not add a robots.txt file to your blog. For that, you still have to add the file to your blog itself.
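If you prefer to test offline, the same kind of check can be sketched in Python; the robots.txt lines and URLs below are placeholders, so paste in your own:

```python
# Offline check that mirrors the Webmaster Tools "Blocked URLs" test.
from urllib.robotparser import RobotFileParser

# Paste your own robots.txt lines here; this sample blocks /search,
# as many Blogger setups do.
robots_lines = [
    "User-agent: *",
    "Disallow: /search",
]
parser = RobotFileParser()
parser.parse(robots_lines)

# Replace these with pages from your own blog.
urls = [
    "https://example.com/search/label/seo",
    "https://example.com/2017/01/my-post.html",
]
for url in urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)
```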

2. Frobee robots.txt check

This was a simple yet effective tutorial on understanding and implementing a robots.txt file on your blog. All probloggers use robots.txt on their blogs and you should too, but make sure you don't end up harming your blog with it. That's why I have shared the testing tools as well.
Do you have any questions? Did you place wrong code in your robots.txt file? Do you want to know what each line means? Feel free to ask in the comments below; I am always here to help. Also share your experience of using robots.txt. Don't forget to like us on Facebook and Google Plus.

