Robots.txt for Blog SEO: List of Generator and Checker Tools
Advantages of using robots.txt
- For example, a blog has many parts, like archive, label, and search pages, that we usually don't want to appear in search results. From a crawler's perspective, all these pages contain duplicate content, which in turn causes duplicate content issues and ultimately lower rankings.
- You will get a notification about this in Webmaster Tools. I have already shared ways to fix that previously:
How to Remove 404 Broken Links Errors
- So by simply blocking all these kinds of pages in your robots.txt file, you avoid all of these issues (see the sketch after this list).
- Similarly, on WordPress you can block the wp-admin area from robots, along with other directories.
- Having a robots.txt file gives you the advantage of better blog management in terms of SEO, and Google's crawler loves that too.
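As a sketch, rules like the ones below cover the cases above; use whichever lines match your platform (/search is the prefix Blogger uses for label and search pages, and /wp-admin/ is the standard WordPress admin area):

User-agent: *
Disallow: /search
Disallow: /wp-admin/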
Disadvantages of robots.txt
- For example, you could accidentally place a rule that actually tells crawlers not to index your blog at all (an example follows this list).
- Similarly, you might end up blocking important pages yourself.
- When I was a newbie, I also blocked my sitemap from Google's crawler, which stopped my newer posts from being indexed. But I spotted it in Webmaster Tools and immediately fixed the problem.
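For instance, this is the kind of rule to watch out for (do not use it; the single slash tells every robot to stay away from your entire blog):

User-agent: *
Disallow: /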
According to Google:
A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access pages of a site, they check to see if a robots.txt file exists that prevents them from accessing certain pages.
How to see your robots.txt file
- Enter the following URL in the address bar: http://yoursiteurl/robots.txt
- Replace yoursiteurl with your own blog's URL
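For instance, you can open http://www.google.com/robots.txt in a browser to see Google's own file and get a feel for what a large site blocks.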
The simplest robots.txt file
- User-agent: the robot the following rule applies to
- Disallow: the URL you want to block
User-agent: Googlebot
Disallow: /folder2/
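Building on that, here is a slightly fuller sketch (the paths are placeholders): the * wildcard targets every robot, each Disallow line blocks one path, and an empty Disallow value, as in the Googlebot rule, blocks nothing for that robot:

User-agent: *
Disallow: /folder1/
Disallow: /private-file.html

User-agent: Googlebot
Disallow: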
Tools to create a robots.txt file:
1. SEObook tool:
- You just need to add the URLs you want to block and hit enter. The tool will automatically add them to the file. Once you are done, simply copy the code and paste it into your robots.txt file.
- Finally, use the checker tools below to verify that it is correct.
Add robots.txt to a Blogger blog:
- In Blogger, you can easily add robots.txt code by going to Settings > Search preferences and enabling the Custom robots.txt option.
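A commonly used custom robots.txt for a Blogger blog looks like the sketch below (yourblog.blogspot.com is a placeholder; Mediapartners-Google is the AdSense crawler and is left unblocked so ads stay relevant, while /search keeps label and search pages out of the index):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED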
Tools to check/test a robots.txt file:
1. Using Google Webmaster Tools
- Google Webmaster Tools is best for testing purposes; go to the Crawl section of Webmaster Tools and then Blocked URLs.
- Just copy the code from your blog's robots.txt file and paste it into the box provided.
- Now, in the lower box, enter the URL you want to check and click Test.
- You will see a message showing whether the URL is blocked by robots.txt or not. It also shows the line that allows or blocks it.
Remember, adding code here in Webmaster Tools is just for testing purposes; it doesn't add a robots.txt file to your blog. For that, you have to add the file to your blog itself.
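If you prefer testing outside Webmaster Tools, Python's built-in urllib.robotparser module can run the same kind of check locally (a minimal sketch; the blog address and test URL are placeholders):

from urllib import robotparser

# Load the live robots.txt file from the blog's root
rp = robotparser.RobotFileParser()
rp.set_url("http://yourblog.blogspot.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given URL
test_url = "http://yourblog.blogspot.com/search/label/seo"
print(rp.can_fetch("Googlebot", test_url))  # False if /search is disallowed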
2. Frobee robots.txt check
Alternatives:
- www.robotstxt.org/checker.html
- http://phpweby.com/services/robots