There are many factors that affect the SEO of your WordPress blog. Somewhere along your blogging journey you will come across the Robots.txt file, a simple text file that you can create with a robots.txt generator or write yourself in Notepad. Are you wondering what this file does? Do you know what a Robots.txt file is? Many bloggers want to learn about this simple file.
Today I am going to explain Robots.txt and how you can create it. Do you wonder why Google is indexing your admin folder? Do you ask yourself why Google is indexing your plugin folders and many other folders that shouldn't be indexed? The answers are in this post.
[Image: What Is Robots.txt File And How To Use It?]
What is the use of a Robots.txt file for your website?
Many people are confused about Robots.txt. I must tell you that crawling and indexing are two different things. Most bloggers think the file is used to stop Google from indexing a particular part of your blog, which is not true. Crawling and indexing are two separate activities you should understand.
When you create a Robots.txt file, you are telling Google bots, Bing bots, or any other third-party bots what to crawl and what not to crawl on your blog. This file has no direct relation to indexing and no-indexing. If you don't want Google to index a particular part of your blog, you can add a noindex tag to those pages instead.
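You can see the crawling side of this in action with Python's standard `urllib.robotparser` module, which reads rules the same way a well-behaved bot does. This is just a sketch; the rules and URLs below are made-up examples, not real sites:

```python
from urllib import robotparser

# A hypothetical robots.txt, written as it would appear on a blog.
rules = """
User-agent: *
Disallow: /wp-admin/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A well-behaved crawler will skip the disallowed folder...
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
# ...but every other page stays crawlable.
print(parser.can_fetch("*", "https://example.com/my-post/"))  # True
```

Note that `False` here only means the page won't be *crawled*; whether a page gets *indexed* is still controlled separately, with a noindex tag.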
If your Robots.txt file is misconfigured, Google may be unable to crawl the inner links of your website, which signals that your site is not search engine friendly. There are also many folders you don't want Google bots to crawl, and that is exactly what a Robots.txt file is for.
Sometimes you may get messages in Google Webmaster Tools about errors in your Robots.txt file. This usually means it is configured wrong; perhaps you haven't followed the right format.
How to check your Robots.txt file using GWT
This file lives in the root directory of your domain. You can view it by typing your website's URL followed by /robots.txt.
e.g. www.yourwebsite.com/robots.txt
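Because the file always sits at the domain root, you can build its URL from any page of a site. A small Python sketch (the page URL is a made-up example):

```python
from urllib.parse import urljoin

# No matter which page of the site you start from, the absolute path
# "/robots.txt" always resolves against the domain root.
page = "https://www.example.com/2015/05/some-blog-post.html"
print(urljoin(page, "/robots.txt"))  # https://www.example.com/robots.txt
```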
You can check it using GWT (Google Webmaster Tools). After logging in to your GWT account, go to the Crawl section, where you will find the robots.txt Tester. If there is any error, you will see it in the current status, but you should still verify it with the tester.
In the tester, type robots.txt after your website's URL and click "TEST". If the result is "ALLOWED", there is nothing to worry about: the Robots.txt file for your website is working fine.
The common structure of a Robots.txt file looks like this:
User-agent: *
Disallow: /cgi-bin/
Here you are telling Google bots not to crawl the cgi-bin folder of your website. Similarly, you can add many other folders that you don't want crawled by Google bots.
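Rules can also target one crawler by name instead of all bots at once. The sketch below (the folder and URLs are hypothetical) keeps only Googlebot out of a folder while leaving other bots free, again verified with Python's `urllib.robotparser`:

```python
from urllib import robotparser

# Hypothetical rules: only Googlebot is kept out of /private/;
# the catch-all "*" group allows everything for everyone else.
rules = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/private/page.html"))    # True
```

An empty `Disallow:` line means "nothing is disallowed", which is why the second group allows all other bots everywhere.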
Conclusion:
Have you created a Robots.txt file for your website? Have you received any error messages from Google Webmaster Tools? Did you know what a Robots.txt file was before reading this? Beginners ask many questions about this file. As you all know, SEO is a major concern for bloggers, and your Robots.txt file plays an important role in it.