Wednesday 19 September 2012

How To Configure Robots.txt file for maximum results?


Want to manage how search engines crawl your site?
Want search engines to pick up your sitemap automatically?
USE ROBOTS.TXT

Robots.txt is a file that lets you manage how search engines crawl your site.
You can allow or disallow webpages from being crawled using this file.


HOW Does Robots.txt Work?

When a crawler visits your site, it first requests the /robots.txt file from your domain's root and follows the rules it finds there before crawling any other pages.
DIRECTIVES USED IN robots.txt


User-agent: * — the rules that follow apply to all crawlers

Disallow: hides the listed pages or folders from search engines

Allow: explicitly permits pages to be crawled and indexed

Sitemap: submits your sitemap to search engines
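As a sketch of how a crawler interprets these directives, you can test them with Python's standard urllib.robotparser module (the example.com URLs below are placeholders, not your blog):

```python
# Sketch: how a crawler reads these directives, using Python's
# stdlib urllib.robotparser. The rules match the robots.txt
# recommended later in this post.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# /search pages are hidden; ordinary post pages are crawlable
print(rp.can_fetch("*", "http://example.com/search?q=seo"))       # False
print(rp.can_fetch("*", "http://example.com/2012/09/post.html"))  # True
```

This is exactly the check a well-behaved crawler performs for every URL before fetching it.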


HOW to Find and Configure Your robots.txt File?

  • Go to your Blogger Dashboard
  • Open Settings → Search preferences
  • Under Crawlers and indexing, find Custom robots.txt
  • Click Edit and enable it

For maximum results, I recommend this code:


User-agent: *
Disallow: /search
Allow: /

Sitemap: http://igawar.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

Sitemap: http://igawar.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

PASTE THIS CODE into your robots.txt file and replace http://igawar.blogspot.com with your blog/website URL. Keep the remaining code as it is!
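Each Sitemap line covers 500 posts, so a blog with more posts needs more lines following the same pattern. A small sketch that builds them (the `sitemap_lines` helper and example.blogspot.com URL are my own illustration, not part of Blogger):

```python
# Hypothetical helper: builds the Sitemap lines shown above for any
# Blogger URL, 500 posts per feed page.
def sitemap_lines(blog_url, posts_per_page=500, pages=2):
    lines = []
    for page in range(pages):
        # Blogger feed pages are addressed by 1-based start-index
        start = page * posts_per_page + 1
        lines.append(
            f"Sitemap: {blog_url}/atom.xml?redirect=false"
            f"&start-index={start}&max-results={posts_per_page}"
        )
    return lines

for line in sitemap_lines("http://example.blogspot.com"):
    print(line)
# → Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
# → Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
```

Raise `pages` to 3 or 4 if your blog has more than 1000 posts.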


Now check the result in your Google Webmaster Tools,
or type the URL below into your browser's address bar:

http://igawar.blogspot.com/robots.txt

Replace http://igawar.blogspot.com with your blog URL.


If you have any problem with this trick, feel free to ask in the comment box below!


Keep visiting @iGAWAR

