Old 03-05-2013, 04:18 AM   #5
jesse12
Registered User
 
Join Date: Mar 2012
Posts: 74
The robots file (robots.txt) is a simple text file uploaded to the root directory of a site, and it is the first file a crawler looks at. In it you write rules telling the crawler which files or directories it should not visit and which ones it may crawl.
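
As a minimal sketch (the file name used here is just a hypothetical example), the rules look like this:

    User-agent: *
    Disallow: /private-page.html

"User-agent: *" means the rules apply to all crawlers, and each Disallow line names a path the crawler is asked to skip; anything not disallowed stays crawlable by default.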

The main use of this file is to block sections such as the admin area, which you generally do not want crawled or indexed by search engines.
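
For example, assuming your admin section lives under an /admin/ folder (adjust the path to match your own site), you would block it like this:

    User-agent: *
    Disallow: /admin/

Keep in mind this only asks well-behaved crawlers to stay out; it does not actually protect the admin area, so it should still sit behind a login.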

Hope this helps.