What is the use of Robots.txt?
Hello Friends,
Please tell me: what is the use of Robots.txt? |
Robots.txt is a text (not HTML) file you add to your website to let search robots know which pages you would like them not to visit.
|
It is a text file in which you can allow or disallow crawling of folders, files, and images.
|
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
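For example, those two directives on their own form a complete robots.txt that shuts all robots out of a site (the file goes in the site's root, e.g. example.com/robots.txt):

```
# Applies to every robot that honors the protocol
User-agent: *
# "/" matches every path, so nothing on the site may be crawled
Disallow: /
```

Leaving the Disallow value empty (`Disallow:`) means the opposite: all pages may be visited.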
|
Quote:
I just want to know which is the most important search engine in India, like Google, Bing, and Yahoo. Can anybody tell me?
|
Robots.txt
Robots.txt, also called the robots exclusion standard or robots exclusion protocol, is a standard used by websites to communicate with web crawlers and other web robots. Site owners use the robots.txt file to give instructions about their site to web robots.
|
The robots exclusion protocol (REP) is also known simply as robots.txt.
|
Robots.txt is a text file that a site owner uses to allow or disallow crawling of specific folders, files, and images.
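As a sketch of what that looks like, here is a robots.txt mixing folder, file, and image rules (the paths are made-up examples, not from any real site; `Allow` is an extension supported by major crawlers such as Googlebot):

```
User-agent: *
# Block a whole folder
Disallow: /private/
# Block a single file
Disallow: /drafts/notes.html
# Block one image
Disallow: /images/logo.png
# Exception: this one page inside the blocked folder may be crawled
Allow: /private/public.html
```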
|
Search engine robots run 24/7 and crawl website pages. First of all, create a file named robots.txt, then add directives to it telling search engine robots whether they are allowed or disallowed to crawl and index your site. Furthermore, if you don't understand it, please watch the Robots.txt Video Tutorial; it explains this well. |
The robots.txt file is used to tell search engine spiders which files they may crawl and index and which they should skip.
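To see how a well-behaved spider applies these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are invented for illustration, not taken from a real site:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt blocking the /admin/ folder for all robots
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A polite crawler asks before fetching each URL
print(parser.can_fetch("*", "https://example.com/admin/login"))  # blocked -> False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # allowed -> True
```

In a real crawler you would call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` instead of parsing an inline string.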
|
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.