A file that tells web bots which pages and directories to index and which to skip.
A robots.txt file assists search engines in crawling a site.
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
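As a sketch of what such a file looks like, here is a minimal robots.txt using the standard `User-agent`, `Disallow`, `Allow`, and `Sitemap` directives (the paths and sitemap URL are made-up examples, not from any real site):

```text
# Rules for all crawlers
User-agent: *
Disallow: /admin/

# Rules specific to Google's crawler
User-agent: Googlebot
Allow: /blog/
Disallow: /drafts/

# Hypothetical sitemap location
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the site (e.g. `https://www.example.com/robots.txt`), and well-behaved crawlers fetch it before scanning any other page.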
|
robots.txt is an on-page SEO technique. It addresses web robots, also known as web wanderers, crawlers, or spiders: programs that traverse websites automatically and help search engines like Google index a site and its content.
It can allow or disallow the Google bot to crawl specific pages of the website.
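To see the allow/disallow rules in action from the crawler's side, here is a short sketch using Python's standard-library `urllib.robotparser`. The rules and URLs are hypothetical examples:

```python
from urllib import robotparser

# Hypothetical robots.txt rules: Googlebot may crawl /blog/ but not /private/.
rules = """\
User-agent: Googlebot
Disallow: /private/
Allow: /blog/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A polite crawler checks before fetching each URL.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

Note that robots.txt is purely advisory: it relies on crawlers voluntarily performing this check, and it does not prevent a misbehaving bot from fetching disallowed pages.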
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.