What is robots.txt used for?
I learned about the sitemap.xml file here. Now I'm a little confused about what robots.txt is used for.
Please clear this up, guys. |
Robots.txt is a plain text file placed in a website's root directory. It lists the URLs that search engine crawlers are allowed or disallowed to crawl.
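As a rough sketch, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders, not rules from any real site):

```text
# Rules apply to all crawlers
User-agent: *
Disallow: /admin/
Allow: /

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Here everything is crawlable except URLs under /admin/, and the Sitemap line ties back to the sitemap.xml file mentioned in the question.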
|
The purpose of using a robots.txt file is to tell search engine bots which files should and should not be crawled. If you leave it empty or allow everything, search bots are free to crawl and index the entire content of your website.
|
The robots.txt file is used to tell search engines which pages you do not want crawled or indexed.
|
Robots.txt is the standard name of a text file that is uploaded to a website's root directory; crawlers request it at /robots.txt before crawling the site. The robots.txt file is used to provide instructions about the website to web robots and spiders.
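To see how a robot actually reads those instructions, here is a small sketch using Python's standard-library `urllib.robotparser`; the rules string is an illustrative placeholder, not from any real site:

```python
# Sketch: how a well-behaved crawler checks robots.txt rules
# before fetching a URL, using Python's standard library.
from urllib.robotparser import RobotFileParser

# Placeholder rules, as they might appear at https://www.example.com/robots.txt
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether crawling is permitted.
print(rp.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://www.example.com/public/page.html"))   # True
```

In practice a crawler would call `rp.set_url("https://www.example.com/robots.txt")` and `rp.read()` to fetch the live file instead of parsing a local string.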
|
Google search bro
|
It's very easy to understand: here we give instructions to search engines about which pages they can crawl and index. You can get more details from online blogs.
|
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.