Hi, I am reading this article. Thanks for sharing this information about forum posting.
|
Quote:
|
Thanks, great post and great information!
|
Hello,
It's a text file that instructs search engine spiders or crawlers on what to do. It tells specific web spiders which specific web pages to index. |
A robots.txt file is a simple text file. A robots.txt file on a website works as a request that specified robots ignore specified files or directories when crawling the site.
|
Robots.txt is a very useful text file uploaded to the root directory of your site. It disallows crawling of the URLs listed in robots.txt so that those pages are not shown to users in search results.
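As a small sketch of how this works in practice: a polite crawler fetches the robots.txt from the site root and checks each URL against it before crawling. Python's standard library ships `urllib.robotparser` for exactly this; the rules and `example.com` URLs below are placeholders, not from any real site.

```python
import urllib.robotparser

# Hypothetical robots.txt content for illustration only
robots_txt = """\
User-agent: *
Disallow: /temp/
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler asks before fetching each URL
print(rp.can_fetch("*", "https://example.com/index.html"))      # allowed -> True
print(rp.can_fetch("*", "https://example.com/temp/cache.html")) # blocked -> False
```

In a real crawler you would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` instead of parsing a hard-coded string.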
Thanks |
The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website that is otherwise publicly viewable. Crawlers are often used by search engines such as Google to categorize and archive websites, or by website owners to check their own pages.
|
Robots.txt is a text file that you can put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but search engines generally obey what they are asked not to do. The location of robots.txt is very important: it must be in the main (root) directory.
|
A robots.txt is a permissions file that can be used to control which web pages of a website a search engine indexes. The file must be located in the root directory of the website for a search engine's indexing program (spider) to reference it.
|
Robots.txt tells search engines which pages you want them to crawl or not.
|
Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines but generally search engines obey what they are asked not to do.
Structure of a Robots.txt File: The structure of a robots.txt is pretty simple (and barely flexible); it is an open-ended list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:
Disallow:

"User-agent" names a search engine's crawler, and "Disallow:" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines; just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/ |
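Building on that syntax, here are three separate example files (not one combined file) covering the most common patterns; the `Googlebot` and `/private/` names are just illustrations:

```
# Example file 1: allow every robot to crawl everything (empty Disallow)
User-agent: *
Disallow:

# Example file 2: block every robot from the whole site
User-agent: *
Disallow: /

# Example file 3: block only one named crawler from one directory
User-agent: Googlebot
Disallow: /private/
```

Note that an empty `Disallow:` means "nothing is disallowed", while `Disallow: /` blocks the entire site.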
Robots.txt tells Google which pages of the website should be crawled.
|
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.