
Site Owners Forums - Webmaster Forums (http://siteownersforums.com/index.php)
-   Search Engine Optimization (http://siteownersforums.com/forumdisplay.php?f=16)
-   -   What is the use of Robots.txt..? (http://siteownersforums.com/showthread.php?t=169950)

AshokDixit89 06-02-2016 01:05 AM

What is the use of Robots.txt..?

vienchaua10 06-02-2016 02:08 AM

"Robots.txt" redirects here. For Wikipedia's robots.txt files, see the MediaWiki Robots.txt file and the English Wikipedia Robots.txt file.
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard; email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website where they have been told to stay out. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
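For illustration, a small robots.txt file might look like the following (the directory name and sitemap address are just placeholders, not taken from any real site):

    User-agent: *
    Disallow: /private/
    Sitemap: http://www.example.com/sitemap.xml

The Disallow line asks cooperating robots to stay out of the /private/ area, while the Sitemap line points them to the site's sitemap, which is the inclusion side mentioned above.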

SEONinja 06-02-2016 04:32 AM

Robots.txt is a text file. It gives instructions to search engine crawlers about the indexing and caching of a web page, a file or directory of a website, or a whole domain.

SaraSanjay 06-02-2016 04:44 AM

When a search engine crawler comes to your site, it will look for a special file called robots.txt. That file tells the search engine spider which web pages of your site should be indexed and which should be ignored.
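As a rough sketch of that behaviour, a polite crawler written in Python can use the standard urllib.robotparser module to check a page before fetching it (the domain and user agent name here are only examples):

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()  # download and parse the robots.txt file

    # can_fetch() returns True if the named user agent may crawl the URL
    print(rp.can_fetch("ExampleBot", "http://www.example.com/some-page.html"))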

yasmineassif 06-02-2016 05:33 AM

Website owners use the /robots.txt file to give instructions about their website to robots. The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
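Putting those two directives together, the smallest file that keeps every cooperating robot away from the whole site is just:

    User-agent: *
    Disallow: /

Leaving the Disallow value empty ("Disallow:") instead tells robots that nothing is off limits.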

pavandarsigunta 06-02-2016 06:01 AM

It guides Google's spiders so that they crawl your site properly, and it tells the spiders not to visit a specific folder or file on your blog.

priyankayadav 06-02-2016 06:09 AM

Using a robots.txt file is simple, but it requires access to your server's root directory. For example, if your site is located at:

http://adomain.com/mysite/index.html

you should be able to create a file located here:

http://adomain.com/robots.txt
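To make the same point in code, here is a small sketch (not from the post itself) of how a crawler derives the robots.txt location from a page URL: only the scheme and host are kept, so the file must sit at the server root rather than inside /mysite/.

    from urllib.parse import urlsplit, urlunsplit

    def robots_url(page_url):
        parts = urlsplit(page_url)
        # keep scheme and host, replace the path with /robots.txt
        return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

    print(robots_url("http://adomain.com/mysite/index.html"))
    # prints: http://adomain.com/robots.txt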

RahulKumar 06-02-2016 10:40 PM

Robots.txt is a text file that is uploaded to a website's root directory; crawlers request it from that standard location, so it does not need to be linked in the site's HTML. The robots.txt file is used to provide instructions about the website to web robots and spiders.

Pillars 06-03-2016 02:05 AM

Robots.txt is a text file that is placed in your website's root directory and contains instructions for search engine robots. The file lists which webpages are allowed and which are disallowed from search engine crawling.

