Thread: Robot text.
View Single Post
Old 02-21-2013, 03:08 AM   #9
socialmediadfw
Registered User
 
Join Date: Feb 2013
Location: Dallas
Posts: 27
The robots.txt file is a plain text file that tells search engine spiders which parts of a website they may crawl. If you want to allow spiders to crawl the whole site, use the default setting:

User-agent: *
Disallow:

In the setting above, the * means the rules apply to all search engine spiders, and the empty Disallow line lets them crawl the entire site.
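You can check this behavior with Python's standard library robots.txt parser. This is just a quick sketch; the user agent name "Googlebot" and the example.com URL are placeholders, not anything from the post above.

```python
# Parse the allow-all robots.txt from the example above and
# ask whether a crawler may fetch an arbitrary page.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(rules)

# An empty Disallow means every path is allowed for every crawler.
print(parser.can_fetch("Googlebot", "https://example.com/any/page"))  # True
```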

Here is another example, which stops search engine spiders from crawling the site entirely:

User-agent: *
Disallow: /

In the setting above, Disallow: / blocks all search engines from crawling any page on the site. If you only want to block certain directories, list each path prefix after Disallow: on its own line, as in the example below:


User-agent: *
Disallow: /wp-admin
Disallow: /cgi-bin
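The same stdlib parser can confirm that only the listed directories are blocked while the rest of the site stays crawlable. Again a rough sketch; "MyBot" and the URLs are made-up placeholders.

```python
# Parse the partial-disallow robots.txt from the example above.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /wp-admin",
    "Disallow: /cgi-bin",
]

parser = RobotFileParser()
parser.parse(rules)

# Anything under a disallowed path prefix is blocked...
print(parser.can_fetch("MyBot", "https://example.com/wp-admin/options.php"))  # False
# ...while other paths remain allowed.
print(parser.can_fetch("MyBot", "https://example.com/blog/post"))  # True
```

Note that each Disallow rule matches by prefix, so /wp-admin also blocks /wp-admin/options.php and any other URL starting with that path.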