The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to tell a web robot which areas of the site should not be processed or scanned.
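
As a rough illustration (the exact directives vary from site to site), a minimal robots.txt placed at the site root might look like this:

User-agent: *
Disallow: /private/
Allow: /

A crawler written in Python could check such rules with the standard library's urllib.robotparser module. This is just a sketch; example.com and the paths are placeholders:

import urllib.robotparser

# Rules from the sample robots.txt above (parsed from a string here
# instead of fetched over the network, so the snippet is self-contained)
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether a given user agent may fetch a given URL
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True

Note that the standard is purely advisory: well-behaved crawlers honour it, but nothing technically prevents a robot from ignoring it.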