"Spiders" is a name for the robots or crawlers that scan the web and save the information from websites to a database. Hence, "Google spiders" is just another name for "Google indexing bots". Spider – a browser-like program that downloads web pages.
The Robots Exclusion Protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website (see the sketch after these definitions for how a crawler can check it).
Crawler – a program that automatically follows all of the links on each web page.
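As a minimal sketch of how a well-behaved crawler respects robots.txt, here is a short Python example using the standard-library robotparser. The site URL and the "MySpider" user-agent string are just placeholders, not real values from any particular crawler.

```python
from urllib.robotparser import RobotFileParser

# Point the parser at a site's robots.txt (example.com is a placeholder).
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # download and parse the robots.txt file

# Ask whether a given user-agent is allowed to fetch a given URL.
if rp.can_fetch("MySpider", "https://example.com/some-page.html"):
    print("Allowed to crawl this page")
else:
    print("Disallowed by robots.txt")
```

A polite spider runs a check like this before downloading each page, so pages the webmaster has disallowed never get crawled or indexed.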