Crawler: Also known as a robot, bot, or spider. These are programs used by search engines to explore the Internet and automatically download the content published on web sites. They capture the text of each page and the links it contains, which is how search engines discover new pages and make them findable by their users.
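The sketch below illustrates the basic cycle described above: fetch a page, extract its links, and queue any newly discovered URLs for a later visit. It is a minimal example using only the Python standard library; the names (crawl, LinkExtractor, seed_url) and the example.com starting URL are placeholders, and a production crawler would additionally respect robots.txt, rate limits, and store the page text for indexing.

```python
# Minimal crawler sketch: download a page, collect its links, and
# breadth-first visit newly discovered URLs. Illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from seed_url, visiting at most max_pages."""
    seen = {seed_url}
    queue = deque([seed_url])
    visited = 0
    while queue and visited < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to download
        visited += 1
        parser = LinkExtractor()
        parser.feed(html)
        print(f"{url}: found {len(parser.links)} links")
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links against the page URL
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")
```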