Crawlers are programs used by search engines to explore the Internet and automatically download the web content available on websites. They capture the text of each page and the links it contains, which is what allows search engine users to discover new pages.
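To make the fetch step concrete, here is a minimal sketch (in Python, using only the standard library) of downloading one page and pulling out its text and outgoing links. The seed URL is just a placeholder; a real crawler would also respect robots.txt, throttle its requests, and keep track of pages it has already visited.

[code]
# Minimal sketch of the crawl step: fetch a page, keep its text,
# and collect the links it contains.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkAndTextExtractor(HTMLParser):
    """Collects href links and visible text from one HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Every <a href="..."> becomes an absolute URL to crawl next.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        # Keep the visible text so it can be indexed later.
        if data.strip():
            self.text_parts.append(data.strip())


def crawl_page(url):
    """Download one page and return (text, outgoing links)."""
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = LinkAndTextExtractor(url)
    parser.feed(html)
    return " ".join(parser.text_parts), parser.links


if __name__ == "__main__":
    text, links = crawl_page("https://example.com")  # placeholder seed URL
    print(text[:200])
    print(links)
[/code]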
Robots perform three basic actions:
1. First, they find the pages of the site and build a list of the words and phrases found on each page.
2. With this list they build a database (an index), so that the exact pages containing a given word or phrase can be looked up.
3. After that, when an end user types a word or phrase, the robot is able to find the matching pages. This step is called the query processor. (A rough sketch of steps 2 and 3 follows below.)
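As a rough illustration of steps 2 and 3, the sketch below builds an inverted index that maps each word to the pages it appears on, then answers a query by intersecting the page sets of the query words. The sample pages and URLs are invented purely for this example; real search engines use far larger and more sophisticated indexes.

[code]
# Sketch of steps 2 and 3: index pages, then process a query.
from collections import defaultdict

# Invented sample pages (stand-ins for crawled content).
pages = {
    "https://example.com/a": "crawlers download web pages automatically",
    "https://example.com/b": "search engines index the words on each page",
}

# Step 2: the "database" -- an inverted index from word to page URLs.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

# Step 3: the query processor -- return pages containing every query word.
def search(query):
    results = None
    for word in query.lower().split():
        matches = index.get(word, set())
        results = matches if results is None else results & matches
    return results or set()

print(search("web pages"))   # -> {'https://example.com/a'}
[/code]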