Old 06-21-2018, 09:11 PM   #3
phaman6
Registered User
 
Join Date: Jun 2018
Posts: 21
Crawling is the process by which a bot discovers new and updated pages to be added to the search engine's index.

Search engines use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called a bot (also known as a robot, crawler, or spider). Bots follow an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

The crawl process begins with a list of web page URLs, generated from previous crawls and augmented with Sitemap data provided by webmasters. As the bot visits each of these pages, it detects the links on the page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the search engine's index.
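The loop described above (seed URLs, fetch, extract links, queue new ones) can be sketched in a few lines of Python. This is a minimal illustration, not a real crawler: the `PAGES` dictionary stands in for HTTP fetching, and real bots also handle robots.txt, politeness delays, and revisit scheduling.

```python
from collections import deque
from html.parser import HTMLParser

# Simulated web: URL -> HTML body. A real crawler would do an HTTP GET
# here (and honor robots.txt); this dict is a stand-in for the sketch.
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a>',
    "http://example.com/a": '<a href="http://example.com/b">B</a>',
    "http://example.com/b": '<a href="http://example.com/">home</a>',
}

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls):
    """Breadth-first crawl: fetch each URL once, queue newly found links."""
    frontier = deque(seed_urls)   # pages still to crawl
    seen = set(seed_urls)         # avoids re-crawling the same page
    index = {}                    # URL -> outgoing links (a toy "index")
    while frontier:
        url = frontier.popleft()
        body = PAGES.get(url)
        if body is None:
            continue              # dead link: note it and move on
        parser = LinkExtractor()
        parser.feed(body)
        index[url] = parser.links
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

index = crawl(["http://example.com/"])
```

Starting from the single seed URL, the crawler discovers the other two pages through their links, visits each page exactly once, and stops when the frontier is empty.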