How Does Googlebot Work?
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web; the program that does the fetching is called Googlebot.
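To give a feel for what "fetching" a page involves, here is a minimal sketch of one piece of it: pulling the links out of a downloaded page. This is an illustrative toy, not Google's actual code, and the sample HTML below is made up.

```python
from html.parser import HTMLParser

# Toy link extractor: collects the href of every <a> tag it sees.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Sample page standing in for a fetched document.
page = """
<html><body>
  <a href="/about">About</a>
  <a href="https://example.com/blog">Blog</a>
</body></html>
"""

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/about', 'https://example.com/blog']
```

A real crawler would of course download the page over HTTP first and normalize the extracted URLs before using them.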
Googlebot uses sitemaps and databases of links discovered during previous crawls to determine where to go next. Whenever the crawler finds new links on a site, it adds them to the list of pages to visit next.
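The scheduling described above is essentially a queue-based graph traversal: start from seed URLs (such as those in a sitemap), and whenever a page reveals new links, append them to the list of pages to visit next. A toy sketch, using a made-up in-memory link graph instead of real network fetches:

```python
from collections import deque

# Hypothetical link graph: which URLs each page links to.
LINK_GRAPH = {
    "https://example.com/": ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": [],
    "https://example.com/c": ["https://example.com/"],
}

def crawl(seeds):
    frontier = deque(seeds)   # list of pages to visit next
    visited = set()           # pages already crawled
    order = []                # the order pages were crawled in
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        order.append(url)
        # New links found on this page join the frontier.
        for link in LINK_GRAPH.get(url, []):
            if link not in visited:
                frontier.append(link)
    return order

print(crawl(["https://example.com/"]))
# ['https://example.com/', 'https://example.com/a',
#  'https://example.com/b', 'https://example.com/c']
```

The real system layers a lot on top of this (politeness delays, robots.txt, recrawl scheduling, deduplication), but the frontier-queue idea is the core of it.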
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.