06-22-2017, 04:09 AM   #9
balakumar
A crawler is a program used by search engines to collect data from the web. When a crawler visits a website, it downloads the pages, extracts their content (i.e. the text), and stores it in a database, or index. It also records the internal and external links found on each page, which it then follows to discover further pages.
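To make that flow concrete, here is a minimal sketch of a breadth-first crawler in Python using only the standard library. The names (crawl, PageParser, the example.com start URL, the max_pages limit) are illustrative assumptions, not how any particular search engine implements crawling; a real crawler would also respect robots.txt, rate limits, and duplicate-content rules.

import urllib.request
from urllib.parse import urljoin
from html.parser import HTMLParser
from collections import deque


class PageParser(HTMLParser):
    """Collects visible text and all href links from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Remember every link target so the crawler can follow it later.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep the page text that would go into the index.
        if data.strip():
            self.text_parts.append(data.strip())


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store its text, queue its links."""
    index = {}                      # url -> extracted text (the "database")
    seen = {start_url}
    queue = deque([start_url])

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception as exc:
            print(f"skipping {url}: {exc}")
            continue

        parser = PageParser()
        parser.feed(html)
        index[url] = " ".join(parser.text_parts)

        # Queue both internal and external links found on the page.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=3)
    for url, text in pages.items():
        print(url, "->", text[:80])

The key design point is the queue plus the seen set: the queue holds pages still to visit, and the seen set stops the crawler from fetching the same URL twice, which is what keeps it from looping forever on sites that link back to themselves.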