Search engine robots (crawlers) run around the clock, crawling the pages of websites. The robots.txt file lets you control this. First of all, create a file named robots.txt in the root of your site; then add directives that tell search engine robots which parts of your site they are allowed or disallowed to crawl and index.
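For example, a minimal robots.txt might look like the sketch below. The /admin/ path and the sitemap URL are placeholders for illustration, not values from this article:

    User-agent: *          # these rules apply to all crawlers
    Disallow: /admin/      # do not crawl anything under /admin/
    Allow: /               # everything else may be crawled

    Sitemap: https://www.example.com/sitemap.xml

Here "Allow" and "Disallow" work by path prefix, and the optional Sitemap line points crawlers to your sitemap so they can discover your pages more easily.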
Furthermore, if anything is still unclear, please watch the Robots.txt Video Tutorial, which explains it step by step.