The robots.txt file lets you instruct crawlers about which pages of a site may be crawled. For example, when I was working on a pregnancy website, I used an Allow directive in the robots.txt file for the pages I wanted crawled and a Disallow directive for the pages I didn't want crawled. (Note that "follow" and "nofollow" are link attributes and meta-robots values, not robots.txt directives.)
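A minimal robots.txt along these lines might look as follows; the paths and domain here are hypothetical placeholders, not taken from the actual site:

```
# robots.txt — hypothetical example
# Applies to all crawlers
User-agent: *
# Pages that may be crawled
Allow: /articles/
# Pages that should not be crawled
Disallow: /drafts/
# Optional: point crawlers at the sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. `https://example.com/robots.txt`) for crawlers to find it.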