Old 03-21-2025, 02:34 AM   #2
markstyne
Registered User
 
Join Date: Dec 2012
Location: Los Angeles
Posts: 210
By using the robots.txt file, you can tell crawlers which pages of your site they may crawl and which they should skip. For example, when I was working on a pregnancy website, I added an Allow rule in robots.txt for the pages I wanted crawled and a Disallow rule for the ones I didn't. One thing to keep in mind: "follow" and "nofollow" are not robots.txt directives; they belong to the robots meta tag or the rel="nofollow" link attribute. robots.txt only uses directives like User-agent, Allow, and Disallow.
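For reference, here's a minimal sketch of what such a robots.txt could look like (the domain and paths are made-up examples, not from any real site). Python's standard-library urllib.robotparser can show how a crawler would interpret the rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: allow everything except the /private/ section
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler matching User-agent: * may fetch public pages...
print(parser.can_fetch("*", "https://example.com/pregnancy-tips"))    # True
# ...but not pages under the disallowed /private/ path
print(parser.can_fetch("*", "https://example.com/private/drafts"))    # False
```

In practice you'd place the file at the site root (e.g. https://example.com/robots.txt) so crawlers find it before fetching anything else.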

Last edited by markstyne; 03-21-2025 at 09:19 PM..