How to create Robots.txt file for my website
Hi,
How do I create a robots.txt file for my website? |
There are many tools available to generate a robots.txt file. Just search on Google for "create robots.txt", go to one of the robots.txt generator sites that appears in the results, type in your website domain, and follow the further steps. The robots.txt file will be generated for your website.
|
Use a text editor to create the robots.txt file and add Robots Exclusion Protocol (REP) directives to block content from being crawled by bots. The text file should be saved in ASCII or UTF-8 encoding. Create a list of Disallow directives for the content you want blocked.
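For instance, a minimal robots.txt with a short Disallow list might look like this (the blocked directories are placeholder examples):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Each Disallow line blocks one URL path prefix for every crawler matched by the User-agent line.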
|
Google has a tool for this; submit your website to that tool.
|
A simple way to create a robots.txt file. You can create a robots.txt file for your own website by following these simple steps:
1. Open Notepad/TextPad or another editor tool.
2. Write these lines:
User-agent: *
Disallow:
Allow:
Sitemap: your URL/sitemap.xml
3. Save this file as "robots.txt".
Your robots.txt file is ready; use it. Thanks :) |
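As a quick sanity check, the kind of file produced by the steps above can be tested with Python's built-in urllib.robotparser; the domain, paths, and sitemap URL below are placeholders, not values from the thread:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt content in the same shape as the steps above.
# The Allow line comes before the Disallow line so that parsers which
# match rules in order (like Python's robotparser) see the exception first.
robots_txt = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A path under the blocked directory is disallowed for any crawler...
print(rp.can_fetch("*", "https://www.example.com/private/secret.html"))       # False
# ...but the explicitly allowed page inside it is permitted...
print(rp.can_fetch("*", "https://www.example.com/private/public-page.html"))  # True
# ...and everything else is crawlable by default.
print(rp.can_fetch("*", "https://www.example.com/index.html"))                # True
```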
Don't use any tools; you should create the robots.txt file manually. If you want to learn about robots.txt, follow this link -- www.robotstxt.org/
|
The syntax for using the keywords is as follows:
User-agent: * [the name of the robot the following rule applies to]
Disallow: [the URL path you want to block]
Allow: [the URL path of a subdirectory, within a blocked parent directory, that you want to unblock] |
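Putting the three keywords together, a sketch of a robots.txt that blocks a parent directory while unblocking one subdirectory might look like this (the paths are placeholders):

```
User-agent: *
Allow: /photos/public/
Disallow: /photos/
```

Here every crawler is blocked from /photos/ except the /photos/public/ subdirectory; placing the Allow line before the Disallow line keeps the result unambiguous for parsers that evaluate rules in order.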
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.