Robots.txt File
The robots.txt file is a plain text file placed in a website's root directory that tells search engine bots which pages they may crawl and which they may not. There are three available keywords (elements) in the robots.txt file: User-agent, Disallow, and Allow, and their syntax is extremely simple. The User-agent keyword (the search engine robot) defines which robot the Disallow and Allow keywords apply to. Google uses several user-agents: Googlebot handles crawling for Google Search, and Googlebot-Image crawls for Google Image Search. Typically, all user-agents are addressed together using a wildcard (asterisk), as in this example: User-agent: *
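For illustration, here is a minimal robots.txt sketch with two groups, one addressed to Googlebot-Image and one to every other robot; the /photos/ directory is a hypothetical placeholder:

# Keep Google Image Search out of a hypothetical /photos/ directory
User-agent: Googlebot-Image
Disallow: /photos/

# All other robots may crawl everything (an empty Disallow blocks nothing)
User-agent: *
Disallow: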
The Allow keyword can be used to permit access to a subdirectory under a parent directory that has been disallowed (see the example after the template below).
The order and syntax of the rules in a robots.txt file are as follows:
User-agent: [the name of the robot the following rule applies to]
Disallow: [the URL path you want to block]
Allow: [the URL path of a subdirectory, within a blocked parent directory, that you want to unblock]
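Filling in the template with hypothetical paths, the following group blocks all robots from an /admin/ directory while keeping its /admin/help/ subdirectory crawlable:

# Block a hypothetical /admin/ directory but unblock one subdirectory
User-agent: *
Disallow: /admin/
Allow: /admin/help/

Google resolves conflicts between Allow and Disallow using the most specific (longest) matching path, so the Allow rule above takes precedence for URLs under /admin/help/.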
How to create a robots.txt file
Check out our guide to robots.txt for SEO.