ACAP (Automated Content Access Protocol), a possible plug-in for the Robots Exclusion Standard, was released as v1.0 on November 30, 2006.
Examples
This example allows all robots to visit all files because the wildcard "*" specifies all robots:
User-agent: *
Disallow:
This example keeps all robots out:
User-agent: *
Disallow: /
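To see how these two rule sets behave in practice, here is a minimal sketch using Python's standard-library urllib.robotparser; the bot name MyBot and the example.com URLs are placeholders:

from urllib.robotparser import RobotFileParser

# Parse the allow-all rules shown above (an empty Disallow allows everything).
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# Parse the keep-all-robots-out rules shown above.
block_all = RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])

print(allow_all.can_fetch("MyBot", "https://example.com/page.html"))  # True
print(block_all.can_fetch("MyBot", "https://example.com/page.html"))  # False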
The next example tells all crawlers not to enter four directories of a website:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/
Disallow: /private/
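Disallow rules of this original form are plain path prefixes: a URL is blocked when its path begins with one of the listed prefixes. The following is a minimal illustrative sketch of that matching logic, not a complete robots.txt parser; the URLs are placeholders:

from urllib.parse import urlparse

# The four directory prefixes disallowed above.
DISALLOWED = ["/cgi-bin/", "/images/", "/tmp/", "/private/"]

def allowed(url: str) -> bool:
    # A URL is blocked when its path starts with any disallowed prefix.
    path = urlparse(url).path or "/"
    return not any(path.startswith(prefix) for prefix in DISALLOWED)

print(allowed("https://example.com/tmp/scratch.txt"))  # False
print(allowed("https://example.com/blog/post.html"))   # True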
This example tells a specific crawler not to enter one specific directory:
User-agent: BadBot # replace 'BadBot' with the actual user-agent of the bot
Disallow: /private/
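Because the rules above name only BadBot, crawlers with no matching group are unaffected. A quick check with urllib.robotparser (OtherBot and the URL are placeholders):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: BadBot", "Disallow: /private/"])

# Only the named bot is blocked; other crawlers fall through to the default of allowed.
print(rp.can_fetch("BadBot", "https://example.com/private/a.html"))    # False
print(rp.can_fetch("OtherBot", "https://example.com/private/a.html"))  # True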
This example tells all crawlers not to enter one specific file:
User-agent: *
Disallow: /directory/file.html
Note that all other files in the specified directory will still be crawled.
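This can be verified with urllib.robotparser: only the named file is blocked, while a sibling file in the same directory remains fetchable (the bot name and file names are placeholders):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /directory/file.html"])

print(rp.can_fetch("MyBot", "https://example.com/directory/file.html"))   # False
print(rp.can_fetch("MyBot", "https://example.com/directory/other.html"))  # True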
This example demonstrates how comments can be used:
# Comments appear after the "#" symbol at the start of a line, or after a directive
User-agent: * # match all bots
Disallow: / # keep them out
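Parsers discard everything from the "#" to the end of the line, so the commented file above is equivalent to the earlier keep-all-robots-out example. A quick check with urllib.robotparser (MyBot and the URL are placeholders):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "# Comments appear after the \"#\" symbol at the start of a line, or after a directive",
    "User-agent: * # match all bots",
    "Disallow: / # keep them out",
])

print(rp.can_fetch("MyBot", "https://example.com/"))  # False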