This page is under construction. For PmWiki v2.1 (and later) it is intended to describe "robots" (webcrawlers) in general and the settings available to control them.
The default PmWiki skin adds rel="nofollow" to all of the action links it generates (the edit, diff, upload, and print links at the top and bottom of each page). This is a convention introduced by Google: it tells a crawler not to give any weight to the content reached through a "nofollow" link. Note that many robots, including Yahoo! Slurp and msnbot, completely ignore the nofollow attribute on links.
The anti-spam convention that first introduced "nofollow" doesn't say anything about robots not following the links, only that links with rel="nofollow" shouldn't be given any weight in search results.
You can also add rel=nofollow in a [[wiki style(s)]], viz. %rel=nofollow%, to specify this on an individual link.
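For example, something like the following (a sketch only; the exact scoping of wikistyles is described in the PmWiki WikiStyles documentation, and rendering may vary by skin):

```
%rel=nofollow% [[http://example.com/ | an external link]]
```

which PmWiki renders as roughly `<a rel='nofollow' href='http://example.com/'>an external link</a>`, so search engines give the target no weight.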
Here's one example of the contents of a robots.txt file that primarily tells robots to ignore links containing '?action=' as well as search and RecentChanges pages:
    User-agent: *
    Disallow: /pmwiki.php/Main/AllRecentChanges
    Disallow: /pmwiki.php/
    Disallow: */search
    Disallow: SearchWiki
    Disallow: *RecentChanges
    Disallow: RecentChanges
    Disallow: *action=
    Disallow: action=
    User-Agent: W3C-checklink
    Disallow:
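Assuming these rules are served at the site root, one way to sanity-check them is Python's standard urllib.robotparser (note that it only implements the classic prefix-matching exclusion spec, so the wildcard `*` entries above are not interpreted the way Google's crawler would interpret them; example.com is a placeholder host):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the example above.
rules = """\
User-agent: *
Disallow: /pmwiki.php/Main/AllRecentChanges
Disallow: /pmwiki.php/
Disallow: */search
Disallow: SearchWiki
Disallow: *RecentChanges
Disallow: RecentChanges
Disallow: *action=
Disallow: action=
User-Agent: W3C-checklink
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers are blocked from wiki pages by the /pmwiki.php/ prefix.
print(rp.can_fetch("SomeBot", "http://example.com/pmwiki.php/Main/HomePage"))
# The W3C link checker is allowed everything (an empty Disallow allows all).
print(rp.can_fetch("W3C-checklink", "http://example.com/pmwiki.php/Main/HomePage"))
```

The empty `Disallow:` under `User-Agent: W3C-checklink` is the standard way to exempt a specific, well-behaved agent from the general rules.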
PmWiki now provides better control of robot (webcrawler) interactions with a site, to reduce server load and bandwidth, via several configuration settings:
- used to detect robots based on the user-agent string
- any actions not listed in this array will return a 403 Forbidden response to robots
- setting this flag will eliminate any forbidden ?action= values from page links returned to robots, which will reduce bandwidth loads from robots even further (PITS:00563).
- see layout variables
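The mechanism these settings describe can be sketched as follows. This is an illustrative model only: PmWiki itself implements it in PHP, and the pattern, action list, and function names below are hypothetical, not PmWiki's actual configuration values.

```python
import re

# Hypothetical stand-ins for the settings described above.
ROBOT_PATTERN = re.compile(r"googlebot|slurp|msnbot|crawler|spider", re.I)
ROBOT_ALLOWED_ACTIONS = {"browse", "rss", "dc"}

def is_robot(user_agent: str) -> bool:
    """Detect a robot from its User-Agent string."""
    return bool(ROBOT_PATTERN.search(user_agent or ""))

def response_status(user_agent: str, action: str) -> int:
    """Robots requesting an action outside the allowed set get 403 Forbidden."""
    if is_robot(user_agent) and action not in ROBOT_ALLOWED_ACTIONS:
        return 403
    return 200

# Regex matching forbidden ?action= query strings in links.
_FORBIDDEN_ACTION = re.compile(
    r"\?action=(?!(?:%s)\b)\w+" % "|".join(sorted(ROBOT_ALLOWED_ACTIONS)))

def cloak_action_links(html: str, user_agent: str) -> str:
    """For robots, strip forbidden ?action= values from page links,
    mirroring the "cloak actions" flag described above."""
    if not is_robot(user_agent):
        return html
    return _FORBIDDEN_ACTION.sub("", html)
```

With this sketch, a crawler asking for `?action=edit` receives a 403, while a human visitor does not, and pages served to crawlers simply omit the forbidden action links so the crawler never requests them in the first place.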
- Cookbook:Controlling WebRobots - How to control web robots or bots trying to scan files
- Wikipedia:Robots.txt Robots Exclusion Standard
- http://microformats.org/wiki/rel-nofollow, http://www.nonofollow.net/
- Category: Robots