00523: Allow robots.txt entries to come from a wiki page

Summary: Allow robots.txt entries to come from a wiki page
Created: 2005-09-15 04:03
Status: Closed
Category: Feature
From: Isidor
Assigned:
Priority: 4
Version: 2
OS:

Description: As PITS 00522 suggests managing the InterMap entries from a wiki page, it could be a good idea to also manage robots.txt from a dedicated wiki page.
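
For illustration, such a dedicated page would simply hold ordinary robots.txt directives; the page name and path below are only examples, not anything PmWiki defines:

    User-agent: *
    Disallow: /pmwiki/uploads/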

Of course, this feature will only work if PmWiki is able to write the /robots.txt file at the top level of your URL space (see http://www.robotstxt.org/wc/exclusion-admin.html).

If PmWiki does not have this privilege, it would be a good idea to emit default robots META tags on all pages, leaving additional control to the settings described at http://pmwiki.org/wiki/Cookbook/ControllingWebRobots
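
As a rough sketch of that fallback, the robots META tag PmWiki emits can be tuned from local/config.php via the $MetaRobots variable; the value below is only an example:

    <?php if (!defined('PmWiki')) exit();
    # local/config.php (excerpt), example only:
    # ask robots not to index pages or follow their links
    $MetaRobots = 'noindex,nofollow';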


Closed -- PmWiki already generates default META tags for all pages, in order to better control robots.

In general, it's very unlikely that PmWiki would be able to write the /robots.txt file on any particular installation, so rather than doing this in the core I would leave it to a recipe.

Pm November 12, 2007, at 05:00 PM
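
A minimal sketch of such a recipe, assuming the directives live on a page named Site.RobotsTxt and that the web server account may write to the document root (both are assumptions for illustration, not defaults PmWiki provides):

    <?php if (!defined('PmWiki')) exit();
    # Hypothetical recipe sketch (robotstxt.php), included from local/config.php.
    # Copies the markup text of Site.RobotsTxt into <document root>/robots.txt
    # on every request.
    $page = ReadPage('Site.RobotsTxt', READPAGE_CURRENT);   # skip page history
    if ($page && @$page['text'] > '') {
      $target = $_SERVER['DOCUMENT_ROOT'] . '/robots.txt';
      # @ suppresses the warning when the account lacks write permission
      @file_put_contents($target, $page['text']);
    }

A real recipe would also want to rewrite the file only when the page has actually changed, and to restrict who may edit Site.RobotsTxt.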