Cookbook / BlockCrawler
Summary: Redirect web crawlers to different pages
Version:
Prerequisites: Last tested on PmWiki version: 2.0.beta54
Status: missing script
Maintainer: Daniel Scheibler
Discussion: BlockCrawler-Talk?
Question
Can I show web crawlers like Googlebot, MSN Bot and others different pages?
Answer
Do you want to protect some pages from being crawled by web bots?
This script detects the markup (:blockcrawler:) on a page, and if a web crawler tries to spider that page, the crawler is automatically redirected to a special page in the BlockCrawler group. In this way your content is protected from being indexed by (well-known) search engines.
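Since the script itself is currently missing from this page, the following is only a minimal standalone PHP sketch of the underlying idea: match the crawler's User-Agent against a list of known names and redirect it to a page in the BlockCrawler group. The names $BlockCrawlerMap and BlockCrawlerRedirect(), the map entries and the URL are hypothetical and not taken from the original recipe.

<?php
# Sketch only: not the original block_crawler.php.
# Hypothetical map of User-Agent substrings to target pages in the
# BlockCrawler group (the real recipe reads this from BlockCrawler.HomePage).
$BlockCrawlerMap = array(
  'Googlebot' => 'BlockCrawler.HomePage',
  'msnbot'    => 'BlockCrawler.HomePage');

# Redirect a recognized crawler to its target page and stop normal processing.
function BlockCrawlerRedirect($map, $scripturl) {
  $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
  if ($agent == '') return;
  foreach ($map as $name => $target) {
    if (stristr($agent, $name)) {
      header("Location: $scripturl?n=$target");
      exit;
    }
  }
}

# In the real recipe this would only run for pages carrying (:blockcrawler:);
# that check is omitted here for brevity.
BlockCrawlerRedirect($BlockCrawlerMap, 'http://www.example.com/pmwiki/pmwiki.php');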
- block_crawler.zip unpacks into cookbook/block_crawler.php, wikilib.d/BlockCrawler.HomePage and several special pages in wikilib.d/BlockCrawler.*
- Add the following line to config.php:
  @include_once("$FarmD/cookbook/block_crawler.php");
- BlockCrawler/HomePage is the configuration page, with the following syntax (see the example after this list):
- crawler name -> wiki page in group BlockCrawler to show -> optional comments
- There are three configuration variables (a sketch of overriding them in config.php follows below):
- SDV($BlockCrawlerGroup, 'BlockCrawler'); - name of the BlockCrawler group
- SDV($BlockCrawlerConfigPage, 'HomePage'); - name of the BlockCrawler config page
- SDV($BlockCrawlerConfigItemize, true); - whether the config page is written as a bullet list
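For illustration, an entry on the configuration page could look like the line below; the crawler name, target page and comment are only examples, and with $BlockCrawlerConfigItemize left at true each entry is written as a bullet list item:

* Googlebot -> Googlebot -> send Google's crawler to BlockCrawler.Googlebot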
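The defaults above are set with SDV(), so they can be changed by assigning the variables in config.php before the script is included. A minimal sketch, assuming you want a different group and a plain-line config page (the group and page names are only illustrative):

$BlockCrawlerGroup = 'NoCrawlers';        # group that holds the redirect pages
$BlockCrawlerConfigPage = 'Config';       # read the crawler list from NoCrawlers.Config
$BlockCrawlerConfigItemize = false;       # config page entries are plain lines, not bullets
@include_once("$FarmD/cookbook/block_crawler.php");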