Summary: Talk page for BackupPages.

This space is for user-contributed commentary and notes. Please include your name and a date along with your comment.


Is it not possible to protect the backup action with a password, just as the edit or attr actions can be protected? Would this require rewriting the pmwiki core? Francis February 22, 2007, at 09:23 AM

I'm not greatly in favor of adding admin functions directly to pmwiki; instead I created the script below to back up and restore the wiki.d directory. It uses the PEAR File_Archive package.

<title>Admin Functions</title>
<h1>pmwiki Administration Functions</h1>
<form method="post" action="<?php echo $_SERVER['PHP_SELF']; ?>">
Backup Wiki <input type="radio" name="admin" value="backup"><br>
Restore Wiki <input type="radio" name="admin" value="restore"><br>
<input type="submit" value="Submit">
<input type="hidden" name="_submit_check" value="1">
</form>

<?php
ini_set("include_path", $_SERVER['DOCUMENT_ROOT'] . "/pear_includes/");

require_once "File/Archive.php";

if (array_key_exists('_submit_check', $_POST)) {
  if ($_POST['admin'] == 'backup') {
    echo "<b>archiving wiki.d</b>";
    # archive wiki.d here
  } else {
    echo "<b>restoring wiki.d</b>";
    # restore wiki.d here
  }
}
?>

davidof September 3, 2005 at 4:44pm
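The backup branch above is only stubbed out with an echo. As a rough illustration of what it could do, here is a minimal sketch using PHP's built-in ZipArchive class rather than the PEAR File_Archive package the script loads (the function name and the recursive-zip approach are assumptions, not part of davidof's script):

```php
<?php
// Minimal sketch: recursively zip a directory (e.g. wiki.d) into one archive.
// zipDirectory() is a hypothetical helper, not part of the original script.
function zipDirectory($srcDir, $zipPath) {
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($srcDir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($files as $file) {
        // store each file under its path relative to $srcDir
        $rel = substr($file->getPathname(), strlen($srcDir) + 1);
        $zip->addFile($file->getPathname(), $rel);
    }
    return $zip->close();
}
```

A restore branch would do the reverse with ZipArchive::extractTo().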

This doesn't work properly with wiki farms. Making a backup saves the zip file in the chosen directory, so making a backup from more than one farm wiki will overwrite the previous backup.

kt007 March 13, 2005, at 10:34 AM
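One hypothetical way to keep backups from different farm wikis from overwriting each other is to build the archive name from each wiki's $WikiTitle (PmWiki's site title variable). The helper below is a sketch, not part of the recipe:

```php
<?php
// Sketch: build a per-wiki archive name so farm backups don't collide.
// farmBackupName() is a hypothetical helper.
function farmBackupName($wikiTitle, $format) {
    $title = str_replace(" ", "", $wikiTitle);  // strip spaces from the title
    return $title . "_" . date("m-d-Y") . $format;
}
```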

I'll try to solve the problem in the next version, which will allow backing up the wiki.d directory, the files directory, or both, and will add password protection too. For farms I'll have to set up a simple local one to test with, but since the library used to zip files allows appending, it's just an option to add to the command line or to the internal option configuration. Thanks for the note.

SteveAgl March 13, 2005, at 11:10 AM

This seems to always generate a zip file regardless of the extension specified. Looks like the code should switch on the format to determine the archive class to use instead of always using zip_file. Also, the link generated for $ShowLink was incorrect; I fixed it with the following:

if ($ShowLink) {
    # use instead of $ScriptUrl.$BackupDir
    $BackupUrl = preg_replace('#/[^/]*$#', $BackupDir, $ScriptUrl, 1);
    echo "<br /><br />You can download the backup file now:
        <a href='".$BackupUrl.$BackupFile.$BackupFormat."'>".$BackupUrl.$BackupFile.$BackupFormat."</a>";
}

KevinWatts April 15, 2005, at 11:52 PM
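For illustration, here is how that preg_replace behaves with a hypothetical $ScriptUrl and $BackupDir (both values are made up for the example):

```php
<?php
// The regex strips the trailing script name ("/pmwiki.php") from the URL
// before the backup directory is substituted in; example values are assumptions.
$ScriptUrl = "http://example.com/wiki/pmwiki.php";
$BackupDir = "/backup/";
$BackupUrl = preg_replace('#/[^/]*$#', $BackupDir, $ScriptUrl, 1);
// $BackupUrl is now "http://example.com/wiki/backup/"
```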

It would be useful if one could specify just a directory, i.e. the root directory of the wiki, and the script would archive all subdirectories (excluding the archive directory itself) to make a complete snapshot of a wiki including the scripts, cookbook, etc.

Schlaefer April 17, 2005, at 05:51 AM

If your wiki's name has a space in it, the zip file will have that space in its name, too. This may be a problem, because some FTP clients (e.g. SmartFTP) won't do anything with such files (neither deleting nor renaming nor downloading). The solution is simple:

if ($DateFormat == 1) {
    SDV($BackupFile, str_replace(" ", "", $WikiTitle)."_".date("m-d-Y"));
} else {
    SDV($BackupFile, str_replace(" ", "", $WikiTitle)."_".date("d-m-Y"));
}

JustusvV? June 14, 2005, at 08:54 AM

The farm issue could be fixed by changing the following lines:

    # Set the local path to backup directory
    SDV($BackupDirPath, $_SERVER['DOCUMENT_ROOT'].$BackupDir);  // for farms

In my opinion this Cookbook recipe should require the admin password. Spaces are not the only problematic characters in the archive name; ' and ` could also cause issues. Isidor

To protect this cookbook and only allow admins to back up the wiki, you can add these lines:

    if (strpos($DefaultPasswords['admin'], $GLOBALS['authid']) !== FALSE) {
        // Current Cookbook code here.
    }


For my own needs I wrote an enhanced version of this script that saves all useful directories (as mentioned in BackupAndRestore) and fixes some of the bugs mentioned here.

You can find it at

If you find it useful, maybe it's worth updating this page (or creating a new one...).

Nicolas August 15, 2006

  • I found it useful - more useful than the one on this page - why not make it the default for this page? Francis September 01, 2006, at 10:14 PM

uploads2zip mod - download all files in a group

hi. the users on my site can create their own groups, and then upload lots of files to their pages for others to download. i've always wanted to be able to download all of the files from a single group, instead of having to do so one-by-one. i just modded this recipe (the one provided by Nicolas) in a way that i've found useful for doing just that. the redirect page is still pretty ugly/unglamorous, but it seems to work alright, by simply calling the current page as such (perhaps through a link on the page):


note that this recipe mod only works if either a) you DON'T have the original BackupPages recipe already installed, or b) if you create a conditional in config.php to only load this recipe in certain situations, and the traditional backup recipe in all other cases, so that they don't get called at the same time - for example:

     $group = PageVar($pagename,'$Group');
     if (CondAuth($pagename,'admin') && ($group=="Site")) {
        // maybe only allow backup recipe by admin, and from a Site page
        ## backup
     } else {
        // back up a single group's uploads when this action is called;
        // this can be called from any group that isn't "Site"
        ## action=uploads2zip
     }

(there's probably a simple way around this, perhaps by integrating both action-calls into the same file, but i'll leave that to another weekend code-warrior to figure out as this currently works for my needs.)
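That "simple way around this" might look something like the sketch below: dispatch on $action in one place so the two recipes never collide. The function and its return values are assumptions, just to show the shape; each branch would wrap the corresponding recipe code:

```php
<?php
// Sketch: decide which recipe branch should run for a given request.
// dispatchBackupAction() is a hypothetical helper wrapping both recipes.
function dispatchBackupAction($action, $isAdmin, $group) {
    if ($action == 'backup' && $isAdmin && $group == 'Site') {
        return 'backup';        // original BackupPages recipe
    } elseif ($action == 'uploads2zip' && $group != 'Site') {
        return 'uploads2zip';   // the uploads2zip mod
    }
    return 'none';              // neither action applies here
}
```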

so, here's the code for the part that i modded, with a lot of comments and such from the original recipe deleted in order to keep this more concise:

   if ($action=='uploads2zip') {
	#Personalize the script here

	# get Groupname if it's not already declared in your config.php
	if(is_null($group)) { $group = PageVar($pagename,'$Group'); }

	# Define the directory where the backup files should be stored

	# Set the local path to backup directory

	# BackupUrl (useful only if ShowLink = TRUE) :
	# URL to get back the archive.
	SDV($BackupUrl, ''.$BackupDir);     // custom to your needs !...

	# Define the format of the compressed file

	# Define directories to backup
	SDV($DirsToBackup, array("uploads/".$group));  // download files from group you're viewing

	# Echoes the link to download backup file?
	SDV($ShowLink,TRUE);   // Yes, show the download link

	//SDV($TraceOn,TRUE); // more verbose, for debug
	SDV($TraceOn,FALSE); // less verbose

	echo "<h2>File Archive for {$group}</h2>";		

	# Check if $BackupDirPath is a directory
	if (!is_dir($BackupDirPath)) {
		die($BackupDirPath." is not a directory or it doesn't exist");
	}

	# Check if $BackupFormat is in the correct format
	if ($BackupFormat != ".tar" && $BackupFormat != ".tgz" &&
	    $BackupFormat != ".tbz" && $BackupFormat != ".zip") {
		die($BackupFormat." is not a correct extension for the compressed backup file");
	}

	# Define BackupFile name, with date format and backup format
	SDV($BackupFile, $group."_archive".$BackupFormat);

	# Remove "bad" characters from $BackupFile name (some of these
	# characters are mentioned in the comments above)
	$badChars = array(
		"'",    // single quote causes problems in the URL link (when $ShowLink = TRUE)
		" ",    // space may cause problems with some FTP clients (e.g. SmartFTP)
		"`",    // back quote may cause problems
		"<", ">", "|", "*", "\"", ":", "?", "/", "\\",  // not allowed in Windows file names
	);
	$BackupFile = str_replace($badChars, "", $BackupFile);

	trace("<p>BackupDir = ".$BackupDir."</p>");
	trace("<p>BackupDirPath = ".$BackupDirPath."</p>");
	trace("<p>BackupFile = ".$BackupFile."</p>");
	echo "<p>Directories to archive = ";
	foreach ($DirsToBackup as $dir) {
		echo $dir." ";
	}
	echo "</p>";

    $temp = new zip_file($BackupDirPath.$BackupFile);

    trace("<p>Scanning directories :");
    foreach ($DirsToBackup as $dir) {
        # add this directory's files to $temp here
        # (archive-building code as in the original recipe)
    }

	if ($ShowLink) {
		echo "Archiving successful...<br />";
		echo "Download archive for {$group}: <a href='".$BackupUrl.$BackupFile."'>"
			.$BackupUrl.$BackupFile."</a><br /><br />";
		echo "Return to <a href='".$ScriptUrl."/".$pagename."'>".$group."</a>";
	} else {
		echo "<br/><br/>Backup done. Now you can FTP the backup file.";
	}
   }

things that would make this even more useful to me, that i don't know how to do:

  • would be great if one could prevent the final redirect to the ugly non-html'd page, and instead have the file automatically download to the user's desktop...
  • integration of this recipe into the other recipe, or a way to be able to call this from config.php, so that both the original recipe and this one can coexist without redeclaring variables...

overtones99 May 08, 2008, at 06:40 AM
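Regarding the first wish above: a browser can be made to download the archive directly by sending attachment headers before streaming the file. The sketch below only builds the header strings (buildDownloadHeaders() is an invented name; only the HTTP header names themselves are standard):

```php
<?php
// Sketch: headers that make the browser download the zip instead of
// navigating to a page. buildDownloadHeaders() is a hypothetical helper.
function buildDownloadHeaders($zipPath, $size) {
    return array(
        'Content-Type: application/zip',
        'Content-Disposition: attachment; filename="' . basename($zipPath) . '"',
        'Content-Length: ' . $size,
    );
}
// In the recipe one would then do something like:
//   foreach (buildDownloadHeaders($path, filesize($path)) as $h) header($h);
//   readfile($path);
//   exit;
```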
