How to back up your wiki based on MediaWiki

This article follows a previous one: How I set up a wiki for Game of Roles, the French "Critical Role".

Now that you have a very nice wiki, you want to set up regular backups in case something goes south. Thanks to Docker and the way we mounted our volumes last time, it will be pretty easy.

To do a good backup of your wiki, we are going to:

  1. Back up the SQLite database
  2. Dump the wiki content as XML
  3. Package the SQLite backup, the XML dump, LocalSettings.php, and the extensions and images folders
  4. Send the compressed package to your Google Drive
  5. Clean up the backup files

Let's start.

  1. Back up the SQLite database

     docker exec my-wiki php maintenance/sqlite.php --backup-to /var/www/data/backup.sqlite.bak
    

    Note: This is a hot backup process. It can freeze your database for a couple of seconds.
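
    If you have the sqlite3 CLI available on the host, you can quickly sanity-check the backup file. A minimal sketch, assuming the container's /var/www/data is mounted at /wiki/data on the host as in the previous article:

     sqlite3 /wiki/data/backup.sqlite.bak "PRAGMA integrity_check;"

    It should print ok if the copy is consistent.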

  2. Dump the wiki content as XML

    The following command will export all the pages into a compressed XML dump.

     docker exec my-wiki php maintenance/dumpBackup.php --full --uploads --output=gzip:/var/www/data/dump.xml.gz
    
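
    To make sure the dump is not empty, you can peek at its first lines from the host (again assuming the data volume is mounted at /wiki/data):

     zcat /wiki/data/dump.xml.gz | head -n 20
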
  3. Package the SQLite backup, the XML dump, LocalSettings.php, and the extensions and images folders

    First, we generate a filename with a timestamp; then, from the wiki folder, we archive and compress all the backup components.

     FILENAME=backup-$(date +"%Y%m%d%H%M%S").tar.gz
     cd /wiki
     tar cvzf $FILENAME data/backup.sqlite.bak data/dump.xml.gz LocalSettings.php extensions images
    
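
    Before uploading, you can list the archive's contents to double-check that everything made it in:

     tar tzf $FILENAME | head
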
  4. Send the compressed package to your Google Drive

    With the tool gdcp, we upload the compressed archive to our Google Drive.

     gdcp upload -p __FOLDER_ID__ $FILENAME
    

    Note: I'll let you check the gdcp documentation to learn how to authenticate and how to get the ID of the folder where you will store your backups.

  5. Clean up the backup files

     rm $FILENAME data/backup.sqlite.bak data/dump.xml.gz
    

Once you have everything working, you can gather all the above commands in a backup.sh script and create a cron job to trigger it regularly.
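
For reference, here is what such a backup.sh could look like. It is a minimal sketch that simply chains the commands from the previous steps, with the same container name, paths, and __FOLDER_ID__ placeholder:

#!/bin/bash
# backup.sh - back up a Dockerized MediaWiki and ship the archive to Google Drive
set -e

cd /wiki

# 1. Hot-backup the SQLite database
docker exec my-wiki php maintenance/sqlite.php --backup-to /var/www/data/backup.sqlite.bak

# 2. Dump the wiki content as compressed XML
docker exec my-wiki php maintenance/dumpBackup.php --full --uploads --output=gzip:/var/www/data/dump.xml.gz

# 3. Package everything with a timestamped name
FILENAME=backup-$(date +"%Y%m%d%H%M%S").tar.gz
tar cvzf $FILENAME data/backup.sqlite.bak data/dump.xml.gz LocalSettings.php extensions images

# 4. Upload to Google Drive
gdcp upload -p __FOLDER_ID__ $FILENAME

# 5. Clean up
rm $FILENAME data/backup.sqlite.bak data/dump.xml.gz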

For example, I use the following cron file to back up my wiki hourly:

PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin:/root/bin
1 * * * * /wiki/backup.sh

Note: the PATH line helps cron find docker, gdcp, and the other tools.
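
If you keep those two lines in a file, say /wiki/backup.cron (a hypothetical path), one way to install them is:

crontab /wiki/backup.cron

Note that this replaces the current user's crontab, so merge with crontab -e instead if you already have other jobs.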

Voilà, you are all set with your wiki now. I'll detail the restoration process in a future article.
