
My server backup script

My server runs a bunch of services, but only a few of them need to be backed up:

  • Nextcloud
  • Dokuwiki (this wiki)
  • Ghost
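
The script assumes the AWS CLI already has the Wasabi access keys configured and that the GPG passphrase lives in a file only root can read. A one-time setup sketch (the passphrase and its path here are placeholders, not the real ones):

# Store the symmetric passphrase where only root can read it
echo 'my-strong-passphrase' > /root/.backup-passphrase
chmod 600 /root/.backup-passphrase
 
# Enter the Wasabi access key and secret when prompted
aws configure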

This is the script I use to back everything up:

Script

#!/bin/sh
 
# Nextcloud
echo "======================================"
echo "Backing up Nextcloud"
cd /var/lib/docker/volumes/nextcloud_nextcloud/_data/data/roger || exit 1 # bail out instead of tarring the wrong directory
 
NEXTCLOUD_FILE_NAME=$(date +"%Y_%m_%d")_nextcloud_backup
echo $NEXTCLOUD_FILE_NAME
 
echo "Compressing"
tar czf /root/$NEXTCLOUD_FILE_NAME.tar.gz files/
 
echo "Encrypting"
gpg --passphrase-file the/location/of/my/passphrase --batch -c /root/$NEXTCLOUD_FILE_NAME.tar.gz 
 
echo "Uploading"
aws s3 cp /root/$NEXTCLOUD_FILE_NAME.tar.gz.gpg s3://backups-cloud/Nextcloud/$NEXTCLOUD_FILE_NAME.tar.gz.gpg --endpoint-url=https://s3.wasabisys.com
 
echo "Deleting"
rm /root/$NEXTCLOUD_FILE_NAME.tar.gz /root/$NEXTCLOUD_FILE_NAME.tar.gz.gpg
 
# Dokuwiki
echo "======================================"
echo "Backing up Dokuwiki"
cd /data/docker || exit 1
 
DOKUWIKI_FILE_NAME=$(date +"%Y_%m_%d")_dokuwiki_backup
 
echo "Compressing"
tar czf /root/$DOKUWIKI_FILE_NAME.tar.gz dokuwiki/
 
echo "Encrypting"
gpg --passphrase-file the/location/of/my/passphrase --batch -c /root/$DOKUWIKI_FILE_NAME.tar.gz 
 
echo "Uploading"
aws s3 cp /root/$DOKUWIKI_FILE_NAME.tar.gz.gpg s3://backups-cloud/Dokuwiki/$DOKUWIKI_FILE_NAME.tar.gz.gpg --endpoint-url=https://s3.wasabisys.com
 
echo "Deleting"
rm /root/$DOKUWIKI_FILE_NAME.tar.gz /root/$DOKUWIKI_FILE_NAME.tar.gz.gpg
 
# Ghost
echo "======================================"
echo "Backing up Ghost"
cd /root || exit 1
 
GHOST_FILE_NAME=$(date +"%Y_%m_%d")_ghost_backup
 
# Copy the Ghost content out of the running container, then dump its database next to it
docker container cp ghost_ghost_1:/var/lib/ghost/ $GHOST_FILE_NAME
docker exec ghost_db_1 /usr/bin/mysqldump -u root --password=my-secure-root-password ghost > /root/$GHOST_FILE_NAME/ghost.sql
 
echo "Compressing"
tar czf /root/$GHOST_FILE_NAME.tar.gz $GHOST_FILE_NAME/
 
echo "Encrypting"
gpg --passphrase-file the/location/of/my/passphrase --batch -c /root/$GHOST_FILE_NAME.tar.gz
 
echo "Uploading"
aws s3 cp /root/$GHOST_FILE_NAME.tar.gz.gpg s3://backups-cloud/Ghost/$GHOST_FILE_NAME.tar.gz.gpg --endpoint-url=https://s3.wasabisys.com
 
echo "Deleting"
rm -r /root/$GHOST_FILE_NAME.tar.gz $GHOST_FILE_NAME /root/$GHOST_FILE_NAME.tar.gz.gpg
 
echo "======================================"
echo "ALL DONE"
echo "======================================"

Basic explanation

All three backups work almost the same way:

  1. cd into the directory that holds the data
  2. tar it
  3. encrypt it with gpg
  4. upload it to a Wasabi bucket
  5. delete the local files

Simple enough.
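
Since the three blocks differ only in their paths, the whole pattern could be folded into a single function. A sketch under that assumption (the backup() helper and its arguments are mine, not part of the script above, and the file names come out slightly different):

backup() {
  # $1: bucket folder name, $2: directory to archive
  NAME=$(date +"%Y_%m_%d")_$1_backup
  tar czf /root/$NAME.tar.gz -C "$(dirname "$2")" "$(basename "$2")"
  gpg --passphrase-file the/location/of/my/passphrase --batch -c /root/$NAME.tar.gz
  aws s3 cp /root/$NAME.tar.gz.gpg s3://backups-cloud/$1/$NAME.tar.gz.gpg --endpoint-url=https://s3.wasabisys.com
  rm /root/$NAME.tar.gz /root/$NAME.tar.gz.gpg
}
 
backup Nextcloud /var/lib/docker/volumes/nextcloud_nextcloud/_data/data/roger/files
backup Dokuwiki /data/docker/dokuwiki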

The only one that does something different is the Ghost backup, which works like this:

  1. copy the info from the docker container
  2. create a db dump from the docker container
  3. tar it
  4. encrypt it with gpg
  5. upload it to a Wasabi bucket
  6. delete the local files
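
For completeness, restoring is the same steps in reverse: download, decrypt, untar. A sketch assuming the same passphrase file and bucket layout (the date in the file name is made up):

aws s3 cp s3://backups-cloud/Nextcloud/2019_11_18_nextcloud_backup.tar.gz.gpg . --endpoint-url=https://s3.wasabisys.com
gpg --passphrase-file the/location/of/my/passphrase --batch -o 2019_11_18_nextcloud_backup.tar.gz -d 2019_11_18_nextcloud_backup.tar.gz.gpg
tar xzf 2019_11_18_nextcloud_backup.tar.gz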

Cron

I run this script on Mondays at midnight with the following crontab entry:

0 0 * * 1 sh /location/of/my/script/backup.sh
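
Cron mails the script's output rather than displaying it anywhere (and drops it silently if no MTA is set up), so appending a redirect keeps a log of all the echo lines; the log path here is just an example:

0 0 * * 1 sh /location/of/my/script/backup.sh >> /var/log/backup.log 2>&1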

Notes

I know this can be greatly improved, but right now it works. I'll keep updating this wiki each time I change something.
