Iterative Remote Backups With Duplicity

Terms

  • Primary/Source - these terms refer to the server/files that we are backing up.
  • Backup/Slave - these terms refer to the server that will host the backup files.

Introduction

There are many backup tutorials on the net, but none seem to resolve the problem I have been struggling with: I wish to automate my backups such that if either my "source" or my "backup" server is compromised, the data on the other server will still be safe. This means that I cannot simply use an unprotected SSH key to allow one of the servers to directly access the content of the other, as that would let an attacker remove all the backup and source files if either server was compromised. One could use a password-protected SSH key for direct communication, together with a tool such as keychain to enable automation, but that would require the administrator to enter the passphrase every time the server was restarted.

To solve the issue, I am going to perform an encrypted iterative backup of the source material to a local folder with duplicity. The files will be stored in a folder that is accessible to the remote backup server through a passwordless SSH key, although making the folder publicly available, such as over FTP, would also work and still be safe because the files are encrypted. The "backup" server will then transfer the files to itself, in this case by using rsync. The source server will use a GPG public key to encrypt the files and will not hold the private key.

If an attacker managed to compromise the backup server, they would only have access to the encrypted files, which they could delete but not read; they would not be able to read or delete the source material on the source server. If the attacker were to gain access to the source server, they would have access to the original content, but they would not be able to remove the files from the backup server. And if the attacker were to delete the original content and the backup server synced this, the backup server would still have the backups from before the deletion.

Configure the Primary Server

First, we need to install the necessary packages. Unfortunately, duplicity's default package on Ubuntu 14.04 is version 0.6.23, which requires the private key in order to take an encrypted incremental backup. We can resolve this by adding the stable repository, which allows us to install version 0.7.01, which does not have this issue.

sudo apt-add-repository ppa:duplicity-team/ppa -y
sudo apt-get update
sudo apt-get install duplicity -y

Create a GPG key on your local machine if you don't already have one that you want to use. The reason we don't create the GPG key on the server is that it is difficult to generate enough entropy on a headless server.
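If you need to generate a key, recent versions of GnuPG support unattended key generation from a parameter file. The sketch below is illustrative only: the name and email are placeholders, and a real backup key should be given a passphrase instead of the %no-protection directive (which is only there so the demo runs without prompting).

```shell
# Use a throwaway GNUPGHOME so this demo doesn't touch your real keyring.
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"

# Batch parameter file. For a real key, set a passphrase instead of
# %no-protection, and use your own name and email address.
cat > "$GNUPGHOME/keygen.conf" <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Backup Key
Name-Email: blah@my.email.com
Expire-Date: 0
%commit
EOF

# Generate the key pair non-interactively, then confirm it exists.
gpg --batch --gen-key "$GNUPGHOME/keygen.conf"
gpg --list-keys
```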

Export the public key in ASCII (armoured) format.

gpg --list-keys
gpg --export -a "blah@my.email.com"

Copy the output text and paste it into a file on the source server.

Import the public key into GPG for duplicity to use with the following command:

gpg --import [filename]

Use the command below to show that you have the key:

gpg --list-keys

Make sure not to transfer the private key to the server, and that the private key is kept in a secure location, such as an encrypted USB stick.

Create a "backups" user that the backup server will be allowed to login as and who will have access to the encrypted backups.

sudo adduser backups

Make sure to use a randomly generated password. You probably won't need to remember it for long, as we are going to allow key-based authentication only, but in the meantime keep a temporary note of it for reference rather than storing it permanently.
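One quick way to generate such a password, assuming a Linux box with /dev/urandom available:

```shell
# Draw 20 random alphanumeric characters from the kernel's CSPRNG.
password=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 20)
echo "$password"
```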

Before we can use the key, we need to tell GPG that we fully trust it; after all, we sent it to ourselves.

gpg --edit-key $KEY_ID
trust
5
quit

Create a folder with all access permissions in the backups user's home folder. This will essentially turn it into a public folder so that our other users can write encrypted files to it. This is okay because all the files will be encrypted. The reason we don't just set the home folder itself to 777 is that doing so would prevent the remote server from logging in with an SSH key.

sudo mkdir /home/backups/backups
sudo chown backups:backups /home/backups/backups
sudo chmod 777 /home/backups/backups

Save the following BASH backup script somewhere such that you can call it from a cron. Make sure to update the variables in the VARS section as necessary.

#!/bin/bash
# A simple backup script wrapper for duplicity.
# Author - nixCraft  under GPL v2+
# Edited by Programster
# -----------------------------------------------------

## Define VARS ##
_gpg_key='9817322C'

# The path to where the files that we want to backup are
_src='/home/stuart/my_code'

# The path to where we want to store the backup
_target='/home/backups/backups/'

# The full path to the duplicity program
# you shouldn't need to change this if you are using Ubuntu
_duplicity='/usr/bin/duplicity'

# Specify how far back we want to be able to restore to
# e.g. how long a backup lives before it is removed.
_backup_lifetime=60

# Specify how many days of iterative backups are allowed to
# be taken before a full backup has to be taken
_days_between_full_backups=10

####################


## Cleanup ##
echo "cleaning up..."
$_duplicity cleanup --force --encrypt-key="${_gpg_key}" file://${_target}

## Removing old backups##
echo "Removing old backups..."
$_duplicity remove-older-than ${_backup_lifetime}D --force --encrypt-key="${_gpg_key}" file://${_target}

## Backup our files ##
echo "Backing up files..."
$_duplicity --full-if-older-than ${_days_between_full_backups}D --encrypt-key="${_gpg_key}" ${_src} file://${_target}

## Change permissions so that backup user can rsync it later.
echo "Updating permissions..."
chmod -R 777 ${_target}*
echo "done!"

Run the backup script to check that it works.

bash my.backup.script.sh

You should be shown a display of the backup statistics.
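While you are testing, it is also worth checking that you can actually restore. Restores must be performed somewhere the GPG private key is available, such as your local machine. A hypothetical sketch with placeholder paths (duplicity will prompt for the key's passphrase if it has one):

```shell
# Restore the whole backup set into an empty directory.
duplicity restore file:///path/to/backup/files /path/to/restore/into

# Or restore a single file as it existed 3 days ago.
duplicity restore --file-to-restore relative/path/to/file \
    --time 3D file:///path/to/backup/files /tmp/restored_file
```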

If the previous command worked, then add it to your crons so that it occurs automatically!

crontab -e
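For example, a nightly run at midnight might look like this (the script path and log file below are placeholders, adjust to wherever you saved the script):

```shell
# m h dom mon dow   command
0 0 * * * /bin/bash /path/to/my.backup.script.sh >> /var/log/duplicity-backup.log 2>&1
```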

Make sure that your source files are not accessible to the backups user, so that they stay safe if the backup server (which can log in as that user) is compromised.

chmod -R o-rwx /path/to/source/files
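As a quick sanity check that the permissions came out as intended, you can inspect the mode bits afterwards; this demonstrates the effect on a throwaway directory rather than your real source files:

```shell
# Create a throwaway directory and strip all permissions for "other".
demo_dir=$(mktemp -d)
chmod -R o-rwx "$demo_dir"

# The final "other" triplet of the mode string should now read "---".
stat -c '%A' "$demo_dir"
```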

Configure the Backup Server

Install the necessary packages. We need to install rsync so that we can efficiently transfer the files, but we also need to install duplicity since we are going to perform another iterative backup just in case the source files get deleted.

sudo apt-add-repository ppa:duplicity-team/ppa -y
sudo apt-get update
sudo apt-get install duplicity rsync -y

Generate an SSH identity and make sure to not set a passphrase.

ssh-keygen -t rsa

Grant the identity we just created access to the backups user on the source server.

ssh-copy-id backups@[source server IP/hostname]

Create an rsync script that we can use to sync the backup files to a local folder. Thus if the other server ever gets terminated maliciously, we should have a copy here.

#!/bin/bash

## Define VARS ##

# Define the user on the remote host that has the encrypted backup files
_backups_user="backups"

# Define the IP of the remote server that we are trying to backup
_source_server_ip="192.168.1.54"

# The full path to the duplicity program
# you shouldn't need to change this if you are using Ubuntu
_duplicity='/usr/bin/duplicity'

# Specify how far back we want to be able to restore to
# e.g. how long a backup lives before it is removed.
_backup_lifetime=60

# Specify how many days of iterative backups are allowed to
# be taken before a full backup has to be taken
_days_between_full_backups=10

# Specify where on the remote server the encrypted
#  backup files are kept
_remote_backup_folder="/home/${_backups_user}/backups/*"

# Specify where rsync is going to transfer the remote files to
# and is going to be the source of another iterative backup
_latest_backup_folder="$HOME/backups/${_source_server_ip}/latest"

# The path to where we want to store the backup
_target="$HOME/backups/${_source_server_ip}/backups"

####################

# create the local folders if they dont yet exist
mkdir -p ${_latest_backup_folder}
mkdir -p ${_target}

# Rsync the remote files to the local server.
rsync \
--recursive \
--delete \
--times \
--verbose \
--compress \
--force \
-e "ssh -p22" ${_backups_user}@${_source_server_ip}:${_remote_backup_folder} ${_latest_backup_folder}


# Take a backup of the backup, just in case someone removed the files from the remote host. This time there is no need for encryption.

echo "cleaning up..."
$_duplicity cleanup --force --no-encryption file://${_target}

echo "Removing old backups..."
$_duplicity remove-older-than ${_backup_lifetime}D --force --no-encryption file://${_target}

echo "Backing up files..."
$_duplicity --full-if-older-than ${_days_between_full_backups}D --no-encryption ${_latest_backup_folder} file://${_target}

echo "done!"

Test the script by running it, and if it works, add it to your crons. Make sure this cron is scheduled to run after the other server's backup cron has had time to complete, with some buffer time. E.g. running the backup at 00:00 and the sync at 01:00 would be a good idea.

crontab -e
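Following the example timings above (backup at 00:00, sync at 01:00), the backup server's entry might look like this (the script path and log file are placeholders):

```shell
# m h dom mon dow   command
0 1 * * * /bin/bash /path/to/my.sync.script.sh >> /var/log/backup-sync.log 2>&1
```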

Conclusion

It took a long time, but you should now have secure backups such that if either your backup server or your source server is compromised, you should still be able to access a copy of your data. We can only guarantee that the data won't be readable by the third party if they compromised the backup server, or the backups user on the source server.

Author

Programster

Stuart is a software developer with a passion for Linux and open source projects.
