Backing Up S3 Bucket With Duplicity
The following guide shows you how to create iterative (incremental) backups of one S3 bucket to another, without needing to hold the full contents of either bucket locally. We achieve this by using s3fs to mount the source bucket, and backing it up straight to another S3 bucket with duplicity.
Steps
First we need to mount the bucket we wish to back up, using s3fs, so that we can treat it like a local filesystem.
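A minimal sketch of installing s3fs and mounting the source bucket. The bucket name, mount point, and credentials are placeholders; substitute your own.

```shell
# Install s3fs (Ubuntu/Debian package name is "s3fs").
sudo apt-get update && sudo apt-get install -y s3fs

# s3fs reads credentials from a passwd file in ACCESS_KEY:SECRET_KEY form.
# ACCESS_KEY and SECRET_KEY below are placeholders.
echo "ACCESS_KEY:SECRET_KEY" > "${HOME}/.passwd-s3fs"
chmod 600 "${HOME}/.passwd-s3fs"   # s3fs refuses world-readable credentials

# Mount the source bucket ("my-source-bucket" is an assumed name)
# onto a local mount point.
mkdir -p /mnt/source-bucket
s3fs my-source-bucket /mnt/source-bucket -o passwd_file="${HOME}/.passwd-s3fs"
```

After this, the bucket's contents appear under /mnt/source-bucket and can be read like local files.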
Create Backups
Install duplicity if you haven't already. We'll also need to install the "boto" package through pip, which duplicity uses to talk to S3.
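The installation commands, sketched for an Ubuntu/Debian host:

```shell
# Install duplicity and pip, then the boto library that duplicity's
# S3 backend depends on.
sudo apt-get update
sudo apt-get install -y duplicity python3-pip
pip3 install boto
```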
Run the following script to create the backup
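The original script isn't reproduced here; below is a minimal sketch. The bucket names, region, mount point, and credentials are all placeholders.

```shell
#!/bin/bash
# Back up the s3fs-mounted source bucket to a second S3 bucket with
# duplicity. Duplicity's boto backend reads AWS credentials from these
# environment variables; PASSPHRASE is used to encrypt the backup.
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"          # placeholder
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"      # placeholder
export PASSPHRASE="YOUR_ENCRYPTION_PASSPHRASE"      # placeholder

SOURCE_DIR="/mnt/source-bucket"
DEST_URL="s3://s3.eu-west-1.amazonaws.com/my-backup-bucket"

# Duplicity is incremental: after the first full backup, only changes
# since the previous run are uploaded.
duplicity "$SOURCE_DIR" "$DEST_URL"

# Don't leave secrets in the environment.
unset PASSPHRASE AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
```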
Docker Containers
If you are doing this within a Docker container, you need to run the container with `--cap-add SYS_ADMIN --device /dev/fuse`. Alternatively, you can just run with `--privileged`, but privileged mode has significant security implications because it grants the container near host-level access.
If using docker-compose, you can specify the equivalent of `--cap-add SYS_ADMIN --device /dev/fuse` (or privileged mode) in your compose file.
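As a sketch, a docker-compose service with the necessary capability and device (service and image names are assumed) might look like:

```yaml
version: "3"
services:
  backup:
    image: my-backup-image   # placeholder image name
    cap_add:
      - SYS_ADMIN
    devices:
      - /dev/fuse
```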
You also need to set the timezone non-interactively in your Dockerfile, as installing the s3fs package will otherwise trigger an interactive timezone prompt that hangs the image build.
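A hypothetical Dockerfile excerpt showing one way to do this (base image and timezone are assumptions):

```dockerfile
FROM ubuntu:20.04

# Suppress interactive prompts (tzdata's timezone question in particular)
# during package installation.
ENV DEBIAN_FRONTEND=noninteractive
ENV TZ=Europe/London

RUN apt-get update \
 && apt-get install -y s3fs duplicity \
 && rm -rf /var/lib/apt/lists/*
```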
Finally, you will need to add `--allow-source-mismatch` to the backup command, because the Docker container's hostname changes between runs and duplicity will otherwise refuse to continue an existing backup chain. E.g.
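A sketch of the backup invocation with the flag added (mount point and destination bucket are placeholders as before):

```shell
# --allow-source-mismatch tells duplicity to proceed even though the
# hostname differs from the one recorded in the previous backup.
duplicity \
  --allow-source-mismatch \
  /mnt/source-bucket \
  s3://s3.eu-west-1.amazonaws.com/my-backup-bucket
```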
References
- Duplicity encrypted backups to Amazon S3
- Programster - Iterative Remote Backups With Duplicity
- Github - s3fs
- Github - s3fs-fuse issue - fuse: device not found, try 'modprobe fuse' first
First published: 21st August 2020