Duplicati is one of my favourite backup systems. It’s pretty fast, supports numerous backup sources, and has a nice configuration web UI. Unfortunately, however, it can’t be used to back up remote files. In fact, I can’t find a nice, fully-featured backup solution which does do this, which sucks.
Another great tool is rclone, which lets you list, download, upload and modify remote files. Because of this, you can use rclone as a naive backup system, but it’s not quite as powerful as Duplicati.
One of Rclone’s most powerful features is the ability to mount remotes as filesystems on your local machine. This means you can access files as if they were on your machine, but without needing to download them.
Wouldn’t it be great if you could combine the remote mounting features of rclone with the backup system of Duplicati?
My solution? Do just that! Have rclone mount the remotes I need, and point Duplicati to those as sources, for it to back up elsewhere. To make this simpler, especially in a dockerized world, I created a container to handle this for you:
docker-rclone-mount will mount rclone remotes based on a configuration on your host, which can then be passed into the Duplicati container for it to back up from.
First, create a docker compose entry for docker-rclone-mount. Putting it in the same compose file as Duplicati makes life easier.
```yaml
rclone:
  image: theorangeone/rclone-mount:latest
  cap_add:
    - SYS_ADMIN
  security_opt:
    - apparmor:unconfined
  devices:
    - "/dev/fuse:/dev/fuse"
  environment:
    - PUID=1000
    - PGID=1000
  volumes:
    - "./rclone.conf:/config/rclone.conf:ro"
    - "./rclone-mounts.txt:/config/config.txt:ro"
    - "./mounts:/mnt:shared"
```
The `SYS_ADMIN` capability and `apparmor:unconfined` security option are both required to allow the docker container to mount filesystems.
Then, mount the `mounts` directory into your Duplicati container:
```yaml
volumes:
  ...
  - "./mounts:/source/mounts:shared"
```
Note the use of `:shared` on the end of both mounts. This is important, as it allows docker to pass through the FUSE-mounted filesystems correctly. Removing this from either side will prevent the filesystems being exposed correctly.
The next step is to set up your rclone remote, which is best done through the rclone CLI. I recommend installing rclone and configuring your remotes locally, then copying the config over, as it lets you ensure everything works correctly without having to jump around docker.
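As an illustration, a finished `rclone.conf` entry for an SFTP remote might look something like this (the remote name `myserver` and the connection details are placeholders — generate your own with `rclone config`):

```ini
# Example rclone.conf entry, created with `rclone config`.
# "myserver" and all values below are placeholders.
[myserver]
type = sftp
host = files.example.com
user = backup
key_file = /config/id_rsa
```

Whatever remotes you configure here become available to the container once the file is mounted at `/config/rclone.conf`.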
The final step is to tell docker-rclone-mount to mount your remote. This is done using the config file at `/config/config.txt`. Each line of the file contains an rclone remote and a destination mount inside the container, relative to `/mnt`, separated by a space.
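To make the format concrete, a `config.txt` might look something like this (assuming a remote named `myserver` exists in your `rclone.conf`; both the remote and the destination are placeholders):

```
myserver: data
```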
For example, an entry with a destination of `data` would mount at `/mnt/data`, and therefore `./mounts/data` on our host.
Now start the containers, and you should be set!
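Bringing it all up is just a normal compose run. A quick sketch (assuming the services are named `rclone` and `duplicati` as above):

```shell
# Start the rclone mount container and Duplicati.
docker compose up -d rclone duplicati

# The remote's files should now appear under the host mount directory.
ls ./mounts/data
```

If `./mounts/data` is empty, check the rclone container's logs before digging into Duplicati.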
# Does it work?
Yes, yes it does! Duplicati can back up remotes now, which is great! Unfortunately, backups are now very network-intensive, which means they can be a lot slower than just reading off the local filesystem, but that’s mostly fine. For that reason, I wouldn’t recommend this for huge datasets. If you’re running Duplicati on a fast network connection, it’s probably fine, but it’s best to test before relying on it!
I’m running it right now for some backups; take a look at my setup.
# “But what about other backup tools, like Restic?”
Restic is a new(ish) backup tool which can use rclone for its remote access. Unfortunately, it doesn’t support using a remote as a source for backups.
You could totally use this with restic, or any other backup solution, as there’s nothing Duplicati-specific about it.
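For instance, a rough sketch with restic (assuming restic is installed and the rclone mount from above lives at `./mounts/data`; the repository path is a placeholder) — since the remote appears as a local directory, restic can back it up like any other path:

```shell
# One-off: initialise a restic repository to back up into.
restic init --repo /srv/restic-repo

# Back up the rclone-mounted remote as if it were local files.
restic --repo /srv/restic-repo backup ./mounts/data
```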