I just started getting into self-hosting with Docker Compose and I'm wondering about possible backup solutions.
So far I only have to save my Docker configs, but I want to back up host files as well.
What software and hardware are you using for backup?
That looks cool, and they've got some other nifty-looking things like https://www.pikapods.com/. Any idea how stable the company is? I particularly like rsync.net because it's pretty unlikely to just disappear someday.
Seconding this. On my unRAID host, I run a Docker container called "Vorta" that uses Borg as its backend to back up to my Synology NAS over NFS (rough sketch below). Then on my Syno, I run two backup jobs using HyperBackup: one goes to my cousin's NAS over a site-to-site OpenVPN connection between our edge devices (Ubiquiti UniFi Security Gateway Pro <-> UDM Pro), the other goes to Backblaze B2 cloud storage.
OP, let me know if you need any assistance setting something like this up. Gotta share the knowledge over here on Lemmy that we’re still used to searching evil Reddit for.
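In case it helps, here's a minimal sketch of what the Borg side of a setup like this can look like. The mount point, repo path and source folder are placeholders, not the actual setup:

# One-time: create an encrypted Borg repo on the NFS-mounted Synology share
# (assumes the share is already mounted at /mnt/syno-backup)
borg init --encryption=repokey /mnt/syno-backup/borg-repo

# Nightly: snapshot appdata into a dated archive, then thin out old ones
borg create --stats --compression lz4 \
  /mnt/syno-backup/borg-repo::'{hostname}-{now:%Y-%m-%d}' \
  /mnt/user/appdata
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/syno-backup/borg-repo

Vorta is basically a GUI and scheduler on top of commands like these.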
Local backup to my Synology NAS every night which is then replicated to another NAS at my folks house through a secure VPN tunnel. Pretty simple and easy to deploy.
My 20 TB of storage is currently hosted by Hetzner on an SMB share with an accompanying server.
The storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup for $7/month with unlimited storage, mounting the SMB share as a "physical drive" using Dokan, because Backblaze Personal doesn't allow backing up network shares.
If your storage is local, you can use the Windows backup agent in a Docker container.
For app data, Borg as backup/restore software. Backup data is then stored on Hetzner as an offsite backup - super easy and cheap to set up. Also add healthchecks.io to get notified if a backup fails (sketch below).
Edit:
Back up docker compose files and other scripts (without API keys!!!) with git to GitHub.
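The healthchecks.io piece is just a ping URL that you hit when the backup finishes successfully; something like this, with the repo URL and check UUID as placeholders:

# The && means the ping only fires if borg exits cleanly, so healthchecks.io
# alerts you when the job fails or silently stops running.
borg create --stats ssh://user@backup-host/./borg-repo::'{hostname}-{now}' /srv/appdata \
  && curl -fsS -m 10 --retry 5 https://hc-ping.com/your-check-uuid > /dev/null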
I doubt you're using NixOS, so this config might seem useless, but at its core it is a simple systemd timer/service plus some bash scripting.
To convert this to another OS, use cron to call the script at the time you want (a rough cron equivalent is sketched after the config). Copy the part between script = '' and '' and then change out variables like the location of docker-compose, since it's different on NixOS.
Let me explain the script. We start out by defining the backupDate variable; this will be the name of the zip file. As of now that variable would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take it down. You could also replace down with stop if you don't plan on updating each night like I do. I use rclone to connect to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly, I use rclone to delete anything older than 7 days in the backup folder on Dropbox. If you end up going my route and get stuck, let me know and I can help out. Good luck.
systemd = {
  timers.docker-backup = {
    wantedBy = [ "timers.target" ];
    partOf = [ "docker-backup.service" ];
    timerConfig.OnCalendar = "*-*-* 3:30:00";
  };
  services.docker-backup = {
    serviceConfig.Type = "oneshot";
    serviceConfig.User = "root";
    script = ''
      backupDate=$(date +'%F')
      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose down
      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose down
      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose down
      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose down
      cd /docker/backups/
      ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps
      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d
      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d
      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d
      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d
      cd /docker/backups/
      ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
      rm server-backup-$backupDate.zip
      ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
    '';
  };
};
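For reference, a rough cron equivalent on a typical distro could look like this (only two of the apps shown, using the same paths as the config above):

# /etc/cron.d/docker-backup - run the script at 03:30 every night
30 3 * * * root /usr/local/bin/docker-backup.sh

#!/usr/bin/env bash
# /usr/local/bin/docker-backup.sh
set -euo pipefail
backupDate=$(date +'%F')

cd /docker/apps/rss && docker-compose down
cd /docker/apps/paaster && docker-compose down

cd /docker/backups/
zip -r "server-backup-$backupDate.zip" /docker/apps

cd /docker/apps/rss && docker-compose pull && docker-compose up -d
cd /docker/apps/paaster && docker-compose pull && docker-compose up -d

cd /docker/backups/
rclone copy "server-backup-$backupDate.zip" Dropbox:Server-Backup/
rm "server-backup-$backupDate.zip"
rclone delete --min-age 7d Dropbox:Server-Backup/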
I use restic (and Déjà Dup just to be safe), backing up to multiple cloud storage targets, among them borgbase.com, Backblaze B2 and Microsoft's cloud.
Desktop: I was using Duplicati for years, but I've recently switched to restic going directly to B2. I'm using a PowerShell script to run it.
Server: I'm also using restic to B2.
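For anyone curious, the restic-to-B2 part really is just a few commands; the bucket, paths and credentials below are placeholders:

#!/bin/sh
# Backblaze B2 credentials and repository (placeholders)
export B2_ACCOUNT_ID="your-key-id"
export B2_ACCOUNT_KEY="your-application-key"
export RESTIC_REPOSITORY="b2:my-backup-bucket:server"
export RESTIC_PASSWORD="your-repo-password"

restic init              # only once, to create the repository
restic backup /srv /etc  # the actual backup run
restic check             # verify the repository now and then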
I also have a QNAP NAS. I'm synchronizing my replaceable data to a crappy old Seagate NAS locally. The irreplaceable data goes to B2 using the QNAP backup client.
I rsync my data once a day to another drive via script. If I accidentally delete files, I can easily copy them back. Then once a day, rclone makes an encrypted backup to a Hetzner storage box.
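The whole thing fits in a few lines of script. Paths and the remote name below are placeholders, and "hetzner-crypt" is assumed to be an rclone crypt remote already set up via rclone config:

#!/bin/sh
# Daily mirror to the second drive; deleted files stay recoverable until the next run
rsync -a --delete /srv/data/ /mnt/backup-drive/data/

# Encrypted offsite copy to the Hetzner storage box through the crypt remote
rclone sync /srv/data hetzner-crypt:backup/data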
Backblaze B2. Any software that is S3 compatible can use B2 as the target and it’s reasonably priced for the service. I backup all the PCs and services to a Synology NAS and then backup that to B2 (everything except my Plex media, that would be pricy and it’s easy enough to re-rip from disc if needed).
Rsync everything besides media to a Storj free account. I also rsync my most important data (docker compose files, config files, Home Assistant, a few small databases) to Google Drive.
On Proxmox, I use the built-in backup system plus storing it to my Synology NAS (RS1221+). I use Active Backup for Business (file sync) to back up the Proxmox config files, and also back up the husband's PC and my work PC.
I run a second Unraid server with a couple of backup-related applications, as well as Duplicati. I have my main server network-mounted and run scheduled jobs that copy data from the main pool to the backup pool, as well as to Backblaze. It's nice having the on-site backup as well as the cloud-based one.
I occasionally burn to 100 GB Blu-rays as well for a physical backup.
So far I have had a good experience with Kopia. But it is definitely less battle-tested than the other alternatives, and I don't use it for anything too critical yet.
I use Kup to back up my important PC files (the basic pre-installed backup software on KDE neon). It backs up to a separate drive in my PC, which gets synced to my Nextcloud instance on my local server, and that - along with all the other data for the containers running on it - gets backed up by Kopia to DigitalOcean Spaces.
I can't recommend Kopia strongly enough: you have fine control over what gets backed up, when it gets backed up, how many snapshots to keep, etc., it keeps versioned snapshots without the backup size ballooning, and it compresses and encrypts everything. I also have it run a script before and after each backup that stops and starts the containers, so nothing is writing to the files while they're being backed up (sketch below). And it's also available as a Docker container, so it can slot straight into your current compose setup.
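The stop/start part is done through Kopia's snapshot actions. Roughly like this, assuming actions are enabled on the repository (--enable-actions) and the compose project lives at /srv/nextcloud, which is just a placeholder:

# Run stop/start scripts around every snapshot of this path
kopia policy set /srv/nextcloud \
  --before-snapshot-root-action /srv/scripts/stop-containers.sh \
  --after-snapshot-root-action /srv/scripts/start-containers.sh

# stop-containers.sh:   cd /srv/nextcloud && docker compose stop
# start-containers.sh:  cd /srv/nextcloud && docker compose start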
For my workstation I'm using a small script that packs and compresses all relevant directories with tar once a week. The resulting file is then copied to a local backup drive and to my NAS. An encrypted version of that file is sent to an offsite VPS.
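For what it's worth, the shape of that script is roughly the following; directory names, the NAS target and the VPS host are placeholders:

#!/bin/sh
stamp=$(date +'%F')
archive="/tmp/workstation-$stamp.tar.gz"

# Pack and compress the relevant directories
tar czf "$archive" /home/me/documents /home/me/projects /etc

# Copy to the local backup drive and to the NAS
cp "$archive" /mnt/backup-drive/
rsync -a "$archive" nas:/volume1/backups/

# Encrypt a copy (prompts for a passphrase unless scripted) and push it offsite
gpg --symmetric --cipher-algo AES256 --output "$archive.gpg" "$archive"
scp "$archive.gpg" vps:/backups/
rm "$archive" "$archive.gpg"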
For my self-hosted services (on Proxmox) I'm using Proxmox Backup Server.
I don't know if it's a smart solution, but I have an HDD in my server that is used just for backups. Each night, rsync automatically syncs the stuff I want to back up from multiple locations onto that drive. After that is done, Kopia backs it up to B2 with compression, deduplication and encryption. I use healthchecks.io as well to alert me if any of the steps fails to complete (though none of the steps block each other).
ZFS send to a pair of mirrored HDDs on the same machine every hour, and a daily restic backup to S3 storage. Every six months I test and verify the cloud backup.
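For anyone who hasn't done the ZFS half before, an hourly incremental send between two pools boils down to something like this; pool and dataset names are placeholders, and it assumes the previous snapshot still exists on both sides:

#!/bin/sh
# Hourly: snapshot the dataset and send only the delta to the backup pool
prev=$(zfs list -H -t snapshot -o name -s creation -d 1 tank/data | tail -1)
now="tank/data@$(date +'%F-%H%M')"
zfs snapshot "$now"
zfs send -i "$prev" "$now" | zfs receive -F backup/data

# Daily: restic to S3 (credentials via the usual AWS env vars), e.g.
# restic -r s3:s3.amazonaws.com/my-backup-bucket backup /tank/data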
It runs Active Backup for Business and Active Backup for Google Workspace, as well as an AFP share for the Apple machines. That covers about 95% of all backups. Those backup archive files are then ALSO backed up to one of two large 14 TB HDDs. I swap them out monthly (or thereabouts) and keep the spare at my office, or in my fire safe when it's at home.
I have a couple of other things out there too. A small SSH box handles some scripting of config file backups, etc. My main Synology 1815+ also has a Cloud Sync up to Backblaze that happens in real time but only keeps one copy of everything, as well as a HyperBackup job for the super important stuff, also up to Backblaze, in addition to the nightly backups to the 1513+. This way, if my house burns down, I still have something (and likely a full copy on the 14 TB HDD).
I've been using Restic for a while, and it's backing up to a Hetzner storage box (1TB).
Restic supports encryption, compression and deduplication, and can forget old backups on a spread-out schedule (configurable; e.g. keep one yearly, three monthly and seven daily snapshots - see the example below).
On top of this I also use healthchecks.io to make sure all backups are working.
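The retention policy maps directly onto restic's forget flags, e.g.:

# Keep one yearly, three monthly and seven daily snapshots; drop and prune the rest
restic forget --keep-yearly 1 --keep-monthly 3 --keep-daily 7 --prune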
Custom rsync script. I connect two different hard disks (one natively, one remotely via SSH) to back up the disk.
Once per month, I unplug the microSD from my Raspberry Pi 4 server and make a full image of the card, so if it fails I can restore it to a new SD card.
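If it helps, the full-card image can be taken with plain dd from any Linux box. The device name below is a guess, so check it with lsblk first:

# Read the whole card into a dated, compressed image file
sudo dd if=/dev/mmcblk0 bs=4M status=progress | gzip > pi-backup-$(date +'%F').img.gz

# Restore onto a new card (same caveat about the device name):
# gunzip -c pi-backup-YYYY-MM-DD.img.gz | sudo dd of=/dev/mmcblk0 bs=4M status=progress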