Any backup software that supports incremental backups should behave similarly bandwidth-wise. I like Restic. You can even do incremental backups with plain rsync if you want. If your data doesn't change much, you should be okay. For the initial backup run it would help to have physical access to the remote location, so you can bring a full backup there without having to upload it through your slow uplink.
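For example, a minimal nightly restic job might look like the sketch below (the repo address, source path, and password handling are all placeholders, not a recommendation):

```python
#!/usr/bin/env python3
"""Minimal nightly-backup sketch using restic; adjust paths and
credentials for your own setup."""
import os
import subprocess

# Hypothetical remote repository over SFTP; after the first run,
# restic only uploads changed chunks, so nightly jobs stay small.
REPO = "sftp:user@remote-host:/backups/homelab"
SOURCE = "/srv/data"

env = dict(os.environ, RESTIC_REPOSITORY=REPO)
# In practice, read the password from a file or secret store instead.
env["RESTIC_PASSWORD"] = "change-me"

# Incremental backup of SOURCE into REPO.
subprocess.run(["restic", "backup", SOURCE], env=env, check=True)

# Keep the last 7 daily and 4 weekly snapshots, drop the rest.
subprocess.run(
    ["restic", "forget", "--keep-daily", "7", "--keep-weekly", "4", "--prune"],
    env=env, check=True,
)
```

And since a restic repository is just a directory, you can run the first backup into a repo on a portable disk, carry that disk to the remote site, and point later runs at the same repo over SFTP; that should cover the initial-seed problem without the slow upload.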
Remote backups might be rough with that upload speed. For example, you'll be looking at over 2 hours per GiB uploaded.
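For anyone who wants to sanity-check that figure, here's the back-of-the-envelope math (the uplink speed is an assumption, since the exact number isn't quoted here):

```python
# Back-of-the-envelope upload time for 1 GiB; the ~1 Mbit/s uplink
# is an assumed value chosen to match the quoted figure.
gib_bits = 1024**3 * 8               # one GiB in bits
uplink_bps = 1_000_000               # assumed 1 Mbit/s upload speed
hours = gib_bits / uplink_bps / 3600
print(f"{hours:.1f} hours per GiB")  # ~2.4 hours
```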
I personally have a 3-node setup using Kubernetes, and I run Longhorn for volume management. I do hourly snapshots, and then daily backups of all volumes to an additional drive on one of my 3 nodes via a simple NFS server, which is also running in Kubernetes. In Longhorn I keep 2 replicas of every volume as well, so losing one doesn't hurt anything.
I would imagine it would be pretty easy in this case to replace my local NFS target with AWS storage, and then I would have remote backups, but since I back up roughly 100 GiB per day that would be a little time-consuming. At my 50 Mbps that's about 4.5 hours, though remote backups could be done less often, as a last-resort backup.
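Same napkin math for that case, using the 100 GiB and 50 Mbit/s from above:

```python
# Transfer-time math for the daily 100 GiB at 50 Mbit/s.
size_bits = 100 * 1024**3 * 8       # 100 GiB in bits
uplink_bps = 50_000_000             # 50 Mbit/s
hours = size_bits / uplink_bps / 3600
print(f"{hours:.1f} hours")         # ~4.8 hours (about 4.5 in decimal GB)
```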
Yeah, it is pretty rough, although the files don't necessarily change all that much. If I can set up a backup somewhere, prepopulate it with my data as it stands now, and then keep it updated incrementally with nightly jobs, I'm hoping it'll mostly be done by the morning.
My backup backup plan would be to buy a couple of high-capacity solid-state drives and either take them myself or mail them to my parents once a week. The mailman has pretty high bandwidth, even if the latency is rather rough.
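There's real truth to that; the classic "never underestimate the bandwidth of a station wagon full of tapes" math works out something like this (drive capacity and shipping time are assumptions for illustration):

```python
# Effective bandwidth of mailing a drive; capacity and transit time
# are assumed values.
capacity_bits = 4 * 10**12 * 8      # a 4 TB drive
transit_s = 2 * 24 * 3600           # two days in the post
mbps = capacity_bits / transit_s / 1e6
print(f"{mbps:.0f} Mbit/s average") # ~185 Mbit/s
```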
Not my solution, but I liked the idea and am thinking of using it too: copy backups onto an external HDD and put it in your car trunk. Maybe keep two drives in rotation.
It eliminates the need to drive somewhere for rotation, and the cost of renting a safe deposit box.
It doesn't protect against a serious disaster like a forest fire, earthquake, or nuclear war, but I keep the most important data in the cloud, and if my house and car both burn I'll have bigger problems than worrying about some homelab snapshots.
Actually not a bad idea. I live in a flat, so my car is parked in a car park about 200 m away from my property. If my entire town goes up in smoke, I imagine losing data would be the least of my problems.