Create and update duplicate archives #199
Some observations from another issue:
Concept: With the following difference between Src archive and Dest...
...on Dest do something like:
Follow up with the usual raw sync of the archive dir using ...
I think a Python script using the above idea, showing how to make ...
Performing independent Wyng backups to the additional "copy" archive may be the most efficient general workaround currently. For this to work correctly, the additional archive cannot be a simple clone; the two archives must have different UUIDs so that snapshots are managed right. (Otherwise, Wyng will repeatedly discard the local snapshots and send the volumes in slower "full scan" mode.) If you already have a cloned copy and want to make its UUID unique, run ...

However, if your server and offsite backup both use Btrfs or ZFS, then you could efficiently ...

Notes on the external updater approach:
Updated PoC script. It's closer to a usable solution and just needs SSH URL parsing (and sending the commands over ...).
@t-gh-ctrl I created a gist of an updater script that I tested today: https://gist.github.com/tasket/08f38279d8702c7defcb62cb4afdae7a Feel free to suggest changes, including different rsync options.
Awesome! :) It might take a bit of time until I report back because I won't have access to my remote backup server for the next month; I'll have to do some tests with a test remote host when time permits.
The script still doesn't account for a timezone change shifting the session name (local time) in reverse (such as when backups were done just before traveling, then again from a laptop that has just flown west). But I'm adding a small bit of plaintext info to archives that will show the correct session order; the script can be updated to take advantage of that.
Update: The 08wip branch now saves a simple in-order json list of sessions in the archive dir. These "sessions" files can be used by a script to avoid the chunkfile timezone misplacement issue.
Add an archive duplication feature to create backups of archives in another location.
Problem
Although commands like `rsync` may be considered sufficient for making duplicates intact, there are a couple of drawbacks: with `rsync`, `cp`, etc., extra care must be taken to avoid incomplete transfers resulting in a corrupt copy.
Solution
A Wyng duplication function could make and refresh duplicate archives using the same safety patterns (data first, metadata last) employed when creating original archives. It could rely on its knowledge of archive metadata, avoiding costly dir and data scanning. It would also be possible to add some level of selectivity (per volume, etc.) at some point.
The main task is to open the source and destination (copy) archives in tandem and then:
It could be of further help if the duplicate archive were marked as having a special status, and possibly given a different UUID; this would avoid the temptation to back up to two or more copies absentmindedly, with users thinking they are the same archive.
External duplication/updater scripts:
A simple `rsync`-based script can duplicate archives and also update them.

Also see a working example of a more efficient script that is based on rsync but performs merging of pruned session dirs ahead of time to avoid unnecessary data transfers by rsync:
https://gist.github.com/tasket/08f38279d8702c7defcb62cb4afdae7a
Notes
Related
#140
#175
#184