Hi,

Craig Barratt wrote on 2013-10-06 17:08:36 -0700 [Re: [BackupPC-users] Tar method - deleted files workaround]:
> Chris,
>
> I've never looked into the --listed-incremental option for GNU tar. This
> might do something similar to what you want.
>
> http://www.gnu.org/software/tar/manual/html_node/Incremental-Dumps.html
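For reference, the basic mechanism that manual section describes looks roughly like this (a sketch using a throwaway directory; all paths are made up, and GNU tar is assumed):

```shell
#!/bin/sh
# Sketch of GNU tar incremental dumps with --listed-incremental.
set -e
work=$(mktemp -d)
mkdir "$work/data"
echo one > "$work/data/a"
echo two > "$work/data/b"

# Level-0 (full) dump; tar records per-directory state in the snapshot file.
tar -C "$work" --create --file="$work/full.tar" \
    --listed-incremental="$work/snapshot.snar" data

# Simulate changes on the client: delete one file, add another.
rm "$work/data/b"
echo three > "$work/data/c"

# Level-1 dump against the (now updated in place) snapshot file.
tar -C "$work" --create --file="$work/incr1.tar" \
    --listed-incremental="$work/snapshot.snar" data

# Restore elsewhere: full first, then the incremental. Per the manual,
# extraction uses --listed-incremental=/dev/null; replaying the
# incremental's directory records deletes files that did not exist
# at dump time.
mkdir "$work/restore"
tar -C "$work/restore" --extract --file="$work/full.tar" \
    --listed-incremental=/dev/null
tar -C "$work/restore" --extract --file="$work/incr1.tar" \
    --listed-incremental=/dev/null

ls "$work/restore/data"
```

After the second extract, "a" and "c" should be present and "b" should be gone, since the incremental archive's directory records say it no longer existed.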
from what I read there, surprisingly, tar files seem to be able to contain file deletions (i.e. extracting the archive will *delete* a file in the file system). This would mean it could actually work, at least in theory. On second reading, the documentation somewhat contradicts itself, so it's not really clear whether this is true. *Without* deletions being represented in the *tar file*, the whole exercise is somewhat pointless.

(The contradiction I see: the documentation clearly states that the snapshot file is not needed for restoration, yet "GNU tar attempts to restore the exact state the file system had when the archive was created. In particular, it will delete those files in the file system that did not exist in their directories when the archive was created." Without the snapshot file, it can't do that in general; it could only delete those files that the incremental run had detected as removed since the baseline backup, provided this information is present in the incremental tar file.)

Let's assume (and verify) that deletions are represented, else I can delete what I've already written ;-)

> I also don't know what is required to support it in BackupPC.

The one problem I see is that you have a file with metadata ("snapshot file") in addition to the tar stream. While you *could* just keep that file at the remote end (on the backup client), there would need to be some preprocessing, i.e. copying the file so that incrementals are independent of each other. This would also mean that BackupPC would be keeping part of its state on the client machine, which would be new (and probably undesired). Alternatively, the file could be copied between BackupPC server and client, perhaps in DumpPre/PostUserCmd.

All of this means that the administrator of BackupPC would need to know much more about the backup process and the client machines (where may we put the snapshot file?). Currently, we have default configuration values for tar backups over ssh that should mostly work.
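To make that preprocessing concrete, here is a sketch of how independent incrementals against one baseline might be arranged. The cp step is the part that BackupPC (or a DumpPreUserCmd) would have to perform; all paths are made up:

```shell
#!/bin/sh
# Sketch: independent incrementals against one full backup, by copying
# the baseline snapshot file before each incremental run (tar updates
# the snapshot file in place). Assumes GNU tar.
set -e
work=$(mktemp -d)
mkdir "$work/data"
echo base > "$work/data/a"

# Full backup: creates the baseline snapshot file.
tar -C "$work" --create --file="$work/full.tar" \
    --listed-incremental="$work/base.snar" data

# First incremental: work on a copy, so base.snar stays untouched.
echo change1 > "$work/data/b"
cp "$work/base.snar" "$work/incr1.snar"
tar -C "$work" --create --file="$work/incr1.tar" \
    --listed-incremental="$work/incr1.snar" data

# Second incremental, again relative to the *full* backup, not to incr1.
echo change2 > "$work/data/c"
cp "$work/base.snar" "$work/incr2.snar"
tar -C "$work" --create --file="$work/incr2.tar" \
    --listed-incremental="$work/incr2.snar" data

# incr2.tar should contain everything changed since the full backup
# (both b and c), which is what BackupPC's incrementals expect.
tar --list --file="$work/incr2.tar"
```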
I doubt such simple defaults would remain possible if this were to become the default mode of operation. That doesn't mean it can't be done; it just means part of the process would need to be implemented by the (expert) BackupPC administrator. And for *local* backups (where BackupPC server == client), native support would be possible.

Aside from that, we'd probably need support for file deletions in the BackupPC code. The rsync XferMethod already has that capability, so it shouldn't be too hard, I suppose. Providing this capability should be transparent for anyone not wanting to use --listed-incremental.

Some new variables might also be needed, both in the *Pre/PostUserCmds and, perhaps, in TarClientCmd and/or TarFullArgs/TarIncrArgs - for instance, the number of the baseline backup and the incremental level.

Hmm. How do we *store* the snapshot file(s) in our pool FS? If the UserCmds need to access them, we'd either need some kind of hook, or they could just access $TopDir/pc/... directly (which is sort of ugly).

Is anyone actually interested in experimenting with this option?

Regards,
Holger