+1 Great idea! I would definitely be in favor of such a feature. Anything to detect unauthorized access to or tampering with my backups would be welcome. It would also be nice to be able to run a simple low-bandwidth test to have confidence that my backups actually work.
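For what it's worth, here is a rough sketch of the kind of low-bandwidth check I have in mind. It is purely illustrative and assumes tarsnap is on the PATH and already configured with its key file; the manifest path is just an example.

#!/usr/bin/env python3
# Illustrative sketch only: compare the current list of Tarsnap archives
# against a previously saved manifest, so that deleted or unexpected
# archives stand out without downloading any archive data.
# Assumes `tarsnap` is on the PATH and already configured (key file etc.);
# the manifest path below is a made-up example.

import subprocess
import sys
from pathlib import Path

MANIFEST = Path("archives.known")  # example path for the saved archive list


def list_archives():
    # `tarsnap --list-archives` prints one archive name per line and only
    # transfers metadata, so it is comparatively cheap on bandwidth.
    out = subprocess.run(["tarsnap", "--list-archives"],
                         check=True, capture_output=True, text=True).stdout
    return {line for line in out.splitlines() if line}


def main():
    current = list_archives()
    if not MANIFEST.exists():
        # First run: record the current state as the trusted baseline.
        MANIFEST.write_text("\n".join(sorted(current)) + "\n")
        print("Baseline written to", MANIFEST)
        return 0

    known = {line for line in MANIFEST.read_text().splitlines() if line}
    for name in sorted(known - current):
        print("MISSING archive:", name)
    for name in sorted(current - known):
        print("UNEXPECTED archive:", name)
    return 1 if known != current else 0


if __name__ == "__main__":
    sys.exit(main())

A stricter variant could compare per-archive file listings (tarsnap -t -f <archive>) against saved manifests, at the cost of the slow list times discussed earlier in this thread.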
...James

On Thu, Sep 8, 2016 at 10:09 AM, Daniel Neades <[email protected]> wrote:
> Hi Scott,
>
> > On 8 Sep 2016, at 14:40, Scott Wheeler <[email protected]> wrote:
> >
> >> On Sep 8, 2016, at 11:43 AM, Daniel Neades <[email protected]> wrote:
> >>
> >> The slow list and restore times also make testing Tarsnap backups *extremely* painful. For example, we back up dumps (4–5 GB in size) of one of our databases. We can restore a dump via SSH from a remote backup machine located on a different continent in a matter of minutes. Restoring the identical dump from Tarsnap takes hours.
> >>
> >> I am not sure we’d have chosen Tarsnap had we realized how slow these essential and common operations would be.
> >
> > I realize this probably won't help if you're restoring single-file database dumps, but for doing complete (rather than hand-picking single files) restores with a lot of files (about 70k in our case), using multiple tarsnap processes can speed things up dramatically. I wrote a little Ruby tool to do this for us years ago:
> >
> > https://github.com/directededge/redsnapper
> >
> > Again though, if that can be done with a tiny Ruby wrapper, it should be done in the default client. It's the only thing that makes doing complete restores for a catastrophic case of complete data loss almost tenable for us with Tarsnap.
>
> That is helpful for people with lots of files (though not, as you surmised, for us); thank you for mentioning that here. It is a shame that people are having to do these sorts of work-arounds, though – being able to restore reasonably quickly from a backup ought to be a core capability of any backup solution.
>
> --
> Daniel Neades
> Director, Araxis Ltd
> www.araxis.com
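For anyone curious what the multiple-process approach Scott mentions might look like without pulling in redsnapper itself, here is a minimal sketch. It is not redsnapper's actual implementation; the archive name and worker count are made-up examples, and it assumes tarsnap is already configured.

#!/usr/bin/env python3
# Illustrative sketch only (not redsnapper): split one archive's file list
# across several concurrent `tarsnap -x` processes so a restore of many
# files does not run strictly one file at a time.
# Assumes tarsnap is on the PATH and configured; the archive name and
# worker count below are made-up examples.

import subprocess
from concurrent.futures import ThreadPoolExecutor

ARCHIVE = "example-archive"   # example archive name
WORKERS = 8                   # number of concurrent tarsnap processes


def list_files(archive):
    # `tarsnap -t -f <archive>` lists contents one path per line, like tar.
    out = subprocess.run(["tarsnap", "-t", "-f", archive],
                         check=True, capture_output=True, text=True).stdout
    # Skip directory entries so two workers never extract the same subtree.
    return [p for p in out.splitlines() if p and not p.endswith("/")]


def extract(paths):
    # Each worker extracts its own slice of the file list.  Very large
    # slices may need further splitting to stay under the OS argument
    # length limit.
    subprocess.run(["tarsnap", "-x", "-f", ARCHIVE, *paths], check=True)


def main():
    files = list_files(ARCHIVE)
    # Round-robin the paths into roughly equal chunks, one per worker.
    chunks = [files[i::WORKERS] for i in range(WORKERS)]
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        # list() forces completion and re-raises any worker exception.
        list(pool.map(extract, [c for c in chunks if c]))


if __name__ == "__main__":
    main()

How much this helps presumably depends on whether the restore time is dominated by per-request latency or by raw bandwidth, which would be consistent with Scott's note that it pays off for archives with many files rather than for a single large database dump.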
