At this point I have realized two things: 1) tar takes a list of filenames
as input, not a data stream; 2) BackupPC specifically expects a tar
stream, not just a file stream (with a list of files preceding the data).
To that end I have abandoned the idea of using tar directly and have
resorted to writing my own Perl script that uses Archive::Tar. So far
testing is going well with dummy data, but I don't have it fully working yet.
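For what it's worth, the same idea can be sketched with Python's tarfile module (this is an illustrative stand-in, not the actual Perl script). The key constraint it shows is why tar can't simply accept a data stream: a tar member's size must be written into its header before the data, so the dump has to be buffered first.

```python
import io
import subprocess
import tarfile
import time

def tar_member_from_command(member_name, argv, out):
    """Capture a command's stdout and emit it as one regular-file
    member of a tar stream written to `out` (e.g. sys.stdout.buffer)."""
    # The tar header needs the member size up front, so the dump is
    # buffered in memory (or a temp file) before the header is written.
    data = subprocess.run(argv, check=True, stdout=subprocess.PIPE).stdout
    with tarfile.open(fileobj=out, mode="w|") as tar:  # "w|" = streaming write
        info = tarfile.TarInfo(name=member_name)
        info.size = len(data)
        info.mtime = int(time.time())
        tar.addfile(info, io.BytesIO(data))
```

The `mode="w|"` open is the streaming writer, which never seeks, so the result can go straight down a pipe to BackupPC's tar reader.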
On 25 April 2013 22:15, Adam Goryachev <[email protected]> wrote:
> On 26/04/13 14:39, Les Mikesell wrote:
> > On Thu, Apr 25, 2013 at 6:09 PM, Lord Sporkton <[email protected]> wrote:
> >> I'm aware of Zmanda and several other backup options, however at this
> >> time this is what we have and this is what we are trying to leverage.
> >> Perhaps it will turn out that writing to a flat file is the only
> >> option. But the nature of the BackupPC commands leads me to believe
> >> there is some possibility to link streams. mysqldump streams into a
> >> file and BackupPC "appears" to stream from the local file, across the
> >> tunnel. This leads me to believe (and I could of course be wrong due
> >> to some caveat that I'm simply not aware of yet) that if I could
> >> redirect BackupPC to read from a stream rather than a file, I could
> >> get it to work. tar is of course capable of accepting either a stream
> >> or a file as input, and mysqldump is capable of outputting to either a
> >> stream or a file. I suppose I will just have to play around with it more.
> > I don't think you are going to find a way to get backuppc to collect
> > the output stream directly. However you could use disk space on
> > another machine for the intermediate file copy, using either a pipe
> > over ssh or an nfs mount and let backuppc pick it up from there.
> > I don't think you are going to find a way to get backuppc to collect
> > the output stream directly. However you could use disk space on
> > another machine for the intermediate file copy, using either a pipe
> > over ssh or an nfs mount and let backuppc pick it up from there.
> I think the issue is that backuppc expects tar to not just provide a
> stream of data, but a list of filenames with the data for each file
> included. If you pipe the data into tar, I'm not sure that tar will be
> able to know what the filename is.
>
> I was wondering if you could instruct backuppc to use a custom tar
> command to back up a named pipe, something like this:
>
> mysqlbackup > somepipe
>
> backuppc is instructed to back up somepipe using the tar protocol, and
> tar is instructed to handle the file like a normal file instead of just
> backing it up as a named pipe.
>
> The theory being that backuppc will see it is backing up a file called
> "somepipe", but in reality, the file contents never exist on disk.
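[The blocking/EOF rendezvous this theory relies on can be sketched as follows; the writer here is a hypothetical stand-in for mysqlbackup, and the reader stands in for tar.]

```python
import os
import tempfile
import threading

# Named-pipe timing: the dump side blocks in open() until the backup-side
# reader attaches, and the reader sees EOF exactly when the writer closes.
workdir = tempfile.mkdtemp()
fifo = os.path.join(workdir, "somepipe")
os.mkfifo(fifo)

def fake_mysqlbackup():
    with open(fifo, "wb") as pipe:      # blocks until a reader opens the fifo
        pipe.write(b"-- fake dump --\n")
    # closing the write end is what delivers EOF to the reader

writer = threading.Thread(target=fake_mysqlbackup)
writer.start()

with open(fifo, "rb") as pipe:          # stands in for tar reading "somepipe"
    contents = pipe.read()              # returns when the writer closes (EOF)
writer.join()
```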
>
> The challenge here is that the backup script "mysqlbackup > somepipe"
> will not actually complete (because it can't write to the pipe) until
> the backup is started. Probably you need backuppc to run a pre-backup
> script which triggers the mysqlbackup to start running
> (disconnected/backgrounded so that backuppc will continue), then
> backuppc will read the backup contents and finish. You would need one
> "share" per DB, so 50 DBs means 50 shares. Alternatively, you might be
> able to have a directory of pipes, and the backup script can start the
> 50 backups in parallel (each one backgrounded, but each one doesn't
> really start until the previous one has finished). The challenge is to
> make sure tar stops reading from the pipe at the end of the backup
> output, and also doesn't start reading from the next pipe before the
> backup data starts to be sent there (possibly, depending on when/how
> tar decides it has finished reading the file/pipe).
>
> IMHO, this *might* work, but could also be fairly fragile, and may have
> many unintended side-effects.
>
> Potentially, a second option would be to use mysql replication to keep a
> current copy of all live databases on a 'backup' machine. Then you can
> simply stop the mysql server, back up the raw DB files with backuppc,
> and re-start the mysql server. The mysql server will then catch up from
> its replication partners and continue. This also gives you a possible
> source of more up-to-date backup data if a problem (short of total
> loss) happens on the live DB server.
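[If the replication route is taken, BackupPC's existing pre/post hooks could drive the stop/start. A hypothetical per-host config fragment; the slave hostname, ssh user, and init-script path are assumptions to adjust for your setup:]

```perl
# Stop mysql on the replication slave before the dump, restart it after.
$Conf{DumpPreUserCmd}  = '$sshPath -q -x root@db-slave /etc/init.d/mysql stop';
$Conf{DumpPostUserCmd} = '$sshPath -q -x root@db-slave /etc/init.d/mysql start';
```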
>
> Regards,
> Adam
>
> --
> Adam Goryachev
> Website Managers
> www.websitemanagers.com.au
>
_______________________________________________
BackupPC-users mailing list
[email protected]
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/