Dan,
> Alternately, just break your backup set into two (or more) pieces.
Ah, I haven't tried that yet... There are a couple of reasons why:
-- even though I'm backing up only one machine, I have one "client" per
backed-up dir ("share"?; I'm using rsync), because I want to create a
single LVM snapshot at a time with $Conf{DumpPreUserCmd} (I don't want
more than one snapshot on the client simultaneously, at least for now,
for performance and space reasons).
-- subdirectories below each backed-up dir can change, so I can't use a
static list, lest I be forced to maintain it (I'm lazy, I know).
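For the record, the snapshot hooks I mean in the first point are along
these lines (the script names here are placeholders, not my actual
scripts; $sshPath and $host are, as I understand it, substituted by
BackupPC before the command runs):

```perl
# Hypothetical per-client snapshot hooks: one LVM snapshot is created
# before the dump and removed afterwards. The make/remove script names
# are invented for illustration.
$Conf{DumpPreUserCmd}  = '$sshPath -q -x root@$host /usr/local/bin/make_home_snapshot';
$Conf{DumpPostUserCmd} = '$sshPath -q -x root@$host /usr/local/bin/remove_home_snapshot';
```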
But now you've got me thinking... there might be ways to do it. For
example, I know that in the home partition there is one user who takes as
much space as everybody else combined, so that would be a good candidate
for breaking the backup set into two nearly equal-size pieces.
This is what I have currently:
$Conf{RsyncShareName} = '/home_snapshot';
Would something like this work?
$Conf{RsyncShareName} = '/home_snapshot';
$Conf{BackupFilesOnly} = {
    '/home_snapshot' => [ '/<large_user_home_dir>' ],
    '/home_snapshot' => [ some grep-like operation that lists all home
                          directories, but excludes /<large_user_home_dir> ],
};
Because the config files are Perl scripts, the second list can be generated
dynamically: I'd have to ssh to the client, run ls /home_snapshot, and
filter <large_user...> out of that directory listing... Looks doable...
Argh! That wouldn't fly, because two hash entries with the same key mean
the second one overwrites the first. Hmm, but something along those
lines... Is it possible to have arrays of arrays? Something like:
$Conf{BackupFilesOnly} = {
    '/home_snapshot' => [
        [ '/<large_user_home_dir>' ],
        [ some grep-like operation that lists all home
          directories, but excludes /<large_user_home_dir> ]
    ],
};
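As a pure-Perl aside (nothing BackupPC-specific), both points -- the
duplicate-key overwrite and the arrays-of-arrays question -- are easy to
check in a throwaway snippet; the paths here are just dummies:

```perl
use strict;
use warnings;

# Duplicate keys in a hash constructor: the second entry silently
# replaces the first.
my %only = (
    '/home_snapshot' => [ '/big_user' ],
    '/home_snapshot' => [ '/everyone_else' ],
);
print scalar(keys %only), "\n";          # 1 -- only one key survives
print $only{'/home_snapshot'}[0], "\n";  # /everyone_else -- first entry lost

# Arrays of arrays are perfectly legal Perl data structures...
my $aoa = [ [ '/big_user' ], [ '/everyone_else' ] ];
print scalar(@$aoa), "\n";               # 2 inner lists
# ...but whether BackupPC would *interpret* a nested list inside
# BackupFilesOnly that way is a separate question.
```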
In which case I could just as well do this:
$Conf{RsyncShareName} = [
    [ '/home_snapshot/<large_user_home_dir>' ],
    [ /home_snapshot prefixed to each element of the filtered list of
      home dirs ]
];
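To make the "filtered list" part concrete, the dynamic version might be
sketched like this -- with the caveats that 'client' and 'big_user' are
placeholder names, passwordless ssh from the server is assumed, and I'm
assuming RsyncShareName wants a single flat list of share paths rather
than nested arrays:

```perl
# Hypothetical sketch: build the share list at config-load time by
# listing the client's home dirs over ssh and peeling off the big user.
my @homes = split /\n/, `ssh client ls /home_snapshot`;
my @rest  = grep { $_ ne 'big_user' } @homes;    # everyone but the big user
$Conf{RsyncShareName} = [
    '/home_snapshot/big_user',                   # the big user by himself
    map { "/home_snapshot/$_" } @rest,           # everybody else
];
```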
I bet there is a much simpler way of doing this and I just have the
blinders on.
> cat5e should work fine for gigabit (it is the specified cable for
> 1000baseT). You don't need cat6.
Thanks for correcting me on cat 5e vs. cat 6. That's good news for me.
Bernardo Rechea
From: Dan Pritts <[EMAIL PROTECTED]>
Date: 01/24/2006 03:03 PM
To: [EMAIL PROTECTED]
cc: [email protected]
Subject: Re: [BackupPC-users] New user, few questions
On Tue, Jan 24, 2006 at 02:01:51PM -0500, [EMAIL PROTECTED] wrote:
> As you can see, very long backup times. I'm looking into ways to reduce
> them. I think I'm network bound for the fulls, and memory bound for the
> incrementals (the server uses all the 2 GB of RAM, plus another ~1.5 GB
> of swap). Swapping is obviously bad, so my first inclination would be to
> purchase 2 more GB of RAM for the server.
Alternately, just break your backup set into two (or more) pieces.
> As for network, ours is 100 Mb/s, with cat 5e cabling, but I might try
> connecting the Gbit ports of server and client, point to point (our Cisco
> switch doesn't support Gbit). While this won't give full Gbit speed (to
> guarantee that I'd need cat 6 cable),
cat5e should work fine for gigabit (it is the specified cable for
1000baseT). You don't need cat6.
danno
--
dan pritts - systems administrator - internet2
734/352-4953 office 734/834-7224 mobile
-------------------------------------------------------
_______________________________________________
BackupPC-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/