[BackupPC-users] IPv6 support

2011-03-15 Thread Vinz S
Hello,

I'm using backuppc 3.1.0-9 on a Debian Squeeze server and I have some
trouble with IPv6.
When I define an AAAA record for a backup client it doesn't work; I get this
message: "Can't find host test via netbios".
When I add an A record for this client, everything works.
Can you tell me whether IPv6 support in BackupPC already works in the
version I'm using, or whether it's on the roadmap for future versions.

Regards,
--
Colocation vs. Managed Hosting
A question and answer guide to determining the best fit
for your organization - today and in the future.
http://p.sf.net/sfu/internap-sfd2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] IPv6 support

2011-03-15 Thread Timothy J Massey
Vinz S svin...@gmail.com wrote on 03/15/2011 05:13:19 AM:

 I'm using backuppc 3.1.0-9 on a Debian Squeeze server and I have 
 some trouble with IPv6.
 When I define an AAAA record for a backup client it doesn't work, I 
 have this message: "Can't find host test via netbios".
 When I add an A record for this client everything is working.
 Can you tell me if IPv6 support in BackupPC is already ok for the 
 version I'm using or if it's on the roadmap of future versions.

BackupPC does no communication on its own.  At its simplest (and in 
the best spirit of UNIX), it is a daemon that coordinates lots of other 
utilities.

What that means is that the question is not "Is BackupPC IPv6 ready?" 
but "Are all of the tools BackupPC uses IPv6 ready?"  As you can imagine, 
that's a *much* more complex question!

For example, the item that you're asking about ("Can't find host via 
netbios") is handled by Samba.  The Samba utility is the one trying to 
resolve the host, and it's not looking for AAAA records.  So the place to 
look would be within the Samba project.

Of course, you can substitute nearly *any* program (via the BackupPC 
config.pl file) for the ones that BackupPC is using, so you might be able 
to substitute an IPv6-ready nslookup (or dig or whatever) command in its 
place.
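As a concrete sketch of that substitution idea: the host-checking commands are plain config.pl settings, so you could point them at IPv6-capable tools. The option names below are from BackupPC 3.x; verify them against your installed config.pl before relying on this.

```perl
# Illustrative config.pl overrides -- a sketch, not a tested recipe:
$Conf{PingPath} = '/bin/ping6';            # an IPv6-capable ping binary
$Conf{PingCmd}  = '$pingPath -c 1 $host';  # command itself is unchanged
```

Whether the rest of the dump path then copes with an IPv6 address is a separate question.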

Timothy J. Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 


[BackupPC-users] Configuration parameters for ubuntu client

2011-03-15 Thread John BORIS
I am trying to modify the settings for backing up an Ubuntu client using
tar. Does anyone on the list have a pointer to a document that shows the
correct way to include the sudo command in the proper spot? I am
assuming one would have to use ssh keys for this to work. 

TIA


John J. Boris, Sr.
JEN-A-SyS Administrator
Archdiocese of Philadelphia
Remember! That light at the end of the tunnel
Just might be the headlight of an oncoming train!



Re: [BackupPC-users] Configuration parameters for ubuntu client

2011-03-15 Thread Papp Tamas

On 03/15/2011 07:09 PM, John BORIS wrote:
 I am trying to modify the settings for backing up a Ubuntu client using
 tar. Does anyone on the list have a pointer to a document that shows the
 correct way to include the sudo command in the proper spot. I am
 assuming one would have to use ssh keys for this to work.


hi!

This is for rsync:

$sshPath -q -x -l backupuser $host /usr/bin/sudo $rsyncPath $argList+

Modify as you need it.
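For context, Tamas's line is the value of a config.pl setting; a fuller sketch might look like the following. The setting name is from BackupPC 3.x, and 'backupuser' is an example account you would create on the client.

```perl
# config.pl (or the per-host config) -- illustrative values only:
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncClientCmd} = '$sshPath -q -x -l backupuser $host'
                      . ' /usr/bin/sudo $rsyncPath $argList+';
```

On the client you would then put the backuppc server's public SSH key in backupuser's ~/.ssh/authorized_keys and allow passwordless sudo for rsync only, e.g. a sudoers line such as `backupuser ALL=NOPASSWD: /usr/bin/rsync`.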

tamas



Re: [BackupPC-users] Configuration parameters for ubuntu client

2011-03-15 Thread Papp Tamas

On 03/15/2011 08:01 PM, John BORIS wrote:
 tamas,
 backuppcuser is that the user on the backuppc server or the user on the
 client?

Of course the user on the client.

tamas



[BackupPC-users] Backups tooo slow over WAN - how to split them

2011-03-15 Thread David Herring
I'm trying to back up Windows servers with approximately 3 partitions of 100G
each over a WAN link. This takes days to run and never successfully completes.
I'm using rsyncd on the Windows machines, and the backup is to an Ubuntu
server.

So, how do you split the backups from a single machine - do you have to
create multiple host entries for each machine? Does this make restore too
painful?

Is there anything else I should be doing? MTU size?

Any help gratefully received,
Dave

-- 
David Herring


Re: [BackupPC-users] IPv6 support

2011-03-15 Thread Tod Detre
Actually, that's not quite the whole picture. BackupPC does do DNS
name lookups, and those calls are not IPv6 compatible.  One such
instance is in bin/BackupPC_dump line 503. The gethostbyname()
function (at least last time I tried to do IPv6 with BackupPC) does
not support IPv6. This causes BackupPC to fall back to an nmblookup. If
your client doesn't have an IPv4 address and does not respond to
nmblookup, BackupPC will fail. However, if you do not use nmblookup,
you can just get rid of lines 496-508,510 and BackupPC will work just
fine.

--Tod



[BackupPC-users] Weekly full with no incremental configuration leads to daily full backups

2011-03-15 Thread Matthias Meyer
Hi,

I am trying to configure a client to make only weekly full backups and no 
incrementals:

$Conf{FullAgeMax} = '732';
$Conf{FullKeepCnt} = [
  '4',
  '0',
  '13'
];
$Conf{FullKeepCntMin} = '15';
$Conf{IncrAgeMax} = '0';
$Conf{IncrKeepCnt} = '0';
$Conf{IncrLevels} = [
  '0'
];

But the above configuration leads to daily full backups.

Anyone have a configuration like mine?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Weekly full with no incremental configuration leads to daily full backups

2011-03-15 Thread Les Mikesell
On 3/15/2011 3:17 PM, Jeffrey J. Kosowsky wrote:
 Matthias Meyer wrote at about 21:00:19 +0100 on Tuesday, March 15, 2011:
 Hi,
   
 I try to configure a client to make only weekly full backups but no
 incrementals:
   
 $Conf{FullAgeMax} = '732';
 $Conf{FullKeepCnt} = [
   '4',
   '0',
   '13'
 ];
 $Conf{FullKeepCntMin} = '15';
 $Conf{IncrAgeMax} = '0';
 $Conf{IncrKeepCnt} = '0';
 $Conf{IncrLevels} = [
   '0'
 ];
   
 But the above configuration leads to daily full backups.
   
 Anyone have a configuration like mine?

 I think you need to do something like:
 $Conf{FullPeriod} = 6.97;

And set $Conf{IncrPeriod} slightly higher.  I believe the runs are 
scheduled at IncrPeriod intervals and a full is done if FullPeriod has 
elapsed since the last run.
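Put together, a minimal weekly-full-only sketch along these lines (values are illustrative, not tested against Matthias's exact setup) would be:

```perl
$Conf{FullPeriod}  = 6.97;  # a full becomes due just under every 7 days
$Conf{IncrPeriod}  = 7.2;   # slightly higher, so an incremental never comes due first
$Conf{IncrKeepCnt} = 0;     # keep no incrementals
```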

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [BackupPC-users] Backups tooo slow over WAN - how to split them

2011-03-15 Thread Bowie Bailey
On 3/15/2011 3:49 PM, David Herring wrote:

 I'm trying to backup windows servers with approx 3 partitions of 100G
 each over a WAN link. This takes 'days' to run and never successfully
 completes. I'm using rsyncd on the windows machines - and the backup
 is to a ubuntu server.

 So, how do you split the backups from a single machine - do you have
 to create multiple host entries for each machine ? Does this make
 restore too painful ?

 Is there anything else I should be doing ? MTU size ?

You can split the backups by directories.  Just create separate backup
hosts that each back up some of the directories.  Give each host a slightly
different name and use the ClientNameAlias setting to point them back
to the proper host name.

Keep in mind that this may result in the separate backups running
simultaneously.  To prevent this you can use the DumpPreUserCmd and
DumpPostUserCmd to set and check for lock files.
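As a sketch of that setup (host names, paths, and the lock script are examples, not a tested recipe), each split host's per-host config might look like:

```perl
# Config for a hypothetical host entry 'winsrv-c', one of several entries
# that all point at the real machine 'winsrv':
$Conf{ClientNameAlias} = 'winsrv';

# Crude serialization via an external lock script (illustrative):
$Conf{DumpPreUserCmd}  = '/usr/local/bin/backuppc-lock acquire winsrv';
$Conf{DumpPostUserCmd} = '/usr/local/bin/backuppc-lock release winsrv';
$Conf{UserCmdCheckStatus} = 1;  # abort this dump if the pre-command fails
```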

-- 
Bowie



Re: [BackupPC-users] Backups tooo slow over WAN - how to split them

2011-03-15 Thread Les Mikesell
On 3/15/2011 3:23 PM, David Herring wrote:


 New to this list - not sure if my first post worked - so am resending

 I'm trying to backup windows servers with approx 3 partitions of 100G
 each over a WAN link. I'm using rsyncd on the windows machines - and the
 backup is to a ubuntu server.

 This takes 'days' to run and never successfully complete - normally I
 have to stop the backuppc  (via /etc/init.d/backuppc stop) and this
 loses the already downloaded data

If you are running rsync/rsyncd you should get a partial backup when a 
full fails and the next run should use that as the rsync base.  Do you 
have the latest version installed?

 So, how do you split the backups from a single machine - so each can run
 within a reasonable (say 4 hour) timeframe ? Do you have to create
 multiple host entries for each machine ? Does this make restore too
 painful ? Is the whole issue about number of files / rsyncd performance 

You can make a host entry for each partition (or each rsyncd module, 
which wouldn't have to be on partition boundaries), with ClientNameAlias 
settings pointing to the same actual host.  This will let each be 
scheduled separately.

 Is there anything else I should be doing ? MTU size ?

MTU size might come into play if you are using a VPN that doesn't handle 
it gracefully.  If you use OpenVPN, enabling lzo compression might help.

Some people work out some other way to get the initial backup done (move the 
server to the target LAN or vice versa, copy to an external disk and 
mount it in a local machine with a ClientNameAlias pointing there for the first 
run, etc.).  After your first full completes, the rest should be much faster.

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-03-15 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 
 I think there is a 3rd camp:
   3. Scripts that understand the special structure of the pool and pc
      trees and efficiently create lists of all hard links in the pc
      directory.

   a] BackupPC_tarPCCopy
      Included in standard BackupPC installations. It uses a perl
      script to recurse through the pc directory, calculate (and
      cache, if you have enough memory) the file name md5sums, and
      then uses that to create a tar-formatted file of the hard
      links that need to be created. This routine has been
      well-tested, at least on smaller systems.

   b] BackupPC_copyPcPool
      Perl script that I recently wrote that should be significantly
      faster than [a], particularly on machines with low memory
      and/or slower cpus. This script creates a new temporary
      inode-number indexed pool to allow direct lookup of links and
      avoid having to calculate and check file name md5sums.  The
      pool is then rsynced (without hard links -- i.e. no -H flag)
      and then the restore script is run to recreate the hard
      links. I recently used this to successfully copy over a pool of
      almost 1 million files and a pc tree of about 10 million files.
      See the recent archives to retrieve a copy.
Hi Jeffrey,

I can't find your BackupPC_copyPcPool. I looked for it on the wiki as well 
as in backuppc.general.
What/where is this recent archive?

Thanks in advance
Matthias
--
Don't Panic




Re: [BackupPC-users] IPv6 support

2011-03-15 Thread Timothy J Massey
Tod Detre tod.de...@maine.edu wrote on 03/15/2011 03:30:21 PM:

 Actually, that's not quite the whole picture. BackupPC does do DNS
 name lookups and those calls are not IPv6 compatible.  One such
 instance is in bin/BackupPC_dump line 503. The gethostbyname()
 function (at least last time I tried to do IPv6 with BackupPC) does
 not support IPv6.

Ah, you are correct:  you have to use getaddrinfo, but I'm not sure that 
Perl has been updated to reflect this...

 This causes BackupPC to try to do an nmblookup. If
 your client doesn't have an IPv4 address and does not respond to
 nmblookup, BackupPC will fail. However, if you do not use nmblookup,
 you can just get rid of lines 496-508,510 and BackupPC will work just
 fine.

I'd have to look at the rest of the code, but if gethostbyname fails, and 
if you remove the nmb stuff, you're left with a hostname.  The question 
is, does the rest of the code work with a host name string rather than an 
IP address?

Timothy J. Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 


Re: [BackupPC-users] Backups tooo slow over WAN - how to split them

2011-03-15 Thread Jeffrey J. Kosowsky
Bowie Bailey wrote at about 16:39:59 -0400 on Tuesday, March 15, 2011:
  On 3/15/2011 3:49 PM, David Herring wrote:
  
   I'm trying to backup windows servers with approx 3 partitions of 100G
   each over a WAN link. This takes 'days' to run and never successfully
   completes. I'm using rsyncd on the windows machines - and the backup
   is to a ubuntu server.
  
   So, how do you split the backups from a single machine - do you have
   to create multiple host entries for each machine ? Does this make
   restore too painful ?
  
   Is there anything else I should be doing ? MTU size ?
  
  You can split the backups by directories.  Just create separate backup
  hosts that each backup some of the directories.  Give each host slightly
  different names and use the ClientNameAlias setting to point them back
  to the proper host name.
  
  Keep in mind that this may result in the separate backups running
  simultaneously.  To prevent this you can use the DumpPreUserCmd and
  DumpPostUserCmd to set and check for lock files.
  
 
Bowie's suggestion is spot on.
But I'm still not sure why backup over the WAN should take days, even for
300GB, and even if this were the first time through or the files changed
heavily.

Assuming you have even low-end machines and 100Mbps Internet, you
should be able to get on the order of 5-10 MB/s, which should complete your
backups in well under a day. Perhaps you have some large files that
are causing rsyncd timeouts? (You might want to check the 'timeout'
parameter setting in rsyncd.) Also, look at the client and server logs
to see why the backups don't complete.
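If timeouts are the issue, the knob lives in rsyncd.conf on the Windows client; a sketch follows (the module name, path, and value are illustrative, not a tested recipe):

```ini
# rsyncd.conf on the client (illustrative):
[cdrive]
    path = /cygdrive/c
    # seconds of inactivity before the daemon gives up on a transfer
    timeout = 1800
```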



Re: [BackupPC-users] Backups tooo slow over WAN - how to split them

2011-03-15 Thread Linux Punk
You could just keep the one machine entry, but use the exclude options
to back up only a small portion of the machine at first. Once that completes,
gradually remove directories from the exclude list until the full
backup completes fine. This assumes that most of your data is static
and that any files that change can be easily backed up in a daily
window.
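A sketch of that approach in the per-host config (the share name and paths are examples only):

```perl
# First pass: exclude the big, mostly static trees; remove entries run
# by run until nothing is excluded and the full completes.
$Conf{BackupFilesExclude} = {
    'cdrive' => [
        '/Program Files',
        '/Windows',
        '/Users/*/AppData',
    ],
};
```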

Brian

On Tue, Mar 15, 2011 at 1:49 PM, David Herring d...@netfm.org wrote:

 I'm trying to backup windows servers with approx 3 partitions of 100G each
 over a WAN link. This takes 'days' to run and never successfully completes.
 I'm using rsyncd on the windows machines - and the backup is to a ubuntu
 server.
 So, how do you split the backups from a single machine - do you have to
 create multiple host entries for each machine ? Does this make restore too
 painful ?
 Is there anything else I should be doing ? MTU size ?
 Any help great-fully received,
 Dave

 --
 David Herring



