Re: [BackupPC-users] Exclude list being ignored?

2008-02-19 Thread Nils Breunese (Lemonbit)
mark k wrote:

 Here is how I have mine set up; note the '/' => at the top. I'm backing
 up 30+ systems this way with no errors.

 $Conf{BackupFilesExclude} = {
   '/' => [
 '/proc/*',
 '/sys/*',
 '/var/run/*',
 '/dev/*',

If you use '/' as the key, then these excludes only apply where the
share name is '/' (which may be the case on all of your machines).
Using '*' as the key means: use these excludes for every share that
doesn't have an explicit entry of its own, which has the same effect
if you back up / on all your machines.

We use the following with rsync over SSH and it works just fine:

$Conf{BackupFilesExclude} = {
   '*' => [
 '/proc',
 '/var/named/run-root/proc',
 '/sys',
 '/mnt'
   ]
};
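
To make the precedence concrete, here is a hedged sketch of a per-host
override (the share and path names are made up): a share with an explicit
entry uses its own list, while '*' catches every other share.

$Conf{BackupFilesExclude} = {
    # explicit entry: applies only when the '/data' share is backed up;
    # for rsync, these paths are anchored at the share root
    '/data' => [
        '/tmp',
        '/cache'
    ],
    # fallback: applies to any share without an explicit entry above
    '*' => [
        '/proc',
        '/sys'
    ]
};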

Nils Breunese.



Re: [BackupPC-users] Exclude list being ignored?

2008-02-19 Thread mark k
Here is how I have mine set up; note the '/' => at the top. I'm backing up 30+
systems this way with no errors.

$Conf{BackupFilesExclude} = {

  '/' => [

'/proc/*',

'/sys/*',

'/var/run/*',

'/dev/*',


Re: [BackupPC-users] Exclude list being ignored?

2008-02-19 Thread Steven Whaley
Craig Barratt wrote:
 Steven writes:

   
 Running: /usr/bin/ssh -q -x -l netbackup freedom.rapidxdev.com
 /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids --perms
 --owner --group --links --times --block-size=2048 --recursive -D
 --bwlimit=200 --ignore-times . /
 

 Yes, you can see none of the excludes make it into the command.
 But you knew that.

 The most likely explanation is that a per-client config.pl is
 overriding the setting.  Try editing $Conf{BackupFilesExclude}
 and $Conf{RsyncClientCmd} in the per-client config.pl (make
 $Conf{RsyncClientCmd} something different from the main config
 file to make sure the file is actually being parsed).  Then
 check the actual command being run.

 Craig
   
Yes, I know that seems like the likely answer, but unfortunately it
isn't.  This happens on all of the hosts, regardless of whether or not
there are per-host configuration files.
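
(For reference, Craig's test amounts to dropping something like the
following into the per-client config.pl -- a sketch, with a placeholder
login name and excludes:)

$Conf{BackupFilesExclude} = {
    '*' => [ '/proc', '/sys' ]
};
# change something visible and confirm it shows up in the command
# BackupPC actually runs:
$Conf{RsyncClientCmd} = '$sshPath -q -x -l backuptest $host $rsyncPath $argList+';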

As an update, it works fine if I update the RsyncArgs directly, like so:

$Conf{RsyncArgs} = [
#
# Do not edit these!
#
'--numeric-ids',
'--perms',
'--owner',
'--group',
'--links',
'--times',
'--block-size=2048',
'--recursive',

#
# If you are using a patched client rsync that supports the
# --checksum-seed option (see http://backuppc.sourceforge.net),
# then uncomment this to enable rsync checksum caching
#
#'--checksum-seed=32761',

#
# Add additional arguments here
#
'-D',
'--bwlimit=200',
 '--exclude', '/proc',
 '--exclude', '/sys',
 '--exclude', '/tmp',
 '--exclude', '/var/tmp',
 '--exclude', '/usr/tmp',
 '--exclude', '/mnt',
 '--exclude', '/media',
 '--exclude', '/auto',
 '--exclude', '/var/run/acpid.socket',
 '--exclude', '/var/run/dbus/system_bus_socket',
 '--exclude', '/var/lib/backuppc/cpool',
 '--exclude', '/var/lib/backuppc/pc',
 '--exclude', '/var/lib/backuppc/pool',
 '--exclude', '/var/lib/backuppc/trash',
 '--exclude', '/var/lib/vmware/Virtual Machines/*/*.vmdk',
 '--exclude', '/var/lib/vmware/Virtual Machines/*/*.vmem',
 '--exclude', '/var/lib/vmware/Virtual Machines/*/*.vmsd',
 '--exclude', '/var/lib/vmware/Virtual Machines/*/*.vmsn',
 '--exclude', '/var/lib/vmware/Virtual Machines/*/*.vmss',
 '--exclude', '/backupdata',

];
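
For comparison, the same excludes expressed through $Conf{BackupFilesExclude}
would look roughly like this (a sketch of the setting that should have
generated those --exclude flags automatically):

$Conf{BackupFilesExclude} = {
    '/' => [
        '/proc',
        '/sys',
        '/tmp',
        '/var/tmp',
        '/usr/tmp',
        '/mnt',
        '/media'
        # ... and so on for the remaining paths listed above
    ]
};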



-- 
Puryear Information Technology, LLC
Baton Rouge, LA * 225-706-8414
http://www.puryear-it.com

Visit http://www.puryear-it.com/pubs/ebooks/ to download your free
copies of:

 Best Practices for Managing Linux and UNIX Servers
 Spam Fighting and Email Security in the 21st Century




Re: [BackupPC-users] Network error on smb backup

2008-02-19 Thread dan
Networking issue most likely.  Sometimes hardware fails for no good reason.
If you have an extra ethernet card lying around I suggest you try that.

Alternatively, has anyone moved the computer recently?  Did the cable get
crunched?  Is it near a microwave, or do the cables run by a microwave?  Did
someone just get some Sirius Satellite Radio?  XM and Sirius can disrupt
network traffic if the cable is too close.  (I learned that after hours of
diagnosing a network issue; then the employee left for the day and took
their Sirius radio, and everything started working!)

On Feb 18, 2008 10:57 PM, Alex Schaft [EMAIL PROTECTED] wrote:

  On 2008/02/19 07:50, Craig Barratt wrote:

 Alex writes:



  I've got a laptop (mine :)) which has been backed up successfully via
 smb for a while now. All of a sudden, it's losing the network
 connection, along with the following errors.


  I've seen cases where WinXX disk corruption can cause smbclient to fail.

 Have you tried running the WinXX chkdsk utility?  (eg, in WinXP, C: drive
 right click -> Properties -> Tools tab -> Check Now...)

 Craig


  There don't appear to be any file system issues.






[BackupPC-users] backuppc error

2008-02-19 Thread Joshua Fuente
Hi, can someone please help!

Backuppc 3.1.0
Suse 10.3
2.6.22.16-0.2-default
rsync 2.6.9 on server and clients

I removed all the arguments from the command in order to better troubleshoot
the issue.  (I can just copy them back from the restore settings later.)

but this is killing me... any ideas?

Contents of file /srv/backuppc/pc/csrhplt01/XferLOG.bad.z, modified
2008-02-08 00:27:25

full backup started for directory /data/share
Running: /usr/bin/ssh -q -x -l root csrhplt01 /usr/bin/rsync
Xfer PIDs are now 16157
Got remote protocol 1869771333
Fatal error (bad version): Error: Can't open display: csrbakup01.barn.yard:1.0

Read EOF:
Tried again: got 0 bytes
fileListReceive() failed
Done: 0 files, 0 bytes
Got fatal error during xfer (fileListReceive failed)
Backup aborted (fileListReceive failed)
Not saving this as a partial backup since it has fewer files than the
prior one (got 0 and 0 files versus 0)


Thanks,
Joshua


[BackupPC-users] Web request fails!

2008-02-19 Thread Robert Syms
Hi

Can someone help me please? I have just downloaded BackupPC from
Synaptic for Ubuntu 7.10 and told it to load apache2. When I put in the
URL for it, it gives me the following:

http://www.backuppc.com/
This domain is missing from the Web server configuration

The domain name is correctly pointing at a valid Web server. This Web server
does not recognize this domain name as a valid Web site.

If you are the Webmaster please contact Technical Support.


-- 
Robert


[BackupPC-users] WORKAROUND: Hang when using rsync over ssh to backup Windows 2003 files

2008-02-19 Thread hot java
PROBLEM: Backup hangs when using BackupPC / rsync over ssh to a Windows 2003
server.

WORKAROUND SUMMARY: Back up a Windows 2003 server by using BackupPC's
Pre and Post commands to establish a forwarding ssh tunnel and a locally
bound Windows rsyncd service.  I know what you are thinking - I don't want
to load rsyncd as a service because this creates another security issue.
Wait, we are going to BIND the rsyncd service to 127.0.0.1 and then connect
to it via the forwarding tunnel! Awesome.  Performing a backup using this
method will seem weird because you'll be issuing an rsync command on your
BackupPC server against localhost, 127.0.0.1::module, which is forwarded
over to the Windows 2003 server, where it then connects to the rsync service
on 127.0.0.1:873.  Believe me - it works.  I've been using this method for
over a month now without any problems.



HOWTO:



How to back up a Windows 2003 server using BackupPC, rsyncd, and a forwarding
ssh tunnel.  The goal was to develop a secure backup method that actually
works.  Rsync over ssh from Linux to Windows fails (for me).  So, we
developed a secure method that meshes nicely with BackupPC and rsyncd.
Caution: these are my personal notes; following them may crash your system
and result in data loss.


FAILURE: Linux --rsync/ssh--> Windows 2003 (sshd):
We spent about a week trying to resolve problems backing up a Windows 2003
server from Linux using rsync over ssh.  Almost all of our attempts at
getting a clean backup of Windows 2003 server from a Linux server using
rsync over ssh failed miserably - the backup would simply hang on certain
files.  This problem persisted even when we replaced the original Windows
source files with a Volume Shadow Copy - ouch!

SUCCESS: Linux ==rsync (modules)/ssh==> Windows 2003 (sshd/rsyncd)
All of our tests using module-based rsync from Linux to Windows 2003 rsyncd
services worked perfectly.  So, we developed a simple workaround to secure
rsyncd connections through a forwarding ssh connection.  To do this, we bind
rsyncd to localhost on a Windows 2003 server and then connect to this service
from our Linux backup server through a forwarding SSH tunnel.


---
ESTABLISHING RSYNCD (localhost) AND SSHd ON WINDOWS 2003 SERVER:
* Install cygwin, be sure to include cygrunsrv, openssh and rsync.
* Follow one of the many online guides for setting up cygwin's sshd
(reference: http://pigtail.net/LRP/printsrv/cygwin-sshd.html)

To set up rsync as a service in Windows 2003, do the following:
(reference: http://www.gaztronics.net/rsync.php)

Start cygwin:
% vi /etc/rsyncd.conf

use chroot = false
strict modes = false

[backupwww]
   path = /cygdrive/c/webserver
   read only = false
   list = true
   comment = BACKUP

ESTABLISH RSYNCD AS A SERVICE
% cygrunsrv -I Rsyncd -p /cygdrive/c/cygwin/bin/rsync.exe -a
"--config=/cygdrive/c/cygwin/etc/rsyncd.conf --daemon --no-detach
--address=127.0.0.1" -f "Rsyncd daemon service on localhost" -u Administrator


***IMPORTANT: BE SURE TO USE --address=127.0.0.1 *


START SERVICE:
% cygrunsrv --list
% cygrunsrv --start sshd
% cygrunsrv --start Rsyncd

Now, we are ready to test our new services.

TESTING: ESTABLISH THE FORWARDING TUNNEL:
TESTING: On your Linux backup server issue this command:

TESTING: linux% ssh -L 1500:127.0.0.1:873 -l user myserver.my.domain

TESTING: This command establishes a tunnel to myserver where new
connections to local Linux port 1500 are forwarded over to the remote
side and actually connect to 127.0.0.1:873.  That is to say, local
connections to 127.0.0.1:1500 are: (a) FORWARDED through the tunnel and (b)
connected to 127.0.0.1:873 on the remote side.

TESTING: Now that we have this incredibly useful tunnel in place, all we
need to do is run rsync against localhost:1500 to actually back up the
remote side.

TESTING: Here is an example of the rsync command:

TESTING: linux% rsync -av --port 1500 127.0.0.1::backupwww /home/backups

TESTING: In this example, backupwww is the name of your Windows 2003 rsyncd
module.  Obviously, /home/backups is the destination on your backup server
where you want to store these test backups.
---

If everything works, you are ready to configure BackupPC.

== BACKUPPC ==

BACKUPPC: BackupPC (rsyncd method) --ssh tunnel--> Windows 2003
Server (sshd/rsyncd)
LINUX: Install BackupPC
LINUX: Setup ssh keys such that user backuppc can ssh over to your Windows
2003 server without supplying a password
(reference: http://backuppc.sourceforge.net/faq/ssh.html)

Pick an alias for your Windows 2003 server to be used by BackupPC.  Any name
will do - we'll map this alias to 127.0.0.1 later with ClientNameAlias.
For this example, I selected securewww1 as an alias for our Windows 2003
server.

linux% vi /BackupPC/conf/hosts
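
Where the post cuts off, the BackupPC side boils down to a handful of
per-host settings.  A hedged sketch: the option names are real BackupPC 3.x
settings, but the host alias, port, and commands below are examples, not
the author's exact configuration.

# pc/securewww1/config.pl -- hypothetical values throughout
$Conf{XferMethod}       = 'rsyncd';
$Conf{ClientNameAlias}  = '127.0.0.1';   # talk to the local tunnel endpoint
$Conf{RsyncdClientPort} = 1500;          # local side of the ssh -L forward
$Conf{RsyncShareName}   = ['backupwww'];
# open the tunnel before each dump (tear it down in $Conf{DumpPostUserCmd})
$Conf{DumpPreUserCmd}   = '/usr/bin/ssh -f -N -L 1500:127.0.0.1:873 -l user myserver.my.domain';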
 

Re: [BackupPC-users] WORKAROUND: Hang when using rsync over ssh to backup Windows 2003 files

2008-02-19 Thread dan
This is a great piece of knowledge; I encourage you to put this on the wiki.

Also note that this can be done in reverse: have the remote machine create
the tunnel and issue a command over ssh (BackupPC_serverMesg or
BackupPC_dump), allowing remote clients to back up on their own schedule
without the BackupPC server having any knowledge of their remote IP address.

This is also a great way to secure your rsync traffic, as the rsyncd server
in cygwin does not listen on any network IP address, only 127.0.0.1.

very nice.

On Feb 11, 2008 4:07 PM, hot java [EMAIL PROTECTED] wrote:

 PROBLEM: Backup hangs when using BackupPC / rsync over ssh to a Windows
 2003 server.

 [snip]

Re: [BackupPC-users] Download file directly from browse fails...

2008-02-19 Thread dan
Joe, what filesystem are you running on the downloading machine? Is it
FAT32?  When I read this I immediately thought FAT32; FAT32 can't
handle files of 4GB or more.

On Feb 19, 2008 10:54 AM, Joe Krahn [EMAIL PROTECTED] wrote:

 Mirco Piccin wrote:
  It's size is about 12 GB.
  I try to download it directly from Browse Backup, but the first time
  download freezes at about 3.99 GB, and the second time at about 2.76 GB.
 [snip]

 Hanging at 3.99 makes me wonder if there is some sort of 4G file size
 limitation somewhere, but maybe this was just a coincidence.

 Joe Krahn



Re: [BackupPC-users] Download file directly from browse fails...

2008-02-19 Thread Joe Krahn
Mirco Piccin wrote:
 Hi and thanks for reply.
 
 I'm trying to get a backup of a file.

 Its size is about 12 GB.
 I try to download it directly from Browse Backup, but the first time
 the download freezes at about 3.99 GB, and the second time at about 2.76
 GB.

 Maybe there's a timeout for the direct downloading?

 I'm also going to try to restore that file by:
 1. picking it in the Browse Backup
 2. clicking on Restore Selected Files
 3. choosing the zip compression level and downloading the .zip file.
 
 Also, trying to restore that file by choosing to download the zip file
 does not work.
 The download this way goes at 1 kB/s.
 
 For a restore that large I'd use the command line interface, just to
 make sure a browser timeout won't be an issue.
 
 Browser timeout shouldn't be an issue (after only 3.99 GB... - I'm
 thinking of the Ubuntu 4.4 GB DVD .iso image downloaded from the web).
 
 You can use BackupPC_tarCreate as your BackupPC user to create
  a tar archive of the files you want to restore.
 
 Well, the restore must be done by a Windows user.
 The restore I'm talking about is of one single file of big size (about 12 GB).
 Backup of that file works perfectly.

 But downloading that file - by selecting it in the Backup Browse tree or
 creating the zip file - does not seem to be possible.
 Any help/tips?
 
 Regards
 M
Did you check the Apache error logs? Even though you can download a 4.4G
file, there may still be some timeout problems, especially if the server
is under a heavy load. It may be timeout settings for Apache, and not
the browser. (Just guessing.)

Hanging at 3.99 makes me wonder if there is some sort of 4G file size
limitation somewhere, but maybe this was just a coincidence.

Joe Krahn



[BackupPC-users] Excluding special files and folders

2008-02-19 Thread kurzi
Hi,

I'm using BackupPC version 3.0.0 and samba for backing up several
Windows and Linux clients,
and have spent lots of time (without a working solution) trying to figure
out if there is some way to exclude the following files:

A) directories whose names start with a dot (ex. /dir/.obj/)
B) filenames without an extension (ex. /dir/uselessfile)
C) filenames with any extension, which start with a certain pattern
(ex. foo_*.*)

Excluding all files with certain extension works fine with:
$Conf{BackupFilesExclude} = {
  '*' => [
'*.tmp'
  ]
};

And there is nothing in the BackupFilesOnly section.

Thanks in advance..





Re: [BackupPC-users] Network error on smb backup

2008-02-19 Thread Alex Schaft
On 2008/02/19 17:04, dan wrote:
 Networking issue most likely.  Sometimes hardware fails for no good
 reason.  If you have an extra ethernet card lying around I suggest you
 try that.
 [snip]

This is my laptop which is backed up when plugged into its docking 
station. I'll give it a go tomorrow without it.

Alex




Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread Nick Webb
Carl Wilhelm Soderstrom wrote:
 On 11/28 09:39 , Tim Hall wrote:
 Are there any known backuppc tweaks/settings that
 are proven to increase transfer performance over
 wan links?  Specifically with using rsyncd or rsync
 as the transfer method.
 
 . . . .
 
 If you're running >= v3 the following option will make all the incrementals
 sync against the previous incremental, instead of the last full. This keeps
 them from growing quite as quickly. (It's the behavior you expect from
 rsync.)
 
 $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
 
 

My apologies for replying to an ancient thread, but I'm curious if an
upgrade from 2.x to 3.x will help much with bandwidth efficiency?

I was under the assumption that BackupPC never transfers the same file 
twice unless it changed after the last backup (either full or 
incremental), even in the 2.x version.  Was that an invalid assumption 
on my part?

Thanks for a great product!

Nick

-- 
Nick Webb
System Administrator
Freelock Computing - www.freelock.com
206.577.0540 x22



Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread Carl Wilhelm Soderstrom
  If you're running >= v3 the following option will make all the incrementals
  sync against the previous incremental, instead of the last full. This keeps
  them from growing quite as quickly. (It's the behavior you expect from
  rsync).
  
  $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
  
 I was under the assumption that BackupPC never transfers the same file 
 twice unless it changed after the last backup (either full or 
 incremental), even in the 2.x version.  Was that an invalid assumption 
 on my part?

That is incorrect. BackupPC does its incrementals against the last full, not
against the previous incremental (unless you set $Conf{IncrLevels} = [1, 2,
3, 4, 5, 6];).

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Network error on smb backup

2008-02-19 Thread dan
Do you use a headset with your phone, like a Plantronics?  I have
Plantronics CS50 and CS55 headsets that have to be about 6 feet away from an
ethernet cable or they disrupt the signal.  Typically this just causes the
network to fall back to 10Mbps, but it can cause all kinds of funky errors.

On Feb 19, 2008 12:49 PM, Alex Schaft [EMAIL PROTECTED] wrote:

 On 2008/02/19 17:04, dan wrote:
  [snip]
 
 This is my laptop which is backed up when plugged into its docking
 station. I'll give it a go tomorrow without it.

 Alex




Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread Carl Wilhelm Soderstrom
On 02/19 05:53 , Raman Gupta wrote:
 Carl Wilhelm Soderstrom wrote:
  If you're running >= v3 the following option will make all the incrementals
  sync against the previous incremental, instead of the last full. This keeps
  them from growing quite as quickly. (It's the behavior you expect from
  rsync.)
 
  $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
 
  I was under the assumption that BackupPC never transfers the same file 
  twice unless it changed after the last backup (either full or 
  incremental), even in the 2.x version.  Was that an invalid assumption 
  on my part?
  
  That is incorrect. Backuppc does its incrementals against the last full; not
  against the previous incremental (unless you set $Conf{IncrLevels} = [1, 2,
  3, 4, 5, 6];).
 
 So is it correct to say that when using rsync, it's probably more
 efficient to just turn off incrementals and always do fulls?

No, and a brief examination of the reports will make this clear. 
I'm not thoroughly clear on the difference; but backuppc 'fulls' using rsync
do a more thorough set of checks than the 'incrementals'.

Incrementals, even using rsync, are usually much faster than fulls. 

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread dan
I looked at my archive history here and I have a number of hosts where
incrementals take about 6 minutes and fulls about 46 minutes.

On Feb 19, 2008 4:07 PM, Carl Wilhelm Soderstrom [EMAIL PROTECTED]
wrote:

 On 02/19 05:53 , Raman Gupta wrote:
  So is it correct to say that when using rsync, it's probably more
  efficient to just turn off incrementals and always do fulls?

 No, and a brief examination of the reports will make this clear.
 [snip]

 Incrementals, even using rsync, are usually much faster than fulls.



Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread dan
No, incrementals are more efficient on bandwidth.  They do a less strenuous
test to determine if a file has changed.

At the expense of CPU power on both sides, you can compress the rsync
traffic, either with rsync -z or, if you are using ssh, with ssh's
compression.  If you REALLY wanted to go all out, you could have rsync pipe
through bzip2 at max compression, but you would have to pipe it out of bzip2
on the server side.  That would eat up tons of CPU but would likely use less
bandwidth, as long as the files you are backing up are not already
compressed.
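
For the ssh case, a minimal sketch of what that looks like in config.pl --
this is the stock rsync-over-ssh client command with just -C added (adjust
the login and paths to your own setup):

# enable ssh-level compression for the rsync transport
$Conf{RsyncClientCmd} = '$sshPath -q -x -C -l root $host $rsyncPath $argList+';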

On Feb 19, 2008 3:53 PM, Raman Gupta [EMAIL PROTECTED] wrote:

 Carl Wilhelm Soderstrom wrote:
  [snip]

 So is it correct to say that when using rsync, it's probably more
 efficient to just turn off incrementals and always do fulls?

 Cheers,
 Raman Gupta





[BackupPC-users] BackupPC on AR7 Platform?

2008-02-19 Thread Hendrik Friedel
Hello,

It would be quite interesting to install BackupPC on a Linux WLAN router.
For me it would be the AVM FRITZ!Box. It has a TI AR7 processor with 32MB
RAM.

Now, someone has successfully installed a Debian system on it, so I wonder
whether it would be useful to install BackupPC on it.

What would be the bottleneck here? I plan to use it for backing up remote
systems via rsync.


Greetings,
Hendrik




Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread Rich Rauenzahn


dan wrote:
 no, incrementals are more efficient on bandwidth.  they do a less 
 strenuous test to determine if  a file has changed.

 at the expense of CPU power on both sides, you can compress the rsync 
 traffic either with rsync -z 
Have you tried rsync -z?   Last I heard, BackupPC's rsync modules don't 
support it.

Rich





Re: [BackupPC-users] Enhancing WAN link transfers

2008-02-19 Thread Nick Webb
Rich Rauenzahn wrote:
 
 dan wrote:
 no, incrementals are more efficient on bandwidth.  they do a less 
 strenuous test to determine if  a file has changed.

 at the expense of CPU power on both sides, you can compress the rsync 
 traffic either with rsync -z 
 Have you tried rsync -z?   Last I heard, BackupPC's rsync modules don't 
 support it.
 
 Rich

I tried it a long time back, it was not pretty.  Backups failed to 
complete... after backing up for 24 hours or more.  Don't do it.

Nick



Re: [BackupPC-users] BackupPC on AR7 Platform?

2008-02-19 Thread dan
rsync alone makes this box pretty much incapable of running BackupPC.  32MB
of RAM minus the running system will give you at most 20MB usable under a
BEST CASE scenario, which is about 150,000 files on a client MAXIMUM, which
is just not enough for many, many clients.  Also, that CPU is just not
powerful enough to do the work in a reasonable amount of time.  You're
looking at many, many hours to do one small backup.

You need a real PC to run BackupPC, with some CPU power and RAM.  At least
a P3 500MHz and 256MB RAM, preferably much, much more.

I run dual-core Opterons with 2GB RAM, which is about 100x more powerful.

On Feb 19, 2008 4:44 PM, Hendrik Friedel [EMAIL PROTECTED] wrote:

 Hello,

 It would be quite interesting to install BackupPC on a Linux WLAN router.
 [snip]



Re: [BackupPC-users] Exclude list being ignored?

2008-02-19 Thread Craig Barratt
Steven writes:

 Yes, I know that seems like the likely answer, but unfortunately it
 isn't.  This happens on all of the hosts, regardless of whether or not
 there are per-host configuration files.

Well, if you are confident the likely explanation is wrong, I'd
recommend adding some debug statements to the code to figure
out why the value isn't being read.

Assuming you are running 3.x, in the ConfigDataRead function in
lib/BackupPC/Storage/Text.pm add the two print statements:

sub ConfigDataRead
{
    my($s, $host) = @_;
    my($ret, $mesg, $config, @configs);

    #
    # TODO: add lock
    #
    my $conf = {};
    my $configPath = $s->ConfigPath($host);

    push(@configs, $configPath) if ( -f $configPath );
    foreach $config ( @configs ) {
        %Conf = ();
        if ( !defined($ret = do $config) && ($! || $@) ) {
            $mesg = "Couldn't open $config: $!" if ( $! );
            $mesg = "Couldn't execute $config: $@" if ( $@ );
            $mesg =~ s/[\n\r]+//;
            return ($mesg, $conf);
        }
        %$conf = ( %$conf, %Conf );

        # the two added debug statements (Dumper comes from Data::Dumper):
        print(STDERR "Read config file $config;\nBackupFilesExclude is now:");
        print(STDERR Dumper($conf->{BackupFilesExclude}));
    }

Then run

 su - backuppc
 BackupPC_dump -f -v HOSTNAME

and you should see it read each config file and the resulting BackupFilesExclude
value.  Hit ^C after that.

Craig
