Re: [BackupPC-users] Permission query

2009-04-17 Thread Craig Barratt
Alex writes:

 The folder shows up in the backup as 0750. The -p is present

Do you mean when you look at the directory permissions below
the PC directory on the backup server; eg, the output from:

ls -ld /TOPDIR/pc/HOST/nnn/fshare/fhome

What permissions are shown when you browse to that directory
in the web interface?

Craig

--
Stay on top of everything new and different, both inside and 
around Java (TM) technology - register by April 22, and save
$200 on the JavaOne (SM) conference, June 2-5, 2009, San Francisco.
300 plus technical and hands-on sessions. Register today. 
Use priority code J9JMT32. http://p.sf.net/sfu/p
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Permission query

2009-04-17 Thread Craig Barratt
Alex writes:

 [r...@qsbackup f%2f]# pwd
 /opt/backuppc/files/pc/mail/184/f%2f
 [r...@qsbackup f%2f]# ll
 total 16
 -rw-r-----  3 backuppc backuppc   26 Apr 17 05:04 attrib
 drwxr-x---  5 backuppc backuppc 4096 Apr 17 06:00 fetc
 drwxr-x---  3 backuppc backuppc 4096 Apr 17 06:03 fhome
 drwxr-x---  6 backuppc backuppc 4096 Apr 17 06:00 froot

This is normal and correct.

The metadata (permissions, ownership, mtime, etc.) is stored in a
separate file (the attrib file, one per directory).  It can't be
stored with the file itself (as on the client) because of pooling,
and also some metadata values (eg: owned by root, or a block or
character special file) can't be created unless you are root.

For security reasons files are stored using $Conf{UmaskMode},
which by default disables world read/write permissions.

If you browse the directory using the web interface you should
see the right permissions.

As Les points out, you need to use the -p and --numeric-owner
options with tar when you restore files (extracting).  That
doesn't apply when you are doing a backup (creating an
archive of the client).
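
A quick way to see what -p buys you, using plain GNU tar on
throwaway temporary directories (my illustration, not a BackupPC
command; assumes GNU tar and coreutils stat):

```shell
# Without -p, a non-root extract filters the archived mode through
# the umask; -p restores the mode exactly as recorded.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/src/dir" && chmod 750 "$tmp/src/dir"
tar -C "$tmp/src" -cf "$tmp/a.tar" dir
mkdir "$tmp/with" "$tmp/without"
( umask 077; tar -C "$tmp/with" -xpf "$tmp/a.tar" )
( umask 077; tar -C "$tmp/without" --no-same-permissions -xf "$tmp/a.tar" )
stat -c %a "$tmp/with/dir"      # 750 (archived mode kept)
stat -c %a "$tmp/without/dir"   # 700 (umask 077 applied)
```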

 I'm just investigating a difference in TarClientCmd; I'm not sure
 whether it could cause this

 Default:
 $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host env LC_ALL=C $tarPath
 -c -v -f - -C $shareName+ --totals';

 Overridden:
 $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host $tarPath -c -v -f - -C 
 $shareName+ --totals';

Removing the env LC_ALL=C should be ok so long as your client has
a locale of C.
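
If you want to check what the env prefix does, here is a small
illustration (mine, not from the original command; assumes the
standard locale utility):

```shell
# `env VAR=value cmd` sets the variable for that single command only;
# BackupPC needs tar's --totals summary in the C locale to parse it.
env LC_ALL=C locale | grep '^LC_ALL='
# LC_ALL=C
```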

Craig



Re: [BackupPC-users] BackupFilesExclude and BackupFilesOnly Not Working

2009-04-16 Thread Craig Barratt
John writes:

 $Conf{SmbShareName} = [
   'C$'
 ];

 #FILES TO BACKUP
 #-
 $Conf{BackupFilesOnly} = {
  'c' => ['/MS_OUTLOOK/*'],
 };

First, the 'c' should be 'C$' - it should match the share name.
Also, you can't use wildcards in $Conf{BackupFilesOnly}.  You
need to use an absolute path.

 #FILES TO EXCLUDE
 #-
 $Conf{BackupFilesExclude} = {
    'c' => ['/Program Files/*', '/WINDOWS/*'],
    'c' => ['/Documents and Settings/*'],
 };

Once again the 'c' should be 'C$'.

Unfortunately smbclient doesn't allow you to use both
$Conf{BackupFilesOnly} and $Conf{BackupFilesExclude}. One choice
is to change the share to be just the directory tree you want to
back up and leave $Conf{BackupFilesOnly} empty.

Using wildcards in $Conf{BackupFilesExclude} with smbclient is ok,
but you need to use '\' instead of '/':

 $Conf{BackupFilesExclude} = {
    'C$' => ['\\Program Files\\*', '\\WINDOWS\\*',
             '\\Documents and Settings\\*'],
 };

(If you are entering this via the CGI interface you only need
to enter a single '\'.)

Craig



Re: [BackupPC-users] RSYNC xfer failing backuppc version 3.2.0beta0

2009-04-14 Thread Craig Barratt
Tim writes:

 This is a new install so I thought I would try the beta version
 do you recommend I go back to the stable version?

If you are willing to test the beta version some more that
would be great.  You've already found one bug :).

Holger told you where to get the File::Listing module.
Another workaround is to simply comment out this line:

use File::Listing qw/parse_dir/;

in

lib/BackupPC/Xfer/Ftp.pm

(ie: add a single # character at the start of the line, or
delete it altogether).  You won't be able to use ftp, but
everything else should work.

Craig



Re: [BackupPC-users] RSYNC xfer failing backuppc version 3.2.0beta0

2009-04-13 Thread Craig Barratt
Holger writes:

 two things are really confusing me:
 
 1.) The title claims that it is supposed to be an *rsync* xfer, the error
 message clearly indicates that *ftp* is attempted (and fails). Tim, could
 you please clarify which transfer method you are trying to use?

The code loads all the Xfer modules and the failure happens
when there is a module that isn't found.  The Xfer method
doesn't matter at that point.

  Looks like you need to install the File::Listing module.
 
  Paul - the code should check for this and it should be in the
  docs too.
 
 Isn't the same true for Archive::Zip and XML::RSS? I'm not sure those two
 *can* be handled in a sane way, but I'll have a look in the next few days and
 try to provide a patch.

Yes, like other non-builtin modules, the ftp code should load this
module inside an eval() in the BEGIN block, and set a flag if it
fails.  In particular, this line:

use File::Listing qw/parse_dir/;

in lib/BackupPC/Xfer/Ftp.pm should be moved inside the BEGIN block and
used with an eval.  It's not quite as simple as adding File::Listing to
this list:

my @FTPLibs = qw( Net::FTP Net::FTP::RetrHandle );

since it needs the qw/parse_dir/.

Craig



Re: [BackupPC-users] Next release

2009-04-13 Thread Craig Barratt
Obj writes:

 I am running version 3.2.0.  Can someone tell me why
 $Conf{BackupFilesExclude} is not working?  It still backs up all Temp
 folders, and .mp3 files, etc.  The backup method is SMB.

You sent me offlist your config file and XferLOG file.  Thanks.

The problem is that if you use wildcards (in particular *) in
excludes, smbclient will only do the right thing if you use the
Windows-style directory separator '\' instead of '/'.

Therefore, instead of using something like this:

$Conf{BackupFilesExclude} = {
  '*' => [
'/Documents and Settings/*/Local Settings/Temporary Internet Files/',
'/Documents and Settings/*/Local Settings/Temp/',
'/Documents and Settings/*/NTUSER.DAT',
'/Documents and Settings/*/ntuser.dat.LOG',
  ],
};

you should use this:

$Conf{BackupFilesExclude} = {
  '*' => [
'\\Documents and Settings\\*\\Local Settings\\Temporary Internet Files',
'\\Documents and Settings\\*\\Local Settings\\Temp',
'\\Documents and Settings\\*\\NTUSER.DAT',
'\\Documents and Settings\\*\\ntuser.dat.LOG',
  ],
};

Also, you shouldn't include the trailing \ at the end of the path.
You should either use:

'\\FOO'

or

'\\FOO\\*'

The first will exclude \FOO and all its contents.  The second will
include the directory \FOO itself but exclude its contents.

You can experiment with what works or doesn't work by manually
running smbclient, eg:

setenv PASSWD XXX
/usr/bin/smbclient HOST\\SHARE -I IPADDR -U USERNAME -E -N -d 1 \
-c tarmode\ full -TcrX - \
\\Documents\ and\ Settings\\\*\\Local\ Settings\\Temporary\ Internet\ Files \
\\Documents\ and\ Settings\\\*\\Local\ Settings\\Temp \
\\Documents\ and\ Settings\\\*\\NTUSER.DAT \
\\Documents\ and\ Settings\\\*\\ntuser.dat.LOG \
| tar tvf -

Note the additional escaping for the shell.  (Also, the -I IPADDR is
optional depending on your configuration.)
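
You can check what that shell escaping actually delivers without
running smbclient at all (my illustration):

```shell
# Doubled backslashes collapse to single ones and escaped spaces
# survive, so this is the pattern smbclient actually receives:
printf '%s\n' \\Documents\ and\ Settings\\\*\\NTUSER.DAT
# \Documents and Settings\*\NTUSER.DAT
```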

Craig



Re: [BackupPC-users] Translating BackupPc to another language

2009-04-12 Thread Craig Barratt
Fatih writes:

 I want to translate BackupPC's CGI and the installation part from
 English to Turkish.
 How can I do this?  Is there anyone responsible for this kind of
 work?  Can someone give me some advice on where to begin?

You should look in lib/BackupPC/Lang.  Each language has its own
file.  You can copy en.pm (English) to tr.pm (Turkish) and
then edit it with the translation of each string.  Don't change
any of the HTML or variable names prefixed by $ - just the
English text.

To test your translation set $Conf{Language} to 'tr'.  You also should
add 'tr' to the definition of Language in lib/BackupPC/Config/Meta.pm
so that choice is displayed in the CGI interface.  The new file name
also has to be added to makeDist in CVS, which I can do.  The makeDist
script does a lot of consistency checking of the language files, so it
would be good if you could run that too.

Before you start you should be sure you are committed to supporting
the translation updates for each new version.  While the on-going
work is actually very small, each new translation adds to the work
of getting everyone to update their translations whenever I change
the CGI interface.  In fact, two of the language files - Spanish
and Portuguese-Brazilian - still have English snippets prefixed
with ENG.  It would be great if any fluent list member could send
me diffs for those strings.

Craig



Re: [BackupPC-users] RSYNC xfer failing backuppc version 3.2.0beta0

2009-04-12 Thread Craig Barratt
Tim writes:

 Hi I just installed the latest backuppc
 version 3.2.0beta0.  When I try to do
 a full backup of a test host I'm seeing
 this error in the log
 
 2009-04-12 11:07:14 User backuppc requested backup of scvffs09 (scvffs09)
 Can't locate File/Listing.pm in @INC (@INC contains: /usr/local/BackupPC/lib 
 /usr/lib/perl5/5.8.8/i586-linux-thread-multi /usr/lib/perl5/5.8.8 
 /usr/lib/perl5/site_perl/5.8.8/i586-linux-thread-multi 
 /usr/lib/perl5/site_perl/5.8.8 /usr/lib/perl5/site_perl 
 /usr/lib/perl5/vendor_perl/5.8.8/i586-linux-thread-multi 
 /usr/lib/perl5/vendor_perl/5.8.8 /usr/lib/perl5/vendor_perl .) at 
 /usr/local/BackupPC/lib/BackupPC/Xfer/Ftp.pm line 49.

Looks like you need to install the File::Listing module.

Paul - the code should check for this and it should be in the
docs too.

Craig



Re: [BackupPC-users] Sending csums, cnt = 0, phase = 1 - Backups stuck!

2009-04-10 Thread Craig Barratt
Boniforti writes:

 carola/Desktop/CAROLA/Mise à jour des prix 2009-04-01.xls: size doesn't 
 match (14702080 vs 0)
 
 Can you tell me why it should tell things like size doesn't match?
 Could you please explain what's going on?

You have a high log level enabled ($Conf{XferLogLevel}), so you will
get a lot of debug messages.

This message is normal.  It says the size of the file on the client
doesn't match the size on the server.  When the file isn't on the
server yet (which is the case for a first backup) this message says
the server file size is 0 versus 14702080 on the client.

Craig



Re: [BackupPC-users] BackupPC 3.2.0beta0 released

2009-04-10 Thread Craig Barratt
Chris writes:

 I didn't see any mention of lib/BackupPC/Lib.pm being updated for the
 case when XFS is used as the pool file system  and IO::Dirent is
 installed (as per
 http://www.mail-archive.com/backuppc-de...@lists.sourceforge.net/msg00195.html).
 
 Looking at the source of the Beta, this looks like the changes were not
 implemented.  Was it just missed, or is there another reason?

This is fixed but somehow I didn't add it to the ChangeLog.
Bernhard Ott and Tino Schwarze debugged the problem.  I'll
update the ChangeLog for the next release.

Look at the dirRead() function in lib/BackupPC/Lib.pm.  It does
the check at runtime (when $TopDir is known).  In 3.1.0 the bug
was that the check was done at startup (in the BEGIN block) on
the directory that BackupPC was started in, not $TopDir (which
is often a different file system).

Thanks to Tino and Bernhard.

Craig



Re: [BackupPC-users] Include only specified folders - how

2009-04-09 Thread Craig Barratt
Bharat writes:

 On the host page - I selected SMB with sharename set to D$
 I've setup my username and password (administrator)
 
 I've unticked BackupFilesExclude and ticked BackupFilesOnly and tried
 various formats
 (including */Temp/* as found in EXCLUDE!!! - yes there is a Temp folder on D)
 
 I've tried
 
 /folder1
 /folder2
 
 */folder1/*
 */folder2/*
 
 D$/folder1/*
 D$/folder2/*   and without the last two characters /*
 
 I've also tried \ in place of / for all above

I presume you are trying to include just /folder1 and /folder2
on the D$ share, right?

If you are using the CGI interface it can be a bit confusing.
For $Conf{BackupFilesOnly} the key should be D$ and the
values should be the paths (relative to the share).

The actual setting in the config.pl file should look like this:

$Conf{BackupFilesOnly} = {
    'D$' => ['/folder1', '/folder2'],
};

Please tell us the relevant config settings you have, and also
include the first few lines of the XferLOG file so we can see
exactly what command is being run.

Craig



Re: [BackupPC-users] Feature request for 3.2: wait for children to remove zombies

2009-04-09 Thread Craig Barratt
John,

It's still on my todo list - I didn't get around to it for 3.2.0beta0.
I'll see if I can get it in before the final 3.2.0 release.

Craig



[BackupPC-users] BackupPC 3.2.0beta0 released

2009-04-09 Thread Craig Barratt
BackupPC 3.2.0beta0 has been released on SF.net.
3.2.0beta0 is the first beta release of 3.2.0.

3.2.0beta0 has several new features and quite a few bug fixes
since 3.1.0.  New features include:

* Added FTP xfer method, implemented by Paul Mantz.

* Added more options to server backup command: rather than just forcing
  an incremental or full backup, a regular (auto) backup can be queued
  (ie: do nothing/incr/full based on schedule), as well as doing just
  an incremental or full or nothing based on the client schedule.
  Based on patches submitted by Joe Digilio.

* Added $Conf{CmdQueueNice} to specify nice level for command queue
  commands (eg: BackupPC_link and BackupPC_nightly).  Suggested by
  Carl Soderstrom.

* Changed bin/BackupPC_dump to not ping or lookup the host if
  $Conf{BackupsDisable} is set.  Requested by John Rouillard.

* Added variable substitution for host, confDir, client in
  RsyncArgs, and also added option RsyncArgsExtra to allow
  easier customization of RsyncArgs on a per-client basis.
  Proposed (with patch) by Raman Gupta.

* Added Xfer error column to the host summary table in the CGI
  interface.  Based on patch submitted by Jan Kratochvíl.

See the ChangeLog below for full details.

Craig

#
# Version 3.2.0beta0, 5 April 2009
#

* Added BackupPC::Xfer::Protocol as a common class for each Xfer
  method.  This simplifies some of the xfer specific code.
  Implemented by Paul Mantz.

* Added FTP xfer method, implemented by Paul Mantz.

* Added BackupPC::Xfer module to provide a common interface to the
  different xfer methods.  Implemented by Paul Mantz.

* Moved setting of $bpc->{PoolDir} and $bpc->{CPoolDir} after the
  config file is read in BackupPC::Lib.  Fix proposed by Tim Taylor
  and Joe Krahn, and rediscovered by several others including
  Holger Parplies.

* Create $TopDir and related data directories in BackupPC_dump
  prior to hardlink test.  Requested by Les Stott.

* Fixed encoding of email subject header in bin/BackupPC_sendEmail as
  suggested by Jean-Claude Repetto.  Also changed $Conf{EMailHeaders}
  charset to utf-8.  Also changed bin/BackupPC_sendEmail to not send
  any per-client email if $Conf{BackupsDisable} is set.

* Modified bin/BackupPC_dump to fix the case of a single partial
  backup followed by a successful incremental resulting in a full
  backup of level 1, rather than level 0.  Reported by Jeff
  Kosowsky.

* Fixed BackupPC::PoolWrite to always create the parent directory.
  This fixed a case with rsync/rsyncd where a file like -i in the
  top-level directory sorts before ., which meant the directory
  creation is after the file creation.  Also PoolWrite errors now
  increment xferError count.  Reported by Jeff Kosowsky.

* BackupPC now gives a more useful error message if BackupPC_nightly
  takes more than 24 hours (ie: when the next one is meant to
  start).  Reported by Tony Schreiner.

* Added more options to server backup command: rather than just forcing
  an incremental or full backup, a regular (auto) backup can be queued
  (ie: do nothing/incr/full based on schedule), as well as doing just
  an incremental or full or nothing based on the client schedule.
  Based on patches submitted by Joe Digilio.

* Modified lib/BackupPC/CGI/RSS.pm to replace \n with \r\n in the RSS
  http response headers.  Patch submitted by Thomas Eckhardt.

* Modified bin/BackupPC_archive to allow the archive request file
  name to contain spaces and dashes, requested by Tim Massey.

* Fix to configure.pl for --no-fhs case to initialize ConfigDir
  from Dan Pritts.  Also changed perl path to #!/usr/bin/env perl.

* Modified bin/BackupPC_archiveHost to shell escape the output file
  name.  That allows it to contain spaces and other special characters.
  Requested by Toni Van Remortel.  Also updated bin/BackupPC_archiveHost
  to shell escape and check other arguments.

* Added $Conf{CmdQueueNice} to specify nice level for command queue
  commands (eg: BackupPC_link and BackupPC_nightly).  Suggested by
  Carl Soderstrom.

* Added --config-override to configure.pl, allow config settings to be
  set on the command line.  Proposed by Les Stott and Holger Parplies.

* Moved call to NmbLookupFindHostCmd in BackupPC_dump to after the
  check of whether a backup needs to be done.  This makes wakeonlan
  work correctly, rather than waking up the client every WakeupSchedule.
  Reported by David Lasker.

* Improved settings for compression and compext in BackupPC_archiveStart
  based on compression type, as proposed by Paul Dugas.  compext is now
  empty, .gz or .bz2 based on ArchiveComp.

* Changed bin/BackupPC_dump to not ping or lookup the host if
  $Conf{BackupsDisable} is set.  Requested by John Rouillard.

* Changed BackupPC_tarCreate to disable output of final nulls in
  tar archive when -l or -L option is used.  Reported by John
  Rouillard.

Re: [BackupPC-users] trashClean won't clean

2009-04-08 Thread Craig Barratt
Madcha writes:

 For a few days now, trashClean has been starting but won't clean old
 backups, and I don't understand why.
 There is nothing in the trash folder.

That means BackupPC_trashClean is working: its job is to remove
everything that appears in $TopDir/trash.

 Perhaps that is why it cleans nothing.
 Is it possible to manually clean up old backups?
 Is it possible to manually start BackupPC_trashClean so that it
 rescans what should be deleted?
 Where can I look to find the error?

Perhaps you mean that backups aren't being deleted as you expect?

Can you show a specific example of a set of backups where you
expect a backup to be deleted that isn't?  Please include your
$Conf{FullKeepCnt}, $Conf{IncrKeepCnt} and related settings.

Craig



Re: [BackupPC-users] Try to backup non existent files....

2009-04-08 Thread Craig Barratt
Mirco,

 Here error log, mailed from BackupPC:
 
 The following hosts had an error that is probably caused by a
 misconfiguration.  Please fix these hosts:
  - elpra01lc (Call timed out: server did not respond after 2
 milliseconds opening remote file \ELPRA01WS\ELPRA06SV\E
 (\ELPRA01WS\ELPRA06SV\))

Your source.gif screen shot didn't show the contents of
\ELPRA01WS\ELPRA06SV.  You need to descend into that directory.
But I believe you that there is no file \ELPRA01WS\ELPRA06SV\E.

Typically this timeout error is due to antivirus software taking
too long.

However, if smbclient claims there is a file that doesn't exist you
might try two things:

  - run smbclient manually (in interactive mode), cd to that
    directory, and see what ls shows

  - run chkdsk - perhaps your client file system is corrupted?

Craig



Re: [BackupPC-users] Next release

2009-04-06 Thread Craig Barratt
Obj writes:

 I am running version 3.2.0.

Actually you are running CVS.

 Can someone tell me why
 $Conf{BackupFilesExclude} is not working?  It still backs up all Temp
 folders, and .mp3 files, etc.  The backup method is SMB.

Can you send the first few lines for the XferLOG file?

Craig



Re: [BackupPC-users] Next release

2009-03-30 Thread Craig Barratt
Pedro writes:

 Those are good news, where can we see about new stuff is in this upgrade?

Here is the current ChangeLog.  This should be pretty much what is
in 3.2.0beta0.

Craig

* Added BackupPC::Xfer::Protocol as a common class for each Xfer
  method.  This simplifies some of the xfer specific code.
  Implemented by Paul Mantz.

* Added FTP xfer method, implemented by Paul Mantz.

* Added BackupPC::Xfer module to provide a common interface to the
  different xfer methods.  Implemented by Paul Mantz.

* Moved setting of $bpc->{PoolDir} and $bpc->{CPoolDir} after the
  config file is read in BackupPC::Lib.  Fix proposed by Tim Taylor
  and Joe Krahn, and rediscovered by several others including
  Holger Parplies.

* Create $TopDir and related data directories in BackupPC_dump
  prior to hardlink test.  Requested by Les Stott.

* Fixed encoding of email subject header in bin/BackupPC_sendEmail as
  suggested by Jean-Claude Repetto.  Also changed $Conf{EMailHeaders}
  charset to utf-8.  Also changed bin/BackupPC_sendEmail to not send
  any per-client email if $Conf{BackupsDisable} is set.

* Modified bin/BackupPC_dump to fix the case of a single partial
  backup followed by a successful incremental resulting in a full
  backup of level 1, rather than level 0.  Reported by Jeff
  Kosowsky.

* Fixed BackupPC::PoolWrite to always create the parent directory.
  This fixed a case with rsync/rsyncd where a file like -i in the
  top-level directory sorts before ., which meant the directory
  creation is after the file creation.  Also PoolWrite errors now
  increment xferError count.  Reported by Jeff Kosowsky.

* BackupPC now gives a more useful error message if BackupPC_nightly
  takes more than 24 hours (ie: when the next one is meant to
  start).  Reported by Tony Schreiner.

* Added more options to server backup command: rather than just forcing
  an incremental or full backup, a regular (auto) backup can be queued
  (ie: do nothing/incr/full based on schedule), as well as doing just
  an incremental or full or nothing based on the client schedule.
  Based on patches submitted by Joe Digilio.

* Modified lib/BackupPC/CGI/RSS.pm to replace \n with \r\n in the RSS
  http response headers.  Patch submitted by Thomas Eckhardt.

* Modified bin/BackupPC_archive to allow the archive request file
  name to contain spaces and dashes, requested by Tim Massey.

* Fix to configure.pl for --no-fhs case to initialize ConfigDir
  from Dan Pritts.  Also changed perl path to #!/usr/bin/env perl.

* Modified bin/BackupPC_archiveHost to shell escape the output file
  name.  That allows it to contain spaces and other special characters.
  Requested by Toni Van Remortel.

* Added $Conf{CmdQueueNice} to specify nice level for command queue
  commands (eg: BackupPC_link and BackupPC_nightly).  Suggested by
  Carl Soderstrom.

* Added --config-override to configure.pl, allow config settings to be
  set on the command line.  Proposed by Les Stott and Holger Parplies.

* Moved call to NmbLookupFindHostCmd in BackupPC_dump to after the
  check of whether a backup needs to be done.  This makes wakeonlan
  work correctly, rather than waking up the client every WakeupSchedule.
  Reported by David Lasker.

* Improved settings for compression and compext in BackupPC_archiveStart
  based on compression type, as proposed by Paul Dugas.  compext is now
  empty, .gz or .bz2 based on ArchiveComp.

* Changed bin/BackupPC_dump to not ping or lookup the host if
  $Conf{BackupsDisable} is set.  Requested by John Rouillard.

* Changed BackupPC_tarCreate to disable output of final nulls in
  tar archive when -l or -L option is used.  Reported by John
  Rouillard.

* Added error check in BackupPC::Xfer::RsyncFileIO after call to
  BackupPC::Xfer::RsyncDigest->digestStart(), reported by Jeff
  Kosowsky.

* Added variable substitution for host, confDir, client in
  RsyncArgs, and also added option RsyncArgsExtra to allow
  easier customization of RsyncArgs on a per-client basis.
  Proposed (with patch) by Raman Gupta.

* Added Xfer error column to the host summary table in the CGI
  interface.  Based on patch submitted by Jan Kratochvíl.

* Minor fix to sprintf arguments in BackupPC::Attrib, reported by
  Jonathan Kamens.

* Fixed sort compareLOGName syntax in bin/BackupPC for perl 5.10.x,
  reported by Jeff Kosowsky and Holger Parplies.

* Fixed bin/BackupPC_archiveStart to set compression correctly,
  and also set the file extension to .gz when compression is on.
  Reported by Stephen Vaughan.

* Fixed netbios name comparison in bin/BackupPC_dump and
  bin/BackupPC_restore to just use the first 15 characters
  of the host name.  Patch from Dan MacNeil.

* Fixed nmblookup parsing in BackupPC::Lib::NetBiosInfoGet to ignore
  entries with the GROUP tag.  Based on patch from Dan MacNeil.

* Fixed BackupPC_dump so that the XferLOG file is saved when
  DumpPreUserCmd fails.  Reported by John Rouillard.

* Updated BackupPC.pod for $Conf{BackupsDisable}, 

Re: [BackupPC-users] Next release

2009-03-30 Thread Craig Barratt
Tomasz writes:

 Are there any plans to update File-RsyncP to make it compatible with
 newer rsync protocol versions?

I'm experimenting with FUSE to see if native rsync3 + FUSE will
be the best path.  Otherwise, yes, I will update File-RsyncP.

Craig



Re: [BackupPC-users] UPDATED: Fully automated script for creating shadow copies and launching rsyncd

2009-03-30 Thread Craig Barratt
David writes:

 The short story is that you need to configure BackupPC to wake up only
 once per day, in order for wakeonlan to work in a reasonable manner.

That should be fixed in 3.2.0beta0:

* Moved call to NmbLookupFindHostCmd in BackupPC_dump to after the
  check of whether a backup needs to be done.  This makes wakeonlan
  work correctly, rather than waking up the client every WakeupSchedule.
  Reported by David Lasker.

Craig



Re: [BackupPC-users] Next release

2009-03-30 Thread Craig Barratt
Jeff writes:

 Sounds cool... I imagine this is in line with the thread we had a few
 months ago.

Yes, that's right.  I want to be sure the performance and reliability
are high enough before making the decision.

Craig



Re: [BackupPC-users] BackupPC_tarCreate generats corrupted file listings (ver 3.1.0)

2009-03-26 Thread Craig Barratt
John,

 I am seeing corrupted directory listings using BackupPC_tarCreate. One
 of the reported filenames has a bunch of nulls in the middle of it
 using BackupPC-3.1.0.

I'd like to get to the bottom of this.  Let's take this off list.

It would be great if you could get this to happen on as small an
archive as possible, and if possible to email to me (I'm hoping
you can make it happen on non-confidential files, ideally existing
open-source public files).  If I can't replicate it I will propose
some additional debugging code you could run.

Craig



Re: [BackupPC-users] Next release

2009-03-26 Thread Craig Barratt
J:

 Is it possible to get a CVS copy?
 
 I tried: cvs -z3.2
 -d:pserver:anonym...@backuppc.cvs.sourceforge.net:/cvsroot/backuppc co
 BackupPC
 
 ...but received the dreaded __CONFIGURE_BIN_LIST__ error when I ran
 the ./configure.pl

You need to read CVS_README (actually I need to update this since makeDist
now takes arguments for the release name and date).

Quick summary: you need to run makeDist to create a tarball
release.  configure.pl doesn't run straight out of CVS, but
will run from the tarball.

Craig



Re: [BackupPC-users] External WD Worldbook Device

2009-03-26 Thread Craig Barratt
Les writes:

 I would absolutely love it if the top level directories were still
 created by backuppc first before doing the hardlink test. If those
 directories are created because they dont exist and the hardlink
 test fails then just remove the directories. Or leave them there,
 all you've done is create a few directories.

It's fixed in CVS and will be in 3.2.0:

* Create $TopDir and related data directories in BackupPC_dump
  prior to hardlink test.  Requested by Les Stott.

Craig



Re: [BackupPC-users] Local backups

2009-03-23 Thread Craig Barratt
Paul writes:

 Here's what I'm getting:
 
   full backup started for directory /etc (baseline backup #50)
   Running: /usr/bin/rsync --server --sender --numeric-ids --perms --owner 
 --group -D --links --hard-links --times --block-size=2048 --recursive 
 --filter dir-merge\\\ /.backuppc-filter --ignore-times . /etc/
   Xfer PIDs are now 24521
   Got remote protocol 1852534357
   Fatal error (bad version): Unknown filter rule: `dir-merge\ 
 /.backuppc-filter'

Ah, yes, you are requesting an extra layer of escaping with:

$Conf{RsyncClientCmd} = '$rsyncPath $argList+';

Since you are executing rsync directly, you should use this:

$Conf{RsyncClientCmd} = '$rsyncPath $argList';
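
For contrast, a sketch of the two forms (the ssh command line shown is
illustrative, based on the stock remote-client setting):

```perl
# Remote client via ssh: the trailing '+' on $argList requests an extra
# layer of shell escaping, because the remote shell re-parses the command.
$Conf{RsyncClientCmd} = '$sshPath -q -x -l root $host $rsyncPath $argList+';

# Local client: rsync is exec'd directly with no shell in between, so the
# arguments must not be escaped a second time.
$Conf{RsyncClientCmd} = '$rsyncPath $argList';
```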

Craig



Re: [BackupPC-users] issues with backuppc

2009-03-22 Thread Craig Barratt
Xavier writes:

 *) BackupPC didn't work correctly on one host

 #86 was supposed to be a full backup but when browsing I found out that it's 
 missing a lot of directory ( /bin /home ...)
 
 size of backup# on disk
 2,8G 86
 9,3G 87
 9,4G 88
 
 Moreover, when trying to read the log file to find out anything, the XferLOG
 file seems corrupted
 
 # /usr/share/backuppc/bin/BackupPC_zcat XferLOG.86.z
 /usr/share/backuppc/bin/BackupPC_zcat: can't uncompress XferLOG.86.z
 
 # hexdump -C XferLOG.86.z | head -n5
   00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  ||
 *
 006b1970  77 52 2f 25 4f 82 cf bc  a3 61 3f 85 2d 92 77 e7  |wR/%Oa?.-.w.|
 006b1980  4d cd 60 fd ef 9e 04 7f  ff ec ce fc 3b f5 cd 7c  |M.`.;..||
 006b1990  9a c1 82 3d 09 c2 37 ba  7b 27 af fa a2 ff 7a 02  |...=..7.{'z.|

The XferLOG.86.z file shouldn't start with a large block of
0x0 bytes.  The most likely cause is file system corruption.

Craig



Re: [BackupPC-users] Local backups

2009-03-22 Thread Craig Barratt
Paul writes:

 I tried just changing 'RsyncClientCmd' to $rsyncPath $argList+ but it
 seems BackupPC is expecting the SSH and is now improperly escaping
 'RsyncArgs'.  The hitch is with a space in one.
 
 $Conf{RsyncClientCmd} = '$rsyncPath $argList+';
 $Conf{RsyncArgs} = [
   '--numeric-ids',
   '--perms',
   '--owner',
   '--group',
   '-D',
   '--links',
   '--hard-links',
   '--times',
   '--block-size=2048',
   '--recursive',
   '--filter',
   'dir-merge /.backuppc-filter'
 ];
 
 The last two are an addition I use to allow users to exclude files from
 the backup with properly crafted .backuppc-filter files in their
 directories.
 
 I end up with an extra slash after the dir-merge.
 
 /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group
 -D --links --hard-links --times --block-size=2048 --recursive --filter
 dir-merge\ /.backuppc-filter --ignore-times . /etc/

The \ is to escape the following space, since it is a single argument.

Actually, BackupPC doesn't execute the command as printed.  Rather,
it keeps the arguments as an array and passes them directly to exec().
So rsync should see a single argument:

dir-merge /.backuppc-filter

right after '--filter'.

For logging purposes, the command is turned into a single string and
escaped as though it was going to be executed by a shell (although it
isn't). That's when the \ gets added.

Craig



Re: [BackupPC-users] file name mangling and unmangling

2009-03-09 Thread Craig Barratt
Pramathesh writes:

 The documentation on the backuppc mentions that old unmangled file
 names are still supported by the CGI interface. However, I have not been
 able to figure out how and where this option can be set.

What that means is backups taken with very old versions of BackupPC
(when file mangling was off or not implemented, prior to 1.4.0) will
still be displayed correctly in the CGI interface.

Craig



Re: [BackupPC-users] BackupPC not recording DumpPreUserCmd in XferLOG if non-zero exit is not ignored.

2009-02-25 Thread Craig Barratt
John writes:

 Can anybody confirm that xferlogs are not being written if
 DumpPreUserCmd exits non-zero with $Conf{UserCmdCheckStatus} = 1? Also
 does anybody know if it is fixed in a subsequent release?

Yes, this looks like a bug.  An error will be written to the per-client
LOG file.  But in this case the XferLOG file isn't closed before exit,
so the cached compressed data isn't written, nor is the file renamed to
XferLOG.bad.z.  So as you noticed you just have an empty XferLOG.z file.

It's very easy to fix... in BackupPC_dump just call BackupFailCleanup()
instead of exit(1) in the cases where DumpPreUserCmd (or any *UserCmd)
fails.

Craig



Re: [BackupPC-users] Cygwin-rsyncd ignoring large files

2009-02-24 Thread Craig Barratt
Ski writes:

 I have a windows client that has been working fine for over a year and
 now there are three files in the 6 - 7GB range that it just ignores.  I
 am using cygwin-rsyncd 2.6.6 and backuppc 2.1.2.  I was able to force a
 backup of one large file by excluding all other directories except for
 the one with the large file in it.  Then it backed up fine.
 
 Appreciate any ideas or hints on what could be happening and what I need
 to do to fix it.

First, it would be helpful if you can reduce the backup to a
relatively small set of files (a few thousand is still ok) that
still shows the problem.

Then set $Conf{XferLogLevel} to 5, email me the resulting XferLOG.z
file and tell me which file is skipped.

Craig



Re: [BackupPC-users] move a specific backup (share) from one pool to another pool

2009-02-23 Thread Craig Barratt
sabujp writes:

 In the last command that runs BackupPC_tarPCCopy, does this perl command look 
 at any of the configuration files on the local host or does it just get what 
 it needs to re-generate the hard links straight from the old pc directory? 
 I looked through the code and don't see that it does look at the local 
 system's backuppc configuration files under /etc/BackupPC .

It shouldn't look at the config files beyond extracting the TopDir.

 What I'm doing is the following: I have two BackupPC servers host1 and host2. 
 I'm copying the BackupPC directories from a SAN (san2) attached to host2 to 
 the SAN attached to host1 (san1) . Both SANs are available to both hosts 
 through a FC switch so I just umount it on host2 and re-mount it on host1. I 
 then copy the cpool and pool directories to host1's san1. After this is 
 finished I plan on running BackupPC_tarPCCopy on host1 (san1) against the pc 
 directory on san2. Just wanted to make sure BackupPC_tarPCCopy wasn't going 
 to look at host1:/etc/BackupPC/hosts and try something funny .

I'm not sure I fully understand what you are trying to do, but it is
critical that the cpool directories are identical on the source and
target trees before running BackupPC_tarPCCopy.  That's because the
hardlink structure is detected on the source tree, and just the cpool
path is sent to the target tar -x so it can create the same hardlink.
The cpool files themselves are not transferred by BackupPC_tarPCCopy.

I don't think there is a way to transfer a single host's backups using
BackupPC_tarPCCopy.

Craig



Re: [BackupPC-users] Child is aborting during backup but backuppc reports a success

2009-02-23 Thread Craig Barratt
Matthias writes:

 I backup a windows client with rsyncd over ssh. I am pretty sure the ssh
 connection was interrupted at 23:27.
 In the /var/lib/backuppc/pc/st-ms-wv/XferLOG.0.z I found the error message:
   create   770 4294967295/4294967295  240986 
 Help/Windows/de-DE/artcone.h1s
 Read EOF:
 Tried again: got 0 bytes
 finish: removing in-process file Help/Windows/de-DE/artconm.h1s
 Child is aborting
 Done: 1539 files, 573232406 bytes
 
 but in the /var/lib/backuppc/LOG.0.z
 2009-02-16 20:36:06 Started full backup on st-ms-wv (pid=615, share=WINDOWS)
 2009-02-17 01:05:42 Finished full backup on st-ms-wv
 2009-02-17 01:05:42 Running BackupPC_link st-ms-wv (pid=9360)
 2009-02-17 01:06:05 Finished st-ms-wv (BackupPC_link st-ms-wv)
 
 and backuppc believe that this first full backup was successful. But it 
 wasn't.
 It aborts somewhere in the middle.
 
 Is it possible to recognise this situation and report the right status?

Yes, this looks like a bug where the child gets an error but doesn't
correctly communicate that status to the parent.  I've added this to
my todo list.

Craig



Re: [BackupPC-users] BackupPC_archiveStart tweak

2009-02-23 Thread Craig Barratt
Paul writes:

  In case this is of use to others, I tweaked the BackupPC_archiveStart
 script to properly (IMHO) deal with the ArchiveComp setting.  While my
 coding style may be icky to some, I think my removal of the .raw file
 extension for uncompressed archive files may be of issue to others.

Thanks for the patch.  I'll add this to the next release.

Craig



Re: [BackupPC-users] BackupPC calls NmbLookupFindHostCmd every hour messing up wakeonlan

2009-02-23 Thread Craig Barratt
David writes:

 I took a closer look at the perl code and I see the cause of the problem.
 Please note I have no DNS. My PCs use DHCP, but are configured in BackupPC
 with the host table's DHCP flag set to zero.
 
 Here is what I think is happening:
 
 1. BackupPC_dump is called periodically at each WakeupSchedule interval.
 2. BackupPC_dump calls NetBiosHostIPFind which uses NmbLookupFindHostCmd to
 lookup my hosts' Samba names, thereby powering them on.
 3. It then determines whether or not the host needs to be backed up.
 
 At this point the only solution I can think of is to modify BackupPC_dump's
 perl code to reverse the order of steps 2 and 3 above, and only invoke
 wake-on-lan if a backup is due.
 
 I am hoping a more experienced BackupPC user has a better solution that
 doesn't involve modifying the perl code.

You're right - these two steps are backwards and it should be fine
to do them in the order you propose.  However, I can't suggest a
way to do it without modifying the code.  I will fix this.

An alternative is just doing a single wakeup each night, but that
won't work for laptops that aren't connected overnight.  Or you
could do two: one at night and one during the day, minimizing
the likelihood of an extraneous power up.
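
The two-wakeup workaround is just a schedule change; a hypothetical
example (the hours chosen are illustrative):

```perl
# One overnight wakeup plus one mid-day wakeup (hours on a 24-hour
# clock), limiting wake-on-LAN probes to twice a day instead of once
# per WakeupSchedule entry.
$Conf{WakeupSchedule} = [1, 13];
```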

Craig



Re: [BackupPC-users] Problem, Backuppc is not executing a defined command before running a backup

2009-02-22 Thread Craig Barratt
James writes:

 I wanted to be able to notify a group of people when any backup starts
 or ends, so I did some googling and found an archived email on this
 list about how I might do it.

 I made a script with the following contents as a test and named
 it startbkpemail.sh (just used the example name from the post)
 and added the line '/usr/local/bin/startbkpemail.sh $host
 $type';  in the DumpPreUserCmd field in the host config - but
 it is not executing it. Any suggestions? The permissions on the
 file are 777 for testing so I know that is not the problem.

Is there any error message in the XferLOG file?  Does it say it is executing
your DumpPreUserCmd?

Craig



[BackupPC-users] Volunteer graphic artist for BackupPC?

2009-02-16 Thread Craig Barratt
BackupCentral.com has generously offered to contribute some free
banner ads for BackupPC on their site.

To take advantage of this offer I need someone with some graphic
skills to generate a couple of banner images with particular
geometries.  If you are willing to contribute some time please
email me and I can send you the specific guidelines.

Thanks,
Craig



Re: [BackupPC-users] 38GB file backup is hanging backuppc (more info and more questions)

2009-02-12 Thread Craig Barratt
Tony writes:

 I missed the original post, but  I run rsync with the --whole-file
 option, but I still get RStmp files, is that not supposed to happen?

RStmp is a temporary file used to store the uncompressed pool file,
which is needed for the rsync algorithm.  It's only used for larger
files - smaller files are uncompressed in memory.

RStmp is independent of --whole-file.

Craig



Re: [BackupPC-users] Revamping the Wiki

2009-02-12 Thread Craig Barratt
Cody writes:

 I'd be willing to do a lot of the cleaning myself, though I don't want
 to step on anyone's toes without talking with you first. Also, my
 knowledge of BackupPC is fairly limited to my setup (XP/Vista clients and
 Ubuntu server).

I agree it isn't very well organized.  I don't think anyone has
been keeping up the overall structure.  I'd be happy for you to
make the changes you want.

Craig



Re: [BackupPC-users] multiple pre/post commands

2009-02-03 Thread Craig Barratt
Nick writes:

 Ive tried putting ; in between the two commands, ive tried  and 
 as well, with  in there, it seems to run both commands but the
 variables arent being pulled from backuppc, so the email doesnt work
 correctly.  and also the script that runs my vshadow commands doesnt
 seem to be working correctly either now that ive checked it.
 ive also tried hand editing the config.pl file and using  around
 each command and that doesnt work either, and usingand ;

BackupPC doesn't use a shell to execute any external commands,
so you can't use shell syntax for multiple commands.

If you need to run multiple commands you should include them
in a single shell script and point the *Cmd config variable
to that script.
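
As a sketch (the script name and its contents are hypothetical), the
wrapper approach looks like this:

```perl
# Point the config at one wrapper script; BackupPC exec()s it directly.
# /usr/local/bin/pre-dump.sh would contain the individual shell commands
# in sequence (e.g. the vshadow step followed by the notification email).
$Conf{DumpPreUserCmd} = '/usr/local/bin/pre-dump.sh $host $type';
```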

Craig



Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Craig Barratt
Brian writes:

  *   0 pending backup requests from last scheduled wakeup,
  *   0 pending user backup requests,
  *   0 pending command requests,
  *   Pool is 0.00GB comprising 0 files and 0 directories (as of 1/29 01:00),
  *   Pool hashing gives 0 repeated files with longest chain 0,
  *   Nightly cleanup removed 0 files of size 0.00GB (around 1/29 01:00),
  *   Pool file system was recently at 83% (1/29 18:59), today's max is 83% 
 (1/29 17:08) and yesterday's max was 82%.

It looks like BackupPC_nightly is failing to traverse the pool.
This could be due to a bug in IO::Dirent that causes it to fail
on certain file systems.  There is a test in 3.1.0 to check if
IO::Dirent works, but it checks ".", not $TopDir.  That bug is
fixed in CVS.

Do you have IO::Dirent installed and is your pool on XFS?

If so, change this line in lib/BackupPC/Lib.pm:

$IODirentOk = 1;

to:

$IODirentOk = 0;

Craig



Re: [BackupPC-users] rsync exclude syntax

2009-01-15 Thread Craig Barratt
Jean-Michel writes:

 $Conf{BackupFilesExclude} = [
 '/Users/garant/Library/Preferences/ByHost/*00224126372e.plist' ];
 
 notice the  wildcard '*' in the file list...
 
 but it seems that BackupPC_dump  stats the file  BEFORE to exclude the
 file from backup because there is a failed to open message...
 
  log -
 incr backup started back to 2009-01-12 10:05:19 (backup #30) for
 directory /Users/garant
 Running: /usr/bin/ssh -q -x -l garant 130.104.58.195 /usr/bin/rsync
 --server --sender --numeric-ids --perms --owner --group --devices
 --links --times --block-size=2048 --recursive . /Users/garant/

Excludes are relative to the share name, which is /Users/garant
in this case.  So you should use this:

$Conf{BackupFilesExclude} = [
'/Library/Preferences/ByHost/*00224126372e.plist' ];

Craig



Re: [BackupPC-users] Exclude files

2009-01-15 Thread Craig Barratt
Christian writes:

 I'm having some issues with excluding directories.
 
 If have the following settings in the host.pl:
 =snip===
 $Conf{RsyncShareName} = [
 '/',
 '/srv'
 ];
 $Conf{BackupFilesExclude} = {
 'srv' => [
 'file1',
 '/bu'
 ]
 };

You have a typo.  The share name is /srv, not srv, so you need:

 $Conf{BackupFilesExclude} = {
 '/srv' => [
 'file1',
 '/bu'
 ]

Also, you are excluding files below /srv, but not /.  You need
to either restrict rsync with --one-file-system or exclude /srv
under /.
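
Putting both corrections together, the per-host config would presumably
look like this:

```perl
$Conf{RsyncShareName} = ['/', '/srv'];
$Conf{BackupFilesExclude} = {
    '/'    => ['/srv'],            # don't back up /srv a second time under /
    '/srv' => ['file1', '/bu'],
};
```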

Craig



Re: [BackupPC-users] Copying in a file instead of backing up?

2009-01-14 Thread Craig Barratt
Jeff writes:

 Are you sure that you can't get rsync to calculate the checksums (both
 block and full-file) before file transfer begins -- I don't know I'm
 just asking..

I believe rsync's --checksum option precomputes and sends the whole-file
checksum (which, as has been noted, is different from BackupPC's
partial-file checksum).

Currently File::RsyncP doesn't handle that option.  Also, it requires
two passes on each changed file on the client.

Craig



Re: [BackupPC-users] BackupPc tape config

2009-01-13 Thread Craig Barratt
Sil writes:

 $Conf{ArchiveClientCmd} = '$Installdir/bin/BackupPC_archiveHost'   =
 add -b 10
. ' $tarCreatePath $splitpath $parpath $host $backupnumber'
. ' $compression $compext $splitsize $archiveloc $parfile *';
 
 I don't know how to write this, or where to place it.
 Has someone done this before?  Is there any other documentation?

The idea is you can copy and edit BackupPC_archiveHost to add
or change any of the program arguments.  You can then change
$Conf{ArchiveClientCmd} to point at the name of your new script.

Craig



Re: [BackupPC-users] how to restore one extra file if a restore is requested

2009-01-10 Thread Craig Barratt
Matthias writes:

 If a user requests a restore I want to restore one extra file and handle it
 by the RestorePostUserCmd.
 Is it possible to request this additional restore with BackupPC_restore
 during the RestorePreUserCmd or RestorePostUserCmd ?

Yes, you could do it by emulating what the CGI interface does.

You will need to write a perl script that would create a restore
request file using Data::Dumper (see the CGI code for an example),
and then issue a restore command using BackupPC_serverMesg.

Be careful you don't end up with an infinite loop, where the
second restore triggers a third, etc...

See bin/BackupPC_archiveStart for an example of the basic steps,
since this script does a similar thing for archives as you want
to do with restores.

Craig



Re: [BackupPC-users] incremental tar xfer errors

2009-01-07 Thread Craig Barratt
Simone writes:

 I got a strange problem doing incrementals with tar over ssh using
 --newer=$incrDate+. It seems to be an escaping problem with part of the
 time reference for the incremental.

Yes, the escaping isn't happening.  The $incrDate+ form means
to escape the value, so that is what you should use (since you
are running through ssh).

Are you sure $Conf{TarIncrArgs} includes --newer=$incrDate+ rather
than --newer=$incrDate?  Have you checked the per-client config too?
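
For reference, the escaped form looks like this (this matches the stock
3.x default as far as I know - treat the exact string as an assumption
and check config.pl on your server):

```perl
# The trailing + tells BackupPC to shell-escape the substituted value,
# which is needed because the command runs through ssh.
$Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
```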

Craig



Re: [BackupPC-users] I received the error No files dumped for share

2009-01-07 Thread Craig Barratt
Omar writes:

 $Conf{TarClientCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f -
 -C $shareName+'
 . ' --totals';
 
 $Conf{TarClientRestoreCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -x -p
 --numeric-owner --same-owner'
. ' -v -f - -C $shareName+';

Both of these are wrong - they start with a space, so BackupPC
doesn't know what program to exec.

You need something like:

$Conf{TarClientCmd} = '/usr/bin/sudo env LC_ALL=C $tarPath -c -v -f - -C 
$shareName+ --totals';
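
The restore command needs the same fix - no leading space, and sudo
first so BackupPC knows what to exec.  Something like this (an untested
sketch based on the settings quoted above):

```perl
$Conf{TarClientRestoreCmd} = '/usr/bin/sudo env LC_ALL=C $tarPath -x -p'
                           . ' --numeric-owner --same-owner'
                           . ' -v -f - -C $shareName+';
```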

Craig



Re: [BackupPC-users] no files dumped for share c$

2009-01-06 Thread Craig Barratt
Gilles writes:

 tree connect failed: NT_STATUS_ACCESS_DENIED

David Kahn reports that this happens with recent versions
of smbclient.  Removing the -N option fixes it.

Can you confirm this fix works for you?

Craig

-- Forwarded message --
To:   backuppc-users@lists.sourceforge.net
From: davidekahn backuppc-fo...@backupcentral.com
Date: Thu, 13 Nov 2008 15:37:00 -0800
Subj: [BackupPC-users]  tree connect failed: NT_STATUS_ACCESS_DENIED


It appears that they have changed the way smbclient works in version 3.2.3, 
and that change is causing this problem.  A problem identical to yours was 
reported as a bug in Ubuntu 8.10 (Intrepid): 
https://bugs.launchpad.net/ubuntu/+source/backuppc/+bug/283652.  However, the 
actual source of the problem is smbclient, which is called by BackupPC.  
Therefore, I reported a second bug, https://bugs.launchpad.net/bugs/297025, 
which will hopefully get the problem fixed.

The solution to your backup problem is to edit /etc/backuppc/config.pl on 
server #2, which is using backuppc version 3.1.0 and smbclient 3.2.3.  Do not 
make this modification to server #1, as it will break it.

There are three strings that you need to modify in config.pl:

$Conf{SmbClientFullCmd}
$Conf{SmbClientIncrCmd}
$Conf{SmbClientRestoreCmd}

which control Samba backups and restores.  In all three strings, remove the -N 
flag.
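
For example, assuming the stock 3.x full-backup command (verify the exact
default string in your config.pl before editing), the change is just
dropping -N:

```perl
# Before (default; -N suppresses the password prompt):
#   '$smbClientPath \\\\$host\\$shareName $I_option -U $userName -E -N'
#       . ' -d 1 -c tarmode\\ full -Tc$X_option - $fileList'
# After (smbclient >= 3.2.3 reads the password from $PASSWD):
$Conf{SmbClientFullCmd} = '$smbClientPath \\\\$host\\$shareName $I_option'
    . ' -U $userName -E -d 1 -c tarmode\\ full -Tc$X_option - $fileList';
```

Make the same change to $Conf{SmbClientIncrCmd} and
$Conf{SmbClientRestoreCmd}.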

My understanding is that the flag is no longer needed, because smbclient 
automatically suppresses the login prompt when BackupPC passes the password 
through the PASSWD environment variable.  But for some unfathomable reason, 
when the -N flag is used, the password does not get passed to Windows' LAN 
Manager.

Good luck.

+--
|This was sent by david.k...@certiby.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--



Re: [BackupPC-users] I received the error No files dumped for share

2009-01-06 Thread Craig Barratt
Sean writes:

 I have tried to do a full backup of a Windows XP PC.  The backup is
 successful, although I get the error No files dumped for share.  What
 is wrong?

The backup isn't successful (since no files were dumped for one (or more)
shares).

Please look at the XferLOG.bad file (which should be quite short) and
if the answer isn't apparent, email the contents of the file (or at
least the first few lines) to this thread.  You should also explain
which XferMethod you are using and the corresponding Share and
Include/Exclude settings.

Craig



Re: [BackupPC-users] configure.pl fails

2009-01-06 Thread Craig Barratt
Kiran writes:

 I am trying to install BackupPC on Ubuntu server edition. I am running the 
 configure command as
 sudo perl configure.pl
 
 it fails with the error message
 
 Making init.d scripts
 can't chown 1000, 1000 init.d/gentoo-backuppc.conf at configure.pl line 1011.
 
 Not sure where the permissions have to be changed. Can somebody help me fix 
 this?

It is trying to change the ownership from root (who is running configure.pl)
to the BackupPC user.  The strange thing is that many chown() calls prior
to this were successful.

Does init.d/gentoo-backuppc.conf exist?  (The path is relative to the
unpacked release - ie: the place you ran configure.pl from.)  What
happens when you manually try to chown it, eg:

sudo chown 1000:1000 init.d/gentoo-backuppc.conf

Craig



Re: [BackupPC-users] Client Updated- how tell BackupPC?

2009-01-04 Thread Craig Barratt
Christian writes:

 2009-01-02 19:56:54 User admin requested backup of ip (ip)
 2009-01-02 19:56:55 Started full backup on ip (pid=26716, share=/)
 2009-01-02 19:56:56 Backup failed on ip (fileListReceive failed)

The most common cause is extraneous output from the client-side ssh
or shell before rsync starts.  Can you send the first few lines of
the XferLOG.bad file?

Craig



Re: [BackupPC-users] BUG with files of form '-i'

2008-12-29 Thread Craig Barratt
Jeff Kosowsky writes:

 I add an (empty) file named '-i' in several of my key Linux
 directories to prevent inadvertent rm * calamities.
 
 However, BackupPC doesn't seem to like this, giving me error messages
 of form:
Can't open /var/lib/BackupPC//pc/mycomputer/new/f%2fetc/f-i for empty
output\n
 
 Then, when I look in the backup directory
/var/lib/BackupPC.jnew.clean/pc/mycomputer/0/f%2fetc
 I cannot find the file '-i' backed up (though everything else is ok).

Sorry about the delay in getting back to you on this.

I'm trying to replicate this but without success.  If I create
an empty -i file in a directory that has other files, then
both full and incremental backups work on my set up.

Let's take this off list and figure out how I can replicate it.
At a minimum, can you restrict the backup to just /etc and
increase XferLogLevel to, say, 6 and send me the XferLOG file?

Craig



Re: [BackupPC-users] Status of fuse for viewing backuppc backups

2008-12-29 Thread Craig Barratt
Jeff Kosowsky writes:

 I had been thinking of writing code to implement a robust fuse
 filesystem for BackupPC backups but then I saw that John Craig (and
 perhaps others) had started to write code.
 
 While the code still seems to be at the proof-of-concept stage, I think
 the idea is very powerful and extensible.

I agree.  I've been following the suggestions and proof-of-concept
code with interest.

I actually believe having a FUSE implementation that supports writing
would be the best way to support rsync 3.x (and any other xfer methods
for that matter).  Assuming the performance was ok, the time-reversed
delta format for storing backups that I'm planning for BackupPC 4.x
would be most easily implemented with FUSE.

I've been working on various CVS checkins for a 3.2.0 release
(finally!), so I haven't had a chance to play with FUSE, other
than installing it and perl FUSE on my CentOS 5.2 system.

One question I'm curious about: if FUSE becomes a required part of
BackupPC 4.x, does that unduly complicate installation or reduce the
number of distros that BackupPC can readily run on?  I realize FUSE
is standard on recent 2.6.x kernels, but CentOS 5.2, as one example,
doesn't enable FUSE, and it was actually quite a pain installing it,
since the rpm package I found didn't install the kernel module.

Craig



Re: [BackupPC-users] Little thoughts to share - RSYNC MAC SSH

2008-12-29 Thread Craig Barratt
Pedro writes:

 After searching for a while and doing some digging I found that I had
 files that would cause ssh to exit; usually you can exit ssh with ~.
 and in fact I had files with that name and content.
 What I did on the backuppc config page was:
 on the main configuration editor, xfer:
 RsyncClientCmd
 $sshPath  -q -x -l root $host $rsyncPath $argList+
 changed to
 $sshPath -e none -q -x -l root $host $rsyncPath $argList+
 RsyncClientRestoreCmd
 $sshPath -q -x -l root $host $rsyncPath $argList+
 changed to
 $sshPath -e none -q -x -l root $host $rsyncPath $argList+
 
 this will avoid escape characters.
 
 What do you guys think of putting this in by default in the next release?

I looked at the source for ssh (specifically OpenSSH 5.1p1) and the
ssh escape character is disabled when ssh is not talking to a tty.
So adding -e none should make no difference.

However, I can't explain how it made a difference for you.

Craig



Re: [BackupPC-users] Feature or bug? Full backup (mis)numbered as level 1

2008-12-29 Thread Craig Barratt
Jeff Kosowsky writes:

 Actually, after doing some subsequent incrementals, I believe it is a
 'bug' and not a feature.

Yes, definitely a bug.  Here's a patch, which will shortly
be in CVS.

Craig

--- bin/BackupPC_dump.orig   2008-12-29 02:09:04.643105800 -0800
+++ bin/BackupPC_dump   2008-12-29 22:46:13.014051700 -0800
@@ -413,7 +413,7 @@
 #
 # Decide whether we do nothing, or a full or incremental backup.
 #
-if ( @Backups == 0
+if ( $lastFullTime == 0
 || $opts{f}
     || (!$opts{i} && (time - $lastFullTime > $Conf{FullPeriod} * 24*3600
          && time - $lastIncrTime > $Conf{IncrPeriod} * 24*3600)) ) {




Re: [BackupPC-users] Executing a restore on a host while an incremental is also running

2008-12-15 Thread Craig Barratt
Chris writes:

 Don't do a direct restore.  Download a Zip or Tar archive.

Or cancel the incremental backup.

Craig



Re: [BackupPC-users] Exclude complexity in version 2.1.2pl1

2008-12-11 Thread Craig Barratt
James writes:

 I have the following config line:
 
 $Conf{BackupFilesExclude} = ['/proc', '/mnt', '/sys', '/home/
 users',  ... , '+ /vz/dump', '/vz/*'];
 
 But there is no /vz/dump in the backups.  What am I doing wrong?

You can't use the rsync include/exclude syntax (+/-) in
$Conf{BackupFilesExclude}.  You can add the corresponding rsync
options to $Conf{RsyncArgs} directly if you want.
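
A hedged sketch of both halves - the plain exclude list, plus the rsync
include/exclude pushed into $Conf{RsyncArgs}.  This assumes a 3.x-style
RsyncArgs array, and note that rsync applies include/exclude rules in
order, so the include must come first:

```perl
$Conf{BackupFilesExclude} = ['/proc', '/mnt', '/sys', '/home/users'];

# The include for /vz/dump must precede the exclude for /vz/*.
push @{$Conf{RsyncArgs}}, '--include=/vz/dump', '--exclude=/vz/*';
```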

Craig



Re: [BackupPC-users] Incremental backups aren't.

2008-12-10 Thread Craig Barratt
Mark writes:

 Just noticed the /var/log/backuppc/LOG file for BackupPC is pumping
 these out mercilessly:
 
  2008-12-10 06:49:02 BackupPC_link got error -4 when calling 
  MakeFileLink(/mnt/backup/pc/shuttle/0/f%2f/fhome/fmadams/f.beagle/fTextCache/f97/f1a4401-b13f-4ebb-bcc9-971de65dae6d,
  6c34911b58a200c2e8615c3d37c8e33d, 1)
 
 And /home/madams/.beagle is supposed to be on the exclude list!

 And my per-pc config. Some of you have seen it before:
 # cat spike.pl
 $Conf{TopDir} = '/mnt/attic/spike_images';
 $Conf{XferMethod} = 'rsync';
 $Conf{BackupFilesExclude} = ['/floppy'];
 $Conf{BackupFilesOnly} = ['/mnt/spike/images', '/mnt/spike/photos'];

As Holger notes you appear to be backing up more than you expect.
The $Conf{BackupFilesOnly} setting obviously isn't working.
Plus you can't set $Conf{TopDir} on a per-client basis, and
as Holger notes you should change TopDir with a symlink.

Also, is there any chance your desktop has nfs mounts of the
server, in particular including the BackupPC store?  That
recursion will make the next incremental quite large.

Craig



Re: [BackupPC-users] backuppc doesn't clean cpool

2008-12-02 Thread Craig Barratt
Cesar,

Are you sure this is 3.0 and not 3.1?

In 3.1 an optimization was added to use IO::Dirent for reading
the inodes in a directory, which on certain filesystems doesn't
work correctly.  If you are running 3.1.0 I would recommend
trying to disable IO::Dirent by changing this line:

$IODirentOk = 1;

at line 83 of lib/BackupPC/Lib.pm to:

$IODirentOk = 0;

Craig



Re: [BackupPC-users] Using perl code within config files

2008-12-02 Thread Craig Barratt
Jeffrey writes:

 Just as an FYI, it is possible to use perl code within config files so
 that you can use a single config file yet still customize
 configurations by pc (or groups of pc's) without having to duplicate
 changes across multiple relatively similar config files each time you
 change a parameter.
 
 For example, I have a number of windows machines and one linux server.
 For the windows machines, I use a single config file in
 /etc/BackupPC/pc and then create links to it for each of my windows
 machines.
 
 I then use perl conditional statements to make some configurations
 unique by pc (or group of pc's).
 
 The trick is that $ARGV[0] is the name of the config file (or link)
 called and thus can be used to determine the machine.
 Personally, I find this trick very helpful...

Yes, a nice trick.  But there are two caveats:

  - the CGI editor won't preserve any of the in-line perl code

  - re-writing a config file (when upgrading with configure.pl and
the CGI editor) might break since the in-line perl formatting
might not be interpreted correctly.
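
For readers who haven't seen it, the trick Jeffrey describes might look
like this in a shared per-PC config file (the host names here are
hypothetical; $ARGV[0] holds the path of the config file or symlink
being loaded):

```perl
# Derive the client name from the config file (or symlink) name.
my $client = $ARGV[0];
$client =~ s{.*/}{};        # strip leading directories
$client =~ s{\.pl$}{};      # strip the .pl extension

if ($client eq 'winbox1') {            # hypothetical Windows client
    $Conf{BackupFilesExclude} = { 'C$' => ['/pagefile.sys'] };
} elsif ($client eq 'linuxsrv') {      # hypothetical Linux server
    $Conf{XferMethod} = 'rsync';
}
```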

Craig



Re: [BackupPC-users] Why is there no '0' in the default $Conf{WakeupSchedule}

2008-12-02 Thread Craig Barratt
Jeffrey writes:

 $Conf{WakeupSchedule} = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
 14, 15, 16, 17, 18, 19, 20, 21, 22, 23];
 
 Is there any reason midnight is left off?

Mostly laziness: I never got around to completely testing the 0 case.
It probably does work, but there are various cases that need to be
checked.  Also, midnight is a common choice for a lot of other
things on a system, so avoiding it isn't a bad idea.

Just consider it a quirk.

Craig



Re: [BackupPC-users] Question about attrib file structure

2008-12-01 Thread Craig Barratt
Jeffrey,

Sorry about the delay in replying.  I've been really busy lately.

   1. If a directory is *empty*, is there any reason for it to have an
  attrib file?

  Because in playing around with creating and deleting directory
  contents, I found that sometimes, even after emptying directory
  contents, subsequent incremental backups may still have (empty)
  attrib files.

I believe there is always an attrib file created.  It would be
a minor optimization not to write empty attrib files.

   2. If not, can I safely erase any (empty) attrib file that has no files
  associated with it?

Yes.  If there is no attrib file the necessary meta-data is created
by reading the directory.  If it is empty then the effect will be the
same as an empty attrib file.

   3. Other than type=10 (delete), is there *any* reason for an attrib
  file to contain an entry for a file that is not present in the
  directory?

No.  The attrib file should only contain entries for files that are
present in the directory, plus type=10 (delete) entries.

 Because, I have found some attrib files on my system in past backups that
  have file entries with type 0 (i.e *not* type 10) yet there is no
  file present in the directory.

After your file system corruption problem was fixed?

   4. If not, can I safely *remove* any non type=10 attrib entry if the
  corresponding file is not in the directory?

Yes.  But this shouldn't happen.

 One more question to add:
 5. Are 'attrib' files also linked into the pool/cpool? If so, are they
 done the same way as regular data files?

Yes, they are pooled in the same way as regular files.

Craig



Re: [BackupPC-users] How do you lock BackupPC from running?

2008-12-01 Thread Craig Barratt
Jeffrey writes:

 For the part of my routines BackupPC_fixLinks and BackupPC_deleteFile
 that actually try to make new links (or delete old ones), I would like
 to make sure that nothing else is creating or deleting links to the
 pool to avoid collisions.
 
 Specifically, I would like to be able to do the following:
 1. Query the server whether it is free
 2. If free tell the server to not launch anything else (i.e place
equivalent of lock file)
 3. Tell server when it can go back to launching other things
(i.e. remove lock file).
 
 I assume that BackupPC_nightly and BackupPC_link must do similar
 things but I couldn't quite figure out how.

First, this should be on the devel list rather than the users list.

The BackupPC server maintains a queue (CmdQueue) for commands
that need to be run one at a time (currently BackupPC_link).
The BackupPC_nightly policy is hardcoded (ie: it runs at the first
wakeup after the current command from CmdQueue is done).  So you
could add a server message that adds your command to CmdQueue;
it will then run serially with the other BackupPC_link commands
and not overlap BackupPC_nightly.

Craig



Re: [BackupPC-users] Exclude definitions not used?

2008-12-01 Thread Craig Barratt
Tino writes:

 The excludes are specified per share. So it should read:
 $Conf{BackupFilesExclude} = {
   '/' => [ '/sys/', '/vz/root/', ... ]
 };
 
 ('/' being your share name here if you use rsync via ssh.)

Yes, you're right.

Also, James, in 3.x the excludes are not passed in the
command-line to the remote rsync.  They are sent via the
pipe to the rsync process.  You need to look in the XferLOG
file to see the excludes.
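
A fuller sketch of the per-share hash form (share names and paths here
are illustrative only):

```perl
# Keys are share names; values are the excludes within that share.
$Conf{BackupFilesExclude} = {
    '/'     => ['/proc', '/sys', '/vz/root'],
    '/home' => ['*/.cache'],
};
```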

Craig



Re: [BackupPC-users] BackupPC Working Well - Except Archiving

2008-11-20 Thread Craig Barratt
Sam writes:

 A copy of BackupPC 3.1 has been obtained from the main Ubuntu
 repository and successfully installed on v 8.04-Server.  This has
 now provided access to the file BackupPC_archiveStart which I
 intend to use via cron as outlined in the BackupPC documentation.

 When executed from the command line this produces an archive in
 .tar.raw format rather than gzip format.  I must be going snow
 blind on this as I simply cannot figure out what configuration
 entries need to be made in which files to achieve this.  I am now
 totally confused and cannot make progress by using the documentation.

 Is anyone willing to give me a hand with this?

BackupPC_archiveStart is hardcoded to have no compression and use
a .raw suffix.  It's pretty easy to see where to edit it to change
that.  As Nils mentioned, you can easily compress that file, or
use BackupPC_tarCreate and compress its output (after all,
BackupPC_archiveStart eventually just runs BackupPC_tarCreate).

Craig



Re: [BackupPC-users] Child Exited Prematurely

2008-11-20 Thread Craig Barratt
James writes:

 The problem we are seeing is that Backups are randomly failing.
 The log file on BackupPC showing something like this:

This is most likely a TCP timeout or other network problem.

Rsync added a TCP keep-alive option in protocol version 29
(if I recall correctly), but that option is not currently supported
by File::RsyncP, which BackupPC uses.

Craig



Re: [BackupPC-users] BackupPC Working Well - Except Archiving

2008-11-18 Thread Craig Barratt
SamK writes:

 I am struggling to create an archive from the command line.  The desired 
 outcome is to emulate the clicking of the Start Archive button in the web 
 page as this method is working perfectly.  The overall objective is to create 
 the archive as a cron job.
 
 From /usr/share/backuppc/bin/BackupPC_archive
 Extract Start
 # BackupPC_archive: Archive files for an archive client.
 #
 # DESCRIPTION
 #
 #   Usage: BackupPC_archive user archiveclient reqFileName
 Extract End
 
 I cannot find the definitions of user archiveclient reqFileName and all 
 permutations of my guesses of what might be required have all failed.  Can 
 anyone help or indicate where the definitions are listed?

BackupPC_archive isn't very easy to use directly: reqFileName is the
path of a file (output from perl's Data::Dumper) with information
about what to archive.

BackupPC_archiveStart is much easier to use.  It creates a reqFile
and then tells the BackupPC server to run the archive.

Craig



Re: [BackupPC-users] How to Prevent Users Starting a Backup

2008-11-14 Thread Craig Barratt
Samk writes:

 I was hoping to find a solution which prevents the user from starting
 a backup but allows for a restore whenever required.

There isn't a configuration option that allows this.

I'd recommend just editing lib/BackupPC/Lang/en.pm (assuming you
are using English), and removing these lines:

\${h2(User Actions)}
p
form name=StartStopForm action=\$MyURL method=get
input type=hidden name=host   value=\$host
input type=hidden name=action value=
\$startIncrStr
input type=button value=\$Lang-{Start_Full_Backup}
 onClick=document.StartStopForm.action.value='Start_Full_Backup';
  document.StartStopForm.submit();
input type=button value=\$Lang-{Stop_Dequeue_Backup}
 onClick=document.StartStopForm.action.value='Stop_Dequeue_Backup';
  document.StartStopForm.submit();
/form
/p

This doesn't actually disable user-requested backups - it just
removes the CGI buttons.  An enterprising user could still
figure out the URL to start a backup.

Craig



Re: [BackupPC-users] Is there any way for BackupPC to restore hard links properly?

2008-11-09 Thread Craig Barratt
Jeffrey writes:

 Looking at the code and the structure of the storage and attrib files,
 it doesn't seem like there is any way for BackupPC to record and
 restore hard links.

Not true.  Hardlinks are stored without using hardlinks.

Hardlinks are stored just like symlinks.  The attribute type is
BPC_FTYPE_HARDLINK and the contents of the file is the path of
the file being linked to.

For example, if there are 4 links to a file, one instance (it
doesn't matter which - depends on the Xfer Method) will be stored
as a regular file.  The remaining three instances will be stored
as type BPC_FTYPE_HARDLINK with pointers to the first file.

Note that this is independent of the hardlinks used to de-duplicate.

There is one subtlety with rsync: it also needs to remember if a
regular file is the target of (other) hardlinks so that the correct
file list can be generated during a restore.  This is done by using
an extra bit in the file's mode stored in the attrib file.

This results in one subtle bug that can't be easily fixed: if you
switch the Xfer method from tar to rsync, old backups that have
hardlinks stored with tar won't be correctly restored with rsync.
The workaround is to generate a tar file and extract it, or to switch
the Xfer method back to tar before you do the restore.  The
opposite case should work correctly.

Craig



Re: [BackupPC-users] Restore Hangs (rsync+ssh)

2008-11-09 Thread Craig Barratt
Rob writes:

 I'm having a problem trying to restore files to my clients from my
 backuppc server. I am using rsync over ssh. The backups work great, but
 when I try to restore a file (I've just been trying to restore /etc/hosts
 to test the setup), it just hangs. I flushed my iptables, so it's not a
 firewall issue. When I run the restore, here are the relevant process
 listings on both the backuppc server (hostname: backup) and the client
 (hostname: sechost) to which I'm attempting to restore:

Is there any useful information in the RestoreLOG file?

Try increasing $Conf{XferLogLevel}.

Craig



Re: [BackupPC-users] What is the best way (using perl) to determine the length of a compressed backup file?

2008-11-02 Thread Craig Barratt
Jeffrey writes:

 I have been considering the following:
 - Uncompressing the full file to determine its length..
 But this is very computationally inefficient for large files...

Right.

 - Unpacking the attrib file...
 This seems best, but I'm not sure what the best/easiest
 subroutines are for parsing the attrib file
 (I looked at Holger's BackupPC_attribCheck and to my naive eyes it
 looked pretty complicated)

Using the attrib file is the best way.  See bin/BackupPC_attribPrint
for an example.

Craig



Re: [BackupPC-users] incremental backups taking longer than full

2008-10-31 Thread Craig Barratt
Mark writes:

 Using rsync between two linux servers, the full took 2.5 hours, the
 incremental backups are taking longer each day.
 
 2008-10-21 23:00:01 full backup started for directory /
 2008-10-22 02:32:55 full backup 0 complete, 294903 files, 203401100538
 bytes, 28 xferErrs (0 bad files, 0 bad shares, 28 other)
 2008-10-22 23:00:00 incr backup started back to 2008-10-21 23:00:01
 (backup #0) for directory /
 2008-10-23 04:59:40 incr backup 1 complete, 541 files, 170443233887
 bytes, 11 xferErrs (0 bad files, 0 bad shares, 11 other)

Each incremental is transferring 170GB.  The full is 203GB.  Do you
have, eg, a large sparse file below /var/log that you should exclude
from the backup?

Les Mikesell notes that:

In particular, many 64-bit linux versions have a /var/log/lastlog
file that appears to be 1.2 terabytes in size as an artifact of
indexing it by uid numbers and using -1 for the nfsnobody id.  It's
generally not important to back this file up, so if that is the
problem you can just exclude it.

If there is a small change to such a file it is quite likely to be
more expensive than the original complete transfer.  (It won't matter
whether it is a full or incr.)
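To check whether a file such as /var/log/lastlog is sparse, compare its apparent size with its allocated blocks. A quick sketch (the demo file is created under /tmp; assumes GNU coreutils):

```shell
# A sparse file's apparent size dwarfs its real disk usage -- this is how
# lastlog can "appear" to be terabytes while occupying almost nothing
rm -f /tmp/sparse-demo
truncate -s 1G /tmp/sparse-demo                  # 1 GiB apparent, no data written
stat -c 'apparent: %s bytes' /tmp/sparse-demo    # 1073741824
stat -c 'allocated: %b blocks' /tmp/sparse-demo  # ~0 blocks on disk
```

If lastlog does turn out to be the culprit, excluding it via $Conf{BackupFilesExclude} avoids the repeated large transfer.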

Craig



Re: [BackupPC-users] 2 cpool files with same checksum, different (compressed content) but same zcatt'ed content?????

2008-10-31 Thread Craig Barratt
Jeff writes:

 Is there a (reasonably easy) way of identifying which ones have the
 rsync checksum seed and which ones don't???

I'm reluctant to even say, because you are heading in an unproductive
direction.  But here goes: a compressed file without checksums starts
with 0x78 and a compressed file with checksums starts with 0xd6 or 0xd7.
See lib/BackupPC/FileZIO.pm.

The file sizes in the example you cite suggest the first has checksums
and the second does not.
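If you do want to script the first-byte check described above, something like this works (the sample file is fabricated here just to exercise the check; on a real system you would point it at a cpool file):

```shell
# First byte of a compressed pool file: 0x78 = plain zlib stream (no rsync
# checksums), 0xd6/0xd7 = checksums appended (see lib/BackupPC/FileZIO.pm)
printf '\170\234' > /tmp/cpool-sample    # octal 170 = 0x78, a zlib header byte
first=$(od -An -tx1 -N1 /tmp/cpool-sample | tr -d ' ')
echo "first byte: 0x$first"              # prints: first byte: 0x78
```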

Les writes:

 I think you are kind of missing the point that they could both be
 corrupted in other ways...

Exactly!  Can we stop this thread please :)?

Craig



Re: [BackupPC-users] What does BackupPC_link got error -4 when calling MakeFileLink mean?

2008-10-30 Thread Craig Barratt
Jeffrey writes:

 So what does -4 mean and what can cause it?

Fails to make a hardlink.  Several possible reasons: you are out
of inodes, your cpool and pc directories are on different file
systems, your BackupPC file system doesn't support hardlinks, or
you have a permissions problem of some kind.

Craig



Re: [BackupPC-users] Duplicate files in pool with same CHECKSUM and same CONTENTS

2008-10-30 Thread Craig Barratt
Jeffrey writes:

 Except that in my case some of the duplicated checksums truly are the
 same file (probably due to the link issue I am having)...

Yes.  Just as Holger mentions, if the hardlink attempt fails,
a new file is created in the pool.  You appear to have some
unreliability in your NFS or network setup.

The only other time identical files will have different pool
entries, as people noted, is when $Conf{HardLinkMax} is hit.
Subsequent expiry of backups might cause the identical files
to move below $Conf{HardLinkMax}.

It's not worth the trouble to try to combine those files since
the frequency is so small and the effort to relink them is very
high.
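For the curious, the link count that BackupPC compares against $Conf{HardLinkMax} is just the file's inode link count, visible with stat (demo files under /tmp are illustrative):

```shell
# Each backup referencing a pool file adds one hardlink; stat %h shows the total
rm -f /tmp/pool-demo /tmp/pool-demo.ref1 /tmp/pool-demo.ref2
echo content > /tmp/pool-demo
ln /tmp/pool-demo /tmp/pool-demo.ref1    # first simulated backup reference
ln /tmp/pool-demo /tmp/pool-demo.ref2    # second simulated backup reference
stat -c %h /tmp/pool-demo                # prints 3
```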

Craig



Re: [BackupPC-users] Pool duplicates and corrupted backups

2008-10-30 Thread Craig Barratt
Jeffrey writes:

 Types of Duplicate checksums:
 1. Same checksum but contents differ -- INTENTIONAL - nothing to fix

Right.

 2. Same checksum and compressed content
 I have found many of these but contrary to my earlier postings
 the ones that I examined were not in my error log so they
 don't seem to be due to the nfs/link problems (see later)
 
 I don't know what the source of these are though...

This could happen if there is ever a read error of a file in the
cpool.  Before linking to a cpool file (except when rsync is used
with checksum caching enabled and the checksums match exactly),
the contents are read, decompressed and compared to the incoming
file.  Any read error will cause the match to fail and a new cpool
file will be written.

 3. Same checksums, different compressed content, same zcatted content
    As per my earlier email, I found at least one of these and
    am not sure why they even exist.

This can happen if one file has rsync checksums and the other does not.
This would happen after case 2 above.  The checksums don't get added 
until the next time the file is read.

This can also happen since the mapping of file -> file.z is one-to-many:
there are many different ways to compress a file that all uncompress
the same.  But that shouldn't happen in BackupPC unless you change
the compression level.

 4. Files with *NO* pool entries.
 This seems to be what occurs with the
 files in my error log that cite the failure of
 MakeFileLink

Yes.  There are cases when it just leaves the file there (below
pc/HOST/nnn) without linking to the pool.
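Files that were left unlinked can be spotted by their link count of 1. A sketch, using a fabricated tree under /tmp (on a real server you would search below TOPDIR/pc/HOST instead):

```shell
# A file never hardlinked into the pool has exactly one directory entry
rm -rf /tmp/pc-demo && mkdir -p /tmp/pc-demo
echo a > /tmp/pc-demo/fpooled
ln /tmp/pc-demo/fpooled /tmp/pc-demo/pool-entry   # simulated pool link
echo b > /tmp/pc-demo/funpooled                   # simulated unlinked file
find /tmp/pc-demo -type f -links 1                # prints only .../funpooled
```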

 The deeper I probe into this, the more confused I get and the more I
 worry about data integrity on my system...

Look, you *do* have a data integrity problem, and yes, you should worry.

There is no further value in understanding symptoms from a relatively
filesystem-intensive application like BackupPC and working backwards.

You need to fix your hardware/software before using BackupPC again.

Craig



Re: [BackupPC-users] BackupFilesExclude not working properly (for me) with Windows/rsyncd

2008-10-28 Thread Craig Barratt
Jeffrey writes:

 I have:
 $Conf{BackupFilesExclude} = [
 ''/Documents and Settings/*/LocalSettings/Temp/*'',
 [snip]

You have two sets of quotes here, so this is excluding this path:

    '/Documents and Settings/*/LocalSettings/Temp/*'

instead of

    /Documents and Settings/*/LocalSettings/Temp/*

I suspect you entered this into the CGI interface:

    '/Documents and Settings/*/LocalSettings/Temp/*'

You should remove the inner sets of quotes.

Craig



Re: [BackupPC-users] Incremental dumps hanging with 'Can't get rsync digests' 'Can't call method isCached'

2008-10-27 Thread Craig Barratt
Jeffrey writes:

 Can't call method "isCached" on an undefined value at
   /usr/share/BackupPC/lib/BackupPC/Xfer/RsyncFileIO.pm line 165.

That isn't good.  This is the case where it is doing a random check
of the cached checksums (based on $Conf{RsyncCsumCacheVerifyProb}).
I am missing an error check just prior to this line of code.  An
error will occur if there is some problem opening/reading the cached
checksum information.  A possible cause might be some file system
corruption.

I'd be curious what the error is.  Just prior to this line
you could add this code:

if ( $err ) {
    $fio->log("Can't get rsync digests from $attr->{fullPath}"
            . " (err=$err, name=$f->{name})");
    $fio->{stats}{errorCnt}++;
    return -1;
}

Alternatively, just set $Conf{RsyncCsumCacheVerifyProb} to 0.  This
code will be skipped, and I suspect it will fail further down and
there is an error check there.  Seeing the error number would be
helpful.

I would also try removing the --checksum-seed=32761 option from
$Conf{RsyncArgs} and trying the backup again.  This will disable
the use of cached checksums.

Craig



Re: [BackupPC-users] How do you set a command to mount nfs backup share?

2008-10-26 Thread Craig Barratt
Jeffrey writes:

 So, ideally, I am looking for a hook to run in advance of any web or
 backup operation that needs access to /var/lib/BackupPC.
 
 Does such a hook exist?
 If not where would be the best place in the code to hook into?

No, there isn't such a hook - BackupPC assumes the pool exists.
Almost every program and CGI request will access some data below
the pool.

You could try putting some code in the BackupPC::Lib initialization,
but you will have to think about what error recovery is appropriate.

Craig



Re: [BackupPC-users] Where do I report problems with this mailing list?

2008-10-26 Thread Craig Barratt
Holger writes:

 So if someone with the power to unsubscribe
 him reads this, please do. Thank you.

Done.

Craig



Re: [BackupPC-users] Cyrillic file names problem

2008-10-23 Thread Craig Barratt
Yuriy writes:

  What does your browser say what the page encoding is?
 
 ISO-8859-1.

This suggests you are running BackupPC 2.x.  Support for
$Conf{ClientCharset} was added in 3.x.  In 3.x all the
server-side and CGI encodings are utf8.

Craig



Re: [BackupPC-users] Feature Request: Link to latest Full

2008-10-16 Thread Craig Barratt
Yazz writes:

 I may be able to just do it with $DumpPostUserCmd but I haven't tested
 that yet. I just think it would be nicer to have it as a built in option.
 
 /bin/ln -sf $topDir/pc/$host/$(/bin/cat $topDir/pc/$host/backups \
 | /bin/grep full | /bin/sort -n | /usr/bin/tail -1 \
 | /bin/awk '{ print $1 }') $topDir/pc/$host/last-full

It's an interesting idea and I'll add it to the todo list for
consideration.

Craig



Re: [BackupPC-users] BackupPC_zipCreate and charset for encoding file names

2008-10-16 Thread Craig Barratt
Fernando writes:

 I'm replying to this old thread because I believe that my question/commentary 
 fits here better than in any other thread regarding this subject.
 
 I'm using BackupPC to back up, among others, a Samba file server that uses an
 ISO8859-1 charset. To get the characters displayed correctly in the command
 line I have to set BackupPC's ClientCharset to ISO-8859-1. That's fine. But 
 when I try to download a file to a Ubuntu/Windows desktop I've got encoding 
 problems.
 
 Using the option -e UTF8 with BackupPC_zipCreate in the command line 
 works. But to get it working from the CGI interface I had to hard-code it
 in lib/BackupPC/CGI/Restore.pm :
 --
 $bpc->cmdSystemOrEvalLong(["$BinDir/BackupPC_zipCreate",
  "-h", $host,
  "-n", $num,
  "-c", $In{compressLevel},
  "-s", $share,
  @pathOpts,
  @fileList,      # add the ","
  "-e", "UTF8"    # HERE
 ],
 sub { print(@_); },
 ---
 
 I'm sure this is not the best way to do it, but I haven't had success
 with anything else I tried. Does anybody know a better way to do this?

I added the command-line argument for charset but didn't implement
a CGI setting.  The problem I have is I can't find any documentation
for zip files and any standards around charset encoding for the
file names in a zip file.  Is utf8 the correct default, or does
it depend on which platform is trying to unpack the zip file?

Craig



Re: [BackupPC-users] Feature Request - longer path name support

2008-10-15 Thread Craig Barratt
Ski writes:

 Running Backuppc 2.1.2 on a debian linux server and backing up a linux
 machine, I ran into a problem backing up a file where the entire path
 length was 318 characters.  Once I shrunk the path length it worked
 perfectly.  Has this been fixed in version 3.1?  If not, could you add
 this to the feature request queue.  Thanks.

No, it hasn't been fixed in 3.x.  There are several areas where the
file name paths can be limited in length (none of them in BackupPC).
I'll add this to the todo list.

Craig



Re: [BackupPC-users] BackupPC_archiveStart not using compression

2008-10-12 Thread Craig Barratt
Stephen writes:

 When I run BackupPC_archiveStart from the command line, it does not use
 compression for the backup; the backup file is created as raw. However,
 if I start the archive job via the web interface it
 uses compression (gzip by default).
 
 The command I am running is:
 
 /usr/share/backuppc/bin/BackupPC_archiveStart offsite-archive backuppc nphts

Looks like a typo in bin/BackupPC_archiveStart.  This line:

compression => $bpc->{Conf}{CatPath},

should be 

compression => $bpc->{Conf}{ArchiveComp},

Or slightly more correctly, if $bpc->{Conf}{ArchiveComp} is 'none'
you should set compression to $bpc->{Conf}{CatPath} instead.

You might also want to change the value of compext 
when compression is on.

Craig



Re: [BackupPC-users] BackupPC_archiveStart not using compression

2008-10-12 Thread Craig Barratt
Stephen writes:

 hmm okay. That sort of worked: I did the first change, but now I'm getting
 an error for gzip:

Set it to the full path, eg: /bin/gzip.

Craig



Re: [BackupPC-users] BackupPC_archiveStart not using compression

2008-10-12 Thread Craig Barratt
Stephen writes:

 Still seems to be creating a .raw archive:

Yes.  The archive is now compressed, in spite of the extension:

/bin/csh -cf /usr/share/backuppc/bin/BackupPC_tarCreate -t -h localhost -n
24 -s \* . | /bin/gzip > /data/NAS-Mount-MaxBackups/offsite/localhost.24.tar.raw

The extension name is hardcoded too.  In my earlier reply I said:

You might also want to change the value of compext
when compression is on.

Currently in bin/BackupPC_archiveStart:

compext => '.raw',

Craig



Re: [BackupPC-users] How to get a remote script to log output in the backup logs

2008-10-05 Thread Craig Barratt
Mark writes:

 I am using backuppc to ssh to a remote host, and use rsync for the
 backups. Before the backup, I have backuppc run a script on the remote
 host to manipulate some database files. It appears that the script is
 executed, but I keep getting a message in the backuppc logs that the
 script failed. I thought the output of scripts run by backuppc were
 recorded in the backup logs.
 
 This is what the log says:
 
 Executing DumpPreShareCmd: '/usr/bin/ssh -q -x -l me remote.host sudo -u
 root my_script';
 Exec of '/usr/bin/ssh -q -x -l me remote.host -u root my_script'; failed

The script fails to run.

I suspect you are using the CGI editor and included the enclosing
quotes and semicolon.  You should eliminate those - they are only
needed if you are editing the config.pl file directly (since that
has to be valid perl code and that setting is a string).

So your command in the CGI editor should look like this:

/usr/bin/ssh -q -x -l me remote.host sudo -u root my_script

Craig



Re: [BackupPC-users] DumpPostShareCmd returned error status 512 but no DumpPostShareCmd defined

2008-10-05 Thread Craig Barratt
Marcel writes:

   - localhost (DumpPostShareCmd returned error status 512)

Look in the XferLOG.bad file for localhost and see what command
was executed.

Craig



Re: [BackupPC-users] What does this mean: Botch on admin job for admin : already in use!! ?

2008-10-02 Thread Craig Barratt
Aleksey writes:

 Hi.  What do these messages in my LOG files, repeating over and over
 again, mean?   Should I be concerned?
 
 2008-09-25 23:13:07 Botch on admin job for  admin : already in use!!
 2008-09-25 23:13:07 Botch on admin job for  admin1 : already in use!!
 2008-09-25 23:18:07 Botch on admin job for  admin2 : already in use!!
 2008-09-25 23:18:07 Botch on admin job for  admin3 : already in use!!
 2008-09-25 23:23:07 Botch on admin job for  admin4 : already in use!!
 2008-09-25 23:23:08 Botch on admin job for  admin5 : already in use!!
 2008-09-25 23:28:07 Botch on admin job for  admin6 : already in use!!
 2008-09-25 23:28:07 Botch on admin job for  admin7 : already in use!!
 
 I searched Google but did not come up with anything.

The error is benign, but it means that the previous
BackupPC_nightly hasn't finished running.  It looks like you
have set $Conf{MaxBackupPCNightlyJobs} to 8.  You are likely
disk seek limited, so that high a value won't help.

It's probably better to increase $Conf{BackupPCNightlyPeriod} to 4
or more.

Craig



Re: [BackupPC-users] next version ?

2008-10-01 Thread Craig Barratt
Philippe writes:

 I went to the sourceforge site and noticed that the last update of the
 CVS was 9 months old. Does it mean that development stopped, or that the
 next version is not on this server?

I don't use the SF CVS site very actively - I tend to do batch updates
closer to the release time.  Other developers coordinate either directly
with me or in some cases do use CVS.

I have been particularly busy this year and my time for BackupPC
development has been reduced.

However, I've been thinking more about major features for BackupPC
4.x and I'm definitely looking forward to being able to spend more
time on this in the future.  I particularly want to change the way
backups are stored so that the only complete/filled backup is the
most recent (whether or not it is a full or incremental), and all
prior backups are stored as reverse-time deltas.  This has a number
of significant benefits.  There has been some good discussion around
this over the last couple of years.

Starting a few months ago Zmanda began funding a developer, Paul Mantz,
to work on BackupPC.  He's developing an FTP XferMethod, and also taking
that as an opportunity to learn more about the architecture.  His work
will likely be the basis of a 3.2 release.

Bottom line, BackupPC development is certainly continuing.

Craig



Re: [BackupPC-users] backup failed (auth failed on module cDrive) SOLVED!!!

2008-09-14 Thread Craig Barratt
Bruno writes:

 I double-checked my host.pl file multiple times but still could not find
 anything wrong with it. So after my Google searches, I tried putting single
 quotes around the values instead of the double quotes. Like this:
 
 $Conf{XferMethod} = 'rsyncd';
 $Conf{RsyncdUserName}  = 'user';
 $Conf{RsyncdPasswd}= 'pass';
 $Conf{RsyncShareName}  = 'cDrive';
 
 After that I was able to backup the windows machine using backupPC :).

Thanks for the closure on this issue.

This does make sense if your password contains characters like '$' that perl
interpolates inside double quotes.  You don't need to confirm that.

You wouldn't have had this problem if you had used the CGI editor.

Craig



Re: [BackupPC-users] Strange behavior when using backuppc with rsyncd or rsync+ssh

2008-09-10 Thread Craig Barratt
Louis-Marie writes:

 I also have a remark about backuppc pooling feature: I think the server
 locally detects file duplicates by hashing them after download. As far
 as I know, rsync should also be able to send some kind of hash from
 remote host before download. Wouldn't it be possible to detect
 duplicates using this hash before the file is downloaded? This would for
 example save downloading files again when a user simply renames a
 directory...

Rsync only efficiently transfers changes to files that have the same
path.  Rsync does not detect renamed files - it does a complete transfer
in that case.  (Some work was done patching rsync to do fuzzy matching
to try to find some renamed files.)  BackupPC uses the efficiency of
rsync transfers, and then globally matches identical files.  A renamed
file will be verbatim transferred and then matched to the pool.

If the rsync transfer determines the file is identical, BackupPC
simply adds a hardlink.  If the rsync transfer notices even a small
change to a file, just the change will be transferred, but then
BackupPC needs to reconstruct the full file to determine whether
a match is in the pool.

Craig



Re: [BackupPC-users] Question regarding BackupPC_tarPCCopy

2008-09-10 Thread Craig Barratt
Hendrik writes:

 Furthermore: Would it be possible to limit the BackupPC_tarPCCopy command to
 one host only?

Yes, this should work, eg:

BackupPC_tarPCCopy /var/lib/backuppc/pc/HOST | (cd /new/backuppc/pc && tar xPf -)

Note: the extract still starts at the pc directory, not pc/HOST.

Using your paths you can do this:

cd /mnt/sda5/BackupPC/pc
/usr/BackupPC/bin/BackupPC_tarPCCopy /var/lib/BackupPC/pc/HOST | tar xvPf -

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] No backups are running automatically.

2008-09-09 Thread Craig Barratt
James writes:

 Manual full and incremental backups work. No backups are running
 automatically. I'm using the default schedule options, I changed the
 default blackout period to a negative value for testing purposes.
 I'm using rsyncd with Windows XP. See attached config.pl

I don't see a problem.  Have you waited at least a day since
your manual backup?

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] nmblookup being invoked for hosts where backups are disabled (bug)

2008-09-04 Thread Craig Barratt
John writes:

 We were having a weird problem with nmblookup firing for a host that
 was disabled with:
 
   $Conf{BackupsDisable} = '2';
 
 using BackupPC 3.1.0. The same thing happend when set to 1. My claim
 is that BackupPC shouldn't be attempting to resolve any host with
 BackupsDisabled set to 2, and it should not be looked up when it is
 set to 1, unless a user explicitly requests that the host be backed
 up. (i.e., it shouldn't be queued just because it's in the hosts file.)

It still gets queued because there might be backups that need to expire.

You are right - the disable check happens after the host alive check.
It was done that way to handle the dhcp = 1 case (where you don't know
which host it is until you run nmblookup).

But in the normal case that check should be done earlier.  I'll add
that to the todo list.

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Ping too slow

2008-09-03 Thread Craig Barratt
Andrew writes:

 A few days ago I noticed that none of my hosts are backing up. All but
 two give the error, no ping (ping too slow: 38.94msec (threshold is
 35msec)) -- or some similar ping.
 
 One such host is named shipping in backuppc. The thing is, I can ping
 from the BackupPC server with no problem:
 $ nmblookup shipping
 192.168.111.126 shipping00
 $ ping 192.168.111.126
 PING 192.168.111.126 (192.168.111.126) 56(84) bytes of data.
 ...
 4 packets transmitted, 4 received, 0% packet loss, time 3002ms
 rtt min/avg/max/mdev = 0.130/0.138/0.151/0.016 ms

BackupPC should parse this last line and extract 0.138msec as
the round trip time.
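For reference, a rough equivalent of that parsing step (a sketch, not
BackupPC's actual Perl code; the line format is GNU ping's summary):

```shell
#!/bin/sh
# Pull the 'avg' field out of the rtt min/avg/max/mdev summary line.
rtt_line='rtt min/avg/max/mdev = 0.130/0.138/0.151/0.016 ms'
avg=$(printf '%s\n' "$rtt_line" | sed -n 's|.* = [0-9.]*/\([0-9.]*\)/.*|\1|p')
echo "avg rtt: ${avg} msec"    # prints "avg rtt: 0.138 msec"
```

That 0.138 msec is well under the 35 msec threshold, which is why the
"ping too slow" result above is surprising.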

To see exactly the command it is running and the output it gets,
run:

su backuppc
BackupPC_dump -f -v shipping

Hit ^C after you get past the ping output and parsed result.
What ping output do you get?

In the meantime, increase $Conf{PingMaxMsec} to get backups
running again.

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] using override RsyncShareName in the gui

2008-08-26 Thread Craig Barratt
Terri writes:

 Where is that stored? Reason I asked, I was reading over the archives
 on how to backup different hosts with different directories and found
 a post on putting a config.pl in the /pc/host directory. I then
 decided to add second path on that same host and used the edit config
 for that particular host in the gui using the override and save in
 RsyncShareName. It didn't appear in the config.pl I created under that
 host and also not in the default /etc/BackupPC config.pl.

In 3.0.0 RsyncdUserName was missing from the config editor
(I assume that is your question).  It's a trivial fix - you
can find it in the archives.

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Exclude and Only combo

2008-08-24 Thread Craig Barratt
Holger writes:

 I also read this as all of / except /proc, /sys, /mnt, /opt, plus
 /opt/zimbra/backup. I would implement this like BackupPC does, though it's
 perhaps not intuitive :-). If you are using rsync(d), that is. It won't work
 with tar or smb.

Excellent explanation.

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Exclude and Only combo

2008-08-22 Thread Craig Barratt
James writes:

 I need to back up a Zimbra mail server, but I can't get the whole thing in 
 any reasonable amount of time.  I want to exclude the usual stuff:
 
 $Conf{BackupFilesExclude} = ['/proc', '/sys', '/mnt'];
 
 But I also want to exclude all of /opt, except /opt/zimbra/backup.
 
 Is this the right way to go about this:
 
 $Conf{BackupFilesExclude} = ['/proc', '/sys', '/mnt', '/opt'];
 $Conf{BackupFilesOnly} = '/opt/zimbra/backup';

Assuming you only want to back up that one directory and nothing
else, and since all the excludes are outside/above the one directory
you want, all you need is:

$Conf{BackupFilesOnly} = '/opt/zimbra/backup';

Leave $Conf{BackupFilesExclude} empty.

An alternative is to use '/opt/zimbra/backup' for $Conf{TarShareName}
and leave both $Conf{BackupFilesOnly} and $Conf{BackupFilesExclude}
empty.

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] busybox tar problem?

2008-08-18 Thread Craig Barratt
James writes:

 I'm trying to back up some Arm processor console servers which only
 have busybox tar available.  Busybox tar does not support --totals
 and I THINK this is why the backups are failing.  I tried writing a
 wrapper script that spits out a bogus --totals line, but so far, no
 luck.  Any ideas?

Yes, most likely that is it.  Does the script write the totals
output to stdout or stderr?

You can also modify the code to not expect the totals line.
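The core of such a wrapper is just filtering --totals out of the
argument list before handing it to busybox tar, then faking the totals
line.  A sketch (the wrapper idea is from the thread; the function name
and the exact "Total bytes written" text BackupPC matches are
assumptions to verify against your version):

```shell
#!/bin/sh
# Hypothetical wrapper helper for busybox tar: drop the unsupported
# --totals flag, keeping every other argument in order.
filter_args() {
    out=""
    for a in "$@"; do
        [ "$a" = "--totals" ] || out="$out $a"
    done
    printf '%s\n' "${out# }"
}

filtered=$(filter_args -c --totals -v -f - /etc)
echo "filtered args: $filtered"    # prints "filtered args: -c -v -f - /etc"

# The real wrapper would then run, roughly:
#   tar $(filter_args "$@"); status=$?
#   echo "Total bytes written: 0" >&2    # GNU tar emits totals on stderr
#   exit $status
```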

Craig

-
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK  win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100url=/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


<    2   3   4   5   6   7   8   9   10   11   >