Re: [BackupPC-users] Patch for V3.3.2-1: partial and yearly backups

2019-04-08 Thread Matthias Meyer
I noticed today that a firewall issue of mine was blocking ftp://www.backup4u.at.
It is fixed now, and the server is accessible again.

Perhaps someone could put the patches in a standard location for BackupPC patches?

br
Matthias

On Saturday, 16 February 2019, 14:13:40, Matthias Meyer wrote:
> [full quote trimmed - the complete message appears below as the 2019-02-16 post]



___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] FEATURE REQUEST - "lock" designated backups from being deleted...

2019-04-08 Thread Matthias Meyer
On Sunday, 7 April 2019, 17:11:47, backu...@kosowsky.org wrote:
> Sometimes you want to save a special backup that for example
> corresponds to a specific change (pre/post) on your system. The
> trouble is that with exponential deleting there is no way to
> guarantee that your specific designated backup won't be deleted
> automatically later on.
> 
> In the past, I have simply renamed the backup number to say -save
> which prevents it from being deleted.
> But it also prevents the backup from being part of /backups
> and thus being browsable from the web interface.
> 
> Ideally, it would be nice if one could prevent a specific backup from
> being deleted (or even being part of the exponential schema) by
> either:
> 1. Adding a designated "LOCK" file to the top directory (just under the
> backup number)
> 2. Prefixing the entry in the /backups file with a
> character that says essentially, skip over me for deleting purposes
> but otherwise I am still here.
> 
> Any suggestions better than my renaming of the backup tree itself?
> 
> Jeff
> 
> 

You could use my patch
ftp://www.backup4u.at/BackupPC-V3.3.2-FullCntYearly.patch and rewrite it to
skip directories containing such a file, instead of or in addition to the
first full of a year.

Br
Matthias

PS: user/password = ftpuser/Backup4U4FTP




[BackupPC-users] Patch for V3.3.2-1: partial and yearly backups

2019-02-16 Thread Matthias Meyer
I developed two enhancements for BackupPC 3.1.0 and have used them since 2010.
In June 2013 I upgraded to BackupPC V3.2.1 (Debian wheezy) and ported my
patch to that release.
In February 2019 I did the same for BackupPC V3.3.2-2 (Debian
buster/testing).

As requested by Jeffrey Kosowsky in March 2011, I have split my patch
into two separate patches:

$Conf{FullCntYearly} makes the first backup of each year a full backup and
keeps these yearly fulls for as many years as configured in the GUI.
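A hypothetical per-host setting (the parameter and its exact semantics come from this patch, not from stock BackupPC; the value here is only illustrative):

```perl
# Illustrative only: keep the first full backup of each of the last 3 years.
$Conf{FullCntYearly} = 3;
```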

The BackupPC-V3.3.2-FullCntYearly.patch modifies the following files:

backuppc/bin/BackupPC_dump
backuppc/lib/BackupPC/CGI/EditConfig.pm
backuppc/lib/BackupPC/Config/Meta.pm
doc/backuppc/BackupPC.html

$Conf{useEveryPartial} stores every interrupted backup (full as well as
incremental) as a partial backup. These partial backups are used by rsync
like incremental backups. Partial backups are removed in the same way as
incrementals, but only once an incremental or full backup has finished
successfully.

The modification to $Conf{PartialAgeMax} removes partial backups older than
$Conf{PartialAgeMax}.

The BackupPC-V3.3.2-UseEveryPartial.patch modifies the following files:

backuppc/bin/BackupPC_dump
backuppc/bin/BackupPC_link
backuppc/lib/BackupPC/CGI/EditConfig.pm
backuppc/lib/BackupPC/Config/Meta.pm
backuppc/lib/BackupPC/Xfer/Rsync.pm
backuppc/lib/BackupPC/Xfer/RsyncFileIO.pm
doc/backuppc/BackupPC.html


As Holger Parplies recommended in March 2011, I created the patches with:

% diff -ruN --exclude=*~ backuppc backuppc.FullCntYearly > BackupPC-V3.3.2-FullCntYearly.patch
% diff -ruN --exclude=*~ doc doc.FullCntYearly > BackupPC-V3.3.2-FullCntYearly.doc.patch
% diff -ruN --exclude=*~ backuppc backuppc.UseEveryPartial > BackupPC-V3.3.2-UseEveryPartial.patch
% diff -ruN --exclude=*~ doc doc.UseEveryPartial > BackupPC-V3.3.2-UseEveryPartial.doc.patch

and you can apply the patches with:

% src=(where you downloaded the patches)
% cd /usr/share
% patch -p0 --dry-run --reject-file=oops.rej < $src/BackupPC-V3.3.2-FullCntYearly.patch
% patch -p0 --dry-run --reject-file=oops.rej < $src/BackupPC-V3.3.2-FullCntYearly.doc.patch
% patch -p0 --dry-run --reject-file=oops.rej < $src/BackupPC-V3.3.2-UseEveryPartial.patch
% patch -p0 --dry-run --reject-file=oops.rej < $src/BackupPC-V3.3.2-UseEveryPartial.doc.patch

Do not forget to remove --dry-run from the lines above. :)
Make sure the files are owned by backuppc:
% chown -R backuppc:backuppc backuppc
% /etc/init.d/apache2 restart
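The dry-run steps above can also be wrapped in a small script that applies nothing until every patch passes its dry run. A sketch only: the $src default is a hypothetical download location, and it assumes you run it from /usr/share as in the instructions.

```shell
# Dry-run all four patches; report whether it is safe to apply for real.
# Run from /usr/share; $src is where the patches were downloaded.
src=${src:-/tmp/backuppc-patches}
patches="BackupPC-V3.3.2-FullCntYearly.patch
BackupPC-V3.3.2-FullCntYearly.doc.patch
BackupPC-V3.3.2-UseEveryPartial.patch
BackupPC-V3.3.2-UseEveryPartial.doc.patch"

ok=1
for p in $patches; do
    if [ -f "$src/$p" ]; then
        patch -p0 --dry-run --reject-file=oops.rej < "$src/$p" || ok=0
    else
        echo "missing: $src/$p"
        ok=0
    fi
done

if [ "$ok" = 1 ]; then
    echo "all dry runs passed - rerun the patch commands without --dry-run"
else
    echo "fix the problems above before applying anything"
fi
```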


You can revert the applied patches with:
% patch -p0 -R < $src/BackupPC-V3.3.2-FullCntYearly.patch
% patch -p0 -R < $src/BackupPC-V3.3.2-FullCntYearly.doc.patch
% patch -p0 -R < $src/BackupPC-V3.3.2-UseEveryPartial.patch
% patch -p0 -R < $src/BackupPC-V3.3.2-UseEveryPartial.doc.patch

Be careful! I developed, applied, and tested the patches against the
Debian version of BackupPC.
If there is a chance of including these patches in the standard BackupPC
distribution, I could port and test them against BackupPC V3.3 from
SourceForge as well.

The patches are available on ftp://www.backup4u.at. You can access them with:
 user=ftpuser
 password=Backup4U4FTP

br
Matthias




[BackupPC-users] BackupPC_fixLinks.pl - Where to download?

2015-09-02 Thread Matthias Meyer
Hi all,

I have a lot of files in my pc tree that are not correctly linked into my
pool. I would like to verify them and link them correctly into my pool (not
cpool).
I've found some mails about BackupPC_fixLinks.pl, as well as pointers to
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=BackupPC_FixLinks
Unfortunately I can't find it there. :(

Could anyone give me some advice?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Name server problems in hosts or config.pl

2012-12-14 Thread Matthias Meyer
Bruce Thayre wrote:

 Hi Matthias,
Ah I see, I guess I shouldn't mix bash notation when asking about a
 command I run in a shell.  So what you recommend is exactly what I've
 been doing.  Assume my router is MyRouter.domain, and the client behind
 the router is MyClient.  In MyClient's respective config file I have:
 $Conf{ClientNameAlias} = 'MyRouter.domain';
 
When I run the command:
 
 sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_dump -f -v MyClient
 
I get the output:
 
 Name server doesn't know about MyClient; trying NetBios
 cmdSystemOrEval: about to system /usr/bin/nmblookup MyClient
 cmdSystemOrEval: finished: got output querying MyClient on
 111.222.111.222 name_query failed to find name MyClient
 
 NetBiosHostIPFind: couldn't find IP address for host MyClient
 host not found
 
So what's going on is despite having the ClientNameAlias set,
 backuppc assumes the hostname is a netbios name.  Strangely enough the
 ip address that it dumps out is on my subnet, but is not actually in
 use.  I've set the backup method as rsyncd, so I would think backuppc
 would not use any netbios/samba related stuff. I've also tried using my
 router's ip address just in case I was having DNS problems, but I see
 the exact same behavior.  Sorry for the confusion, and thanks for the
 patience.
 Thanks,
 Bruce

Did you try:
$ ping MyClient
What is the output?
-- 
Don't Panic




Re: [BackupPC-users] Name server problems in hosts or config.pl

2012-12-10 Thread Matthias Meyer
Bruce Thayre wrote:

 Hi Matthias,
Thanks for your prompt reply!  I gave your recommendations a try and
 had no luck.  Setting CLIENT='name as in $Conf{ClientNameAlias}' doesn't
 actually substitute the value for ClientNameAlias.  Bash doesn't know
 the config file, so just returns that as a string if I try to ping
 it...i.e.
 
 bash-4.1$ echo $CLIENT
 name as in $Conf{ClientNameAlias}

If your client is called MyClient, then:
CLIENT='MyClient'
ping -c2 $CLIENT
sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_dump -f -v $CLIENT

;)
 
 I also tried adding those nmblookup specific lines to my config, but get
 the same output as before when running the dump command. What confuses
 me is I specify rsyncd as my method, so I'm not sure why backuppc is
 trying to use netbios names for lookups.  I also don't know where it's
 getting this funny ip that isn't actually there, but if it were
 registered would be on our subnet.  For some reason or another, backuppc
 won't try to ping what I specify as ClientNameAlias, and instead tries
 to lookup the host netbios name.  Any ideas?  Thanks so much for the help!
 Bruce
 
 
 On 12/8/2012 1:33 AM, Matthias Meyer wrote:
 Bruce Thayre wrote:

 Hello Everyone,
 I've been a maintainer of an old install of backuppc until our
 server
 recently died.  I've reinstalled and brought all other services back
 except for backuppc.  Many of our backup clients are behind routers and
 I had been using the ClientNameAlias option in their respective
 config.pl files.  However, when I try the test command:
 bash-4.1$ /usr/share/BackupPC/bin/BackupPC_dump -f -v $CLIENT
 try:
 CLIENT='name as in $Conf{ClientNameAlias}'
 ping -c2 $CLIENT
 sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_dump -f -v $CLIENT
 I get the output:

 Name server doesn't know about $CLIENT; trying NetBios
 cmdSystemOrEval: about to system /usr/bin/nmblookup $CLIENT
 cmdSystemOrEval: finished: got output querying $CLIENT on
 $UNREGISTERED_IP_ON_OUR_SUBNET
 name_query failed to find name $CLIENT

 NetBiosHostIPFind: couldn't find IP address for host $CLIENT
 host not found

 I'm a little confused by the output.  I have the host setup with
 DHCP=0, and rsyncd is my backup method in /etc/config.pl.  I'm not quite
 sure why backuppc is trying to use the hostname as a netbios name, and
 why it seems to ignore the router ip I provide in the client's config.pl
 file.  More perplexing is that nmblookup seems to return an ip address,
 but that ip address is not in use.  It won't resolve using nslookup, so
 I can't understand where it was found. Any insight would be greatly
 appreciated.
 Thanks,
 Bruce

 My Config:
 /etc/backuppc/config.pl:
 $Conf{NmbLookupPath} = '/usr/bin/nmblookup';
 $Conf{PingPath} = '/bin/ping';

 /etc/backuppc/CLIENT.pl
 $Conf{ClientNameAlias} = 'name of CLIENT';
 $Conf{NmbLookupCmd} = '$nmbLookupPath -A $host';
 $Conf{NmbLookupFindHostCmd} = '$nmbLookupPath $host';
 $Conf{FixedIPNetBiosNameCheck} = '';

 br
 Matthias

-- 
Don't Panic




Re: [BackupPC-users] Name server problems in hosts or config.pl

2012-12-08 Thread Matthias Meyer
Bruce Thayre wrote:

 Hello Everyone,
I've been a maintainer of an old install of backuppc until our server
 recently died.  I've reinstalled and brought all other services back
 except for backuppc.  Many of our backup clients are behind routers and
 I had been using the ClientNameAlias option in their respective
 config.pl files.  However, when I try the test command:
 bash-4.1$ /usr/share/BackupPC/bin/BackupPC_dump -f -v $CLIENT
try:
CLIENT='name as in $Conf{ClientNameAlias}'
ping -c2 $CLIENT
sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_dump -f -v $CLIENT
 
 I get the output:
 
 Name server doesn't know about $CLIENT; trying NetBios
 cmdSystemOrEval: about to system /usr/bin/nmblookup $CLIENT
 cmdSystemOrEval: finished: got output querying $CLIENT on
 $UNREGISTERED_IP_ON_OUR_SUBNET
 name_query failed to find name $CLIENT
 
 NetBiosHostIPFind: couldn't find IP address for host $CLIENT
 host not found
 
I'm a little confused by the output.  I have the host setup with
 DHCP=0, and rsyncd is my backup method in /etc/config.pl.  I'm not quite
 sure why backuppc is trying to use the hostname as a netbios name, and
 why it seems to ignore the router ip I provide in the client's config.pl
 file.  More perplexing is that nmblookup seems to return an ip address,
 but that ip address is not in use.  It won't resolve using nslookup, so
 I can't understand where it was found. Any insight would be greatly
 appreciated.
 Thanks,
 Bruce
 
My Config:
/etc/backuppc/config.pl:
$Conf{NmbLookupPath} = '/usr/bin/nmblookup';
$Conf{PingPath} = '/bin/ping';

/etc/backuppc/CLIENT.pl
$Conf{ClientNameAlias} = 'name of CLIENT';
$Conf{NmbLookupCmd} = '$nmbLookupPath -A $host';
$Conf{NmbLookupFindHostCmd} = '$nmbLookupPath $host';
$Conf{FixedIPNetBiosNameCheck} = '';

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] No more bakup after last partial backup?

2012-12-08 Thread Matthias Meyer
arcticeye wrote:

 Hi John,
 Thanks for your answer. I have copied the script but sh and bash (either)
 don't recognize the declare command. Is there a way to do it manually?
 Should I try to remove the last partial backup manually from
 /lib/var/backuppc/pc/server/ ?? or should I have to remove any kind of
 extra registry or something? Thank you again!
 
 Kind regards,
 
 Matias
 
No - don't remove it manually, because there is related data elsewhere.

Probably /bin/sh (or /bin/bash) is a softlink to /bin/dash on your system.
Check, and install /bin/bash if necessary, and make sure the script is run
by bash rather than dash.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] BackupPC_verifyPool mismatchs found - what to do?

2012-12-06 Thread Matthias Meyer
backu...@kosowsky.org wrote:

 Matthias Meyer wrote at about 08:41:34 +0100 on Monday, December 3, 2012:
   Wow - thank you for details :)
   
   Most of the files found are really top-level attrib files.
   But BackupPC_verifyPool.pl found some files like
[1459587] fXLC_LOCALE  (   979) != c5e2162bfb3286475d4b71503593ffcd
[1459783] attrib   (73) != 42d8ce042b950daa935fe4e0440d2020
   too.

   Does anyone have an idea what happened here?
   Should I simply rename those files as suggested?
 As mentioned in Holger's and my replies, I would do the following:
 
 1. Check to see you have the latest version of BackupPC and check the
changelogs to verify that the error has been corrected.
I'm running BackupPC V3.1.
Unfortunately I'm running a patched version ;) that makes yearly backups
and does not remove partial backups until a successful backup has finished.
So maybe it is my fault, but I don't believe that. ;)

 
 2. Look up the thread I referenced so you can understand what is going
on with the top level attribs.
I've read it. Thanks. :) I have a lot of PCs with more than one share.
Thankfully it isn't a real problem. :)
 
 3. Give some more information about the files that are not top-level
attribs so that we can try to understand what is going on. What is
the content? what are their real names? (search the pc-tree) What
types of files? etc. etc. Look for patterns.

I make a regular backup of my BackupPC data with rsync to another device.
The analysis runs on this backup device.

3010068 files in 4096 directories were checked; 3538 had wrong digests, of
these 2 were zero-length.
Most of the 3538 files are found in /var/lib/backuppc/pc, and they can
simply be renamed.

But some files seem to have special problems.
Examples of files with wrong MD5 names, together with my analysis so far:

1) I can't find the inode in /var/lib/backuppc/pc. I believe
BackupPC_nightly will remove this file.
144312220 -rw-r- 2 backuppc backuppc 3976 20. Dez 2009 
/var/lib/backuppc/cpool/0/0/7/00754a6772b82e796d3cc12ac84661~0

2) The hard-link reference count seems to be wrong. How can I correct it?
e2fsck doesn't do this job.
157158688 -rw-r- 3 backuppc backuppc 346 20. Dez 2009 
/var/lib/backuppc/cpool/0/0/3/0035b696f4c9aa129dd310fbac63db~0
157158688:  (3) 4kB 20.Dez.2009 
/var/lib/backuppc/pc/vdr/245/f%2f/fusr/fshare/fdoc/fdefoma/fcopyright
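To chase a suspicious reference count like the one above, you can compare the kernel's link count with the pc-tree paths that actually share the cpool file's inode. A sketch (the example paths are the ones from my listing; `stat -c` is GNU coreutils):

```shell
# show_links: print the link count of a pool file and every path under
# the given tree that shares its inode.
show_links() {
    f=$1      # pool file, e.g. /var/lib/backuppc/cpool/0/0/3/0035b6...~0
    tree=$2   # pc tree to search, e.g. /var/lib/backuppc/pc
    [ -f "$f" ] || { echo "no such file: $f"; return 1; }
    echo "links: $(stat -c %h "$f")"
    find "$tree" -xdev -inum "$(stat -c %i "$f")"
}

# Example (illustrative):
# show_links /var/lib/backuppc/cpool/0/0/3/0035b696f4c9aa129dd310fbac63db~0 \
#            /var/lib/backuppc/pc
```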

3) The name in the cpool isn't an MD5 digest.
 a) In which case can this happen? If BackupPC_link was interrupted?
 b) Is a hard-link reference count of 10003 real? The BackupPC status page
says: Pool hashing gives 3474 repeated files with longest chain 31.
152963304 -rw-r- 10003 backuppc backuppc 86 13. Dez 2011  
/var/lib/backuppc/cpool/0/1/4/fcommon
152963304:  (10003) 4kB 13.Dez.2011 
/var/lib/backuppc/pc/vdr/645/froot/fusr/fshare/fdoc/fkde/fHTML/fda/fkleopatra/fcommon
152963304:  (10003) 4kB 13.Dez.2011 
/var/lib/backuppc/pc/vdr/645/froot/fusr/fshare/fdoc/fkde/fHTML/fda/fkwatchgnupg/fcommon
152963304:  (10003) 4kB 13.Dez.2011 
/var/lib/backuppc/pc/vdr/645/froot/fusr/fshare/fdoc/fkde/fHTML/fda/fknode/fcommon
152963304:  (10003) 4kB 13.Dez.2011 
/var/lib/backuppc/pc/vdr/645/froot/fusr/fshare/fdoc/fkde/fHTML/fda/fkjots/fcommon
152963304:  (10003) 4kB 13.Dez.2011 
/var/lib/backuppc/pc/vdr/645/froot/fusr/fshare/fdoc/fkde/fHTML/fda/fkorganizer/fcommon

4) Another example of a wrong hard-link reference count together with a wrong MD5 filename
62523004 -rw-r- 9 backuppc backuppc 12662 14. Dez 2009  
/var/lib/backuppc/cpool/0/2/2/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/406/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/415/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/245/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/381/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/443/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vdr/440/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo
62523004:   (9) 16kB 14.Dez.2009 
/var/lib/backuppc/pc/vhost/427/f%2f/fusr/fshare/flocale/fid/fLC_MESSAGES/fshadow.mo

 
 4. As Holger mentioned, there is no harm not renaming other than
wasted pool space (which is typically quite small for attrib files)
and the inelegance of having wrong md5sum file names. Otherwise,
you can use my routine to fix them automatically or you can write
your own or do it manually.
I will use your script. But first I'd like to understand what happened. :)

Also interesting: all files from 2012 found by Holger's script are
top-level attrib files.
The strange things only happen with older files.

-- 
Don't Panic

Re: [BackupPC-users] BackupPC_verifyPool mismatchs found - what to do?

2012-12-02 Thread Matthias Meyer
Wow - thank you for details :)

Most of the files found are really top-level attrib files.
But BackupPC_verifyPool.pl found some files like
 [1459587] fXLC_LOCALE  (   979) != 
c5e2162bfb3286475d4b71503593ffcd
 [1459783] attrib   (73) != 
42d8ce042b950daa935fe4e0440d2020
too.

Does anyone have an idea what happened here?
Should I simply rename those files as suggested?

br
Matthias
-- 
Don't Panic




[BackupPC-users] BackupPC_verifyPool mismatchs found - what to do?

2012-12-01 Thread Matthias Meyer
Hi,

I've tried the BackupPC_verifyPool.pl from Holger Parplies.
Unfortunately it found and printed some MD5 errors, e.g.:
[28878] 083212b41c2482783128c9212f1f8a26 (78) != 
70f6bdb839eed7efdfe8f8b01f4dcbc7

Do I have a problem?
What should/can I do?
How can I find out which file it is?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] BackupPC_deleteBackup.sh not working

2012-01-18 Thread Matthias Meyer
Estanislao López Morgan wrote:

 Hi backupers, I am projecting a migration in my backup server file system,
 to a LVM structure in order to enlarge my actual and future disk capacity.
 For that pourpose, I have to delete some backups but I am having problems
 with script /BackupPC_deleteBackup.sh. When I execute it, this is the
 result:
  ./BackupPC_deleteBackup.sh
 ./BackupPC_deleteBackup.sh: 42: declare: not found
 ./BackupPC_deleteBackup.sh: 43: declare: not found
 ./BackupPC_deleteBackup.sh: 44: declare: not found
 ./BackupPC_deleteBackup.sh: 46: Syntax error: ( unexpected
 
 I have already installed bash and other scrips works properly... so the
 problem is with that script in particular.
 
 Someone knows why?
 
 Thanks in advance
 
 Regards
Probably BackupPC_deleteBackup.sh starts with the line #!/bin/sh and is
therefore run by /bin/sh. I believe /bin/sh is a softlink to /bin/dash.
Change this softlink, or run bash BackupPC_deleteBackup.sh ... instead.
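To see which shell actually runs such a script, and to work around it without changing system links, something like this helps (a sketch; the readlink output is what you would typically see on Debian):

```shell
# Dash lacks bash's `declare` builtin, which produces exactly the
# "declare: not found" errors above. Check what /bin/sh really is:
ls -l /bin/sh
readlink -f /bin/sh    # typically /bin/dash on Debian/Ubuntu

# Work around it by running the script explicitly under bash:
#   bash BackupPC_deleteBackup.sh ...
# or change its first line from #!/bin/sh to #!/bin/bash.
```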

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Mail error - how to solve

2012-01-16 Thread Matthias Meyer
MaryG wrote:

 Hi experts. I hope you can help me with this problem:
 I have installed BackupPC 3.2.1 on a DNS-323 embedded ARM NAS, (running
 GNU BSD Net).  I'm using it to back up 4 SOHO PCs. Things seem to be
 working fine, but I'm getting the following error in the log:
 
 01:04:02  admin : Invalid peer certificate (error 20)
 01:04:02  admin : 0 (null)
 01:04:02  admin : john.j...@somemail.com: 0 (null)
 01:04:02  admin : Connected to MDA: /ffp/bin/procmail -d 'backuppc'
 01:04:02  admin : Connected to MTA
 01:04:02  admin : Disconnected to MTA
 01:04:02  admin : procmail: Couldn't create /var/spool/mail/backuppc
 
 This repeats itself for about three times, every night.
 
 What to do?
 
 MaryG
 

First I would check the permissions of /var/spool/mail/. Maybe create
/var/spool/mail/backuppc and set its permissions:
mkdir /var/spool/mail/backuppc
chmod ug=rw,o= /var/spool/mail/backuppc
chown backuppc:mail /var/spool/mail/backuppc
If this doesn't help, ask in a procmail user group.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Suggestions for the new version

2012-01-16 Thread Matthias Meyer
Brad Alexander wrote:

 Hi gang,
 
 Hope everyone had a great Christmas, Hanukkah, or whatever you celebrate,
 and look forward to a good and prosperous new year.
 
 One of my Christmas presents this year was an upgrade for my desktop.
 Since it had been 4 or 5 years, and I had accumulated a kind of
 frankenbox, I decided to do a nuke and pave. (actually, I decided to
 downsize the drive and put more stuff on the file server)
 
 So...I ran a full backup of the machine, rebuilt over the new year, and
 started restoring. The following are a few observations and suggestions
 for the next version of backuppc based on my experiences:
 
 1. I would like to see better feedback of status of restores, similar to
 what we have with backups. It seems to me that there is very little
 logging of restores (unless I am looking in the wrong place). For
 instance, I started a restore of my home directory, and while I could see
 a limited amount of increase in the used in df, however, after running
 overnight, it did not apparently complete... since there are still things
 missing, including .ssh and several other config files. So there was no
 indication of the status of the backup. The only thing in the log file is:
 
 2012-01-02 19:59:18 restore started below directory /home to host defiant
 
I have:
2012-01-07 11:43:01 restore started below directory D to host st-srv-xp
2012-01-07 12:01:11 restore 51 complete (4629 files, 313681638 bytes, 0 
dirs, 0 xferErrs)
 but no indication if it completed. I had also queued other restores, and
 they did not complete either. Since I can't get any kind of indication, I
 am doing the restore to a tar file on the laptop and then scp'ing and
 restoring by hand.
 
 I was thinking that perhaps a status bar color change in the hosts summary
 (we already have green for system backing up, yellow for no ping, gray for
 manual/disabled backups...perhaps blue for restore in progress?)
Yes - nice idea
 
 Perhaps a  status of queued restores, a little more logging, maybe a flow
 indicator? I know I can use tcpdump, but perhaps backuppc could include a
 restore percentage indicator? The final suggestion would be to have a way
 to stop a restore, similar to the stop/dequeue backup button.
Yes - nice idea. Especially the flow indicator, which would also be useful
for backup progress.
 
 Thanks,
 --b

-- 
Don't Panic


___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Disk was too full !!!

2011-12-28 Thread Matthias Meyer
Luis Calvo wrote:

 Hi,
 
 I receive this message from my BackupPC server:
 
 
---
 The following hosts had an error that is probably caused by a
 misconfiguration.  Please fix these hosts:
 
   -  (Child exited prematurely)
   -  (Child exited prematurely)
   -  (Child exited prematurely)
 
 Yesterday 170 hosts were skipped because the file system containing
 /Storage/Disk was too full.  The threshold in the
 configuration file is 95%, while yesterday the file system was
 up to 100% full.  Please find more space on the file system,
 or reduce the number of full or incremental backups that we keep.
 
---
 
 I saw that exist an script BackupPC_deletebackup and I try to using it
 to delete some backups and get more free space on the disk, but I have
 problems.
 
 I've got a message:
 
 ./BackupPC_deleteBackup: 43: declare: not found
 ./BackupPC_deleteBackup: 44: declare: not found
 ./BackupPC_deleteBackup: 45: declare: not found
 ./BackupPC_deleteBackup: 47: Syntax error: ( unexpected
 
 I'm really newbie on this, can anyone help me, please?
 
 What I need to do to get more free space on the disk for new backups?
 
 Thank's in advance, and regards.
 
You are probably running the script with dash instead of bash. Try
ls -l /bin/sh
On Debian it is usually a symlink to /bin/dash, and dash does not know the
bash builtin "declare". To resolve the problem, install bash if necessary
(e.g. apt-get install bash) and run BackupPC_deleteBackup explicitly with
bash again.
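The errors come from "declare", a bash builtin that POSIX shells such as
dash do not have. A minimal sketch demonstrating the difference (the script
name is the one from the thread):

```shell
# 'declare' exists in bash but not in dash/POSIX sh, which explains the
# "declare: not found" errors when the script runs under plain sh.
bash -c 'declare -a arr; arr[0]=ok; echo "${arr[0]}"'
# Then run the delete script explicitly under bash instead of sh:
# bash ./BackupPC_deleteBackup ...
```

The first command prints "ok" under bash; the same `declare` line fails
under dash.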

Another way is to reduce the number of old backups you keep. See
$Conf{FullKeepCnt} as well as $Conf{IncrKeepCnt}.

But I would propose adding disk capacity instead. Because of pooling (file
deduplication) in BackupPC, I would expect your storage requirements to drop
when you remove whole hosts, but not much when you remove individual
backups.
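If you do tighten the keep counts, the change goes into the host's (or the
global) config.pl; a sketch with illustrative values, not a recommendation:

```perl
# Illustrative only - tune to your pool size and retention needs.
$Conf{FullKeepCnt} = 2;   # keep the 2 most recent full backups
$Conf{IncrKeepCnt} = 6;   # keep the 6 most recent incremental backups
```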

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] starting backup-PC

2011-11-28 Thread Matthias Meyer
Try
sh -x /etc/init.d/backuppc start
and post the output

br
Matthias
Greer, Jacob - District Tech wrote:

 I am new to both Linux and BackupPC. I am trying to install on OpenSUSE;
 when I try to enable the service that I copied, I get the following error:
 
 /etc/init.d/backuppc start returned 126 (unspecified error):
 
 Any advice or help would be greatly appreciated.
 
 Thanks
 
 Jacob

-- 
Don't Panic




Re: [BackupPC-users] Different settings for restore possible?

2011-11-25 Thread Matthias Meyer
I use $Conf{DumpPreUserCmd} to run a script on my Windows client which
starts the rsync daemon if it is not already running.
It is nearly the same as what you do before the dump in your preusercmd.sh,
e.g. pre-restore.cmd:
/bin/rsync --daemon --detach --config=/etc/rsyncd.conf
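For the restore case specifically, the same hook idea can be wired into the
restore path. A minimal sketch: $Conf{RestorePreUserCmd} is the standard
hook, but the ssh invocation and the pre-restore.cmd path are assumptions
for your site:

```perl
# Assumption: the client accepts ssh as root and pre-restore.cmd starts rsyncd.
$Conf{RestorePreUserCmd} = '$sshPath -q -x -l root $host /etc/pre-restore.cmd';
```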

br
Matthias
Falko Trojahn wrote:

 Hello all,
 
 for backing up Win* machines I use vss snapshots as in this description:
 
 http://www.goodjobsucking.com/?p=62
 
 When trying to restore some files/directories using method 1, the
 restore fails with inet connect: Connection timed out, since there is
 no rsync running on the target machine during restore.
 
 Is there a possiblity to set other XFerMethod etc. for restore?
 
 Or do you see another option here?
 
 using backuppc version 3.1.0-9 on Debian Squeeze.
 
 Thanx for all your support,
 Falko
 
 
 

-- 
Don't Panic




Re: [BackupPC-users] Small issue with trash

2011-11-25 Thread Matthias Meyer
Today I ran into the same problem.
In my case the owner of the trash directory was wrong (root).

br
Matthias
Dan Johansson wrote:

 I have a small issue with the trash, or more precisely  the trash-cleaner.
 Sometimes a directory will get stuck in the trash-directories and will
 not be cleaned away by the trash-cleaner, it just sits there forever or
 until manually (rm -rf) removed. Restarting BackupPC does not help.
 Any suggestions what could be wrong?
 
 Regards,

-- 
Don't Panic




Re: [BackupPC-users] Deleted manually

2011-10-26 Thread Matthias Meyer
Estanislao López Morgan wrote:

 Hi fellows from BackupPC... I have a problem with my backups. I had to
 delete a wrong backup, so I tried to run the famous script to do that, but
 it did not work. 
The script is probably BackupPC_deleteBackup. Did you get any error message?

 So I deleted manually. I did it in /var/lib/backuppc/pc/
 and then in /etc/backuppc/ and deleted the .pl and .pl.old that correspond
 to the PC which backup I wanted to erase  and from my hosts list. The
 problem is when I add the PC again to my backup schedule. The backups are
 made but I am not able to see it in the web browser. I am not able to
 browse the backup. Nevertheless, the backup is made, because I can find
 it in /var/lib/backuppc/pc/.
 Thanks in advance !
You have to delete the backups from /var/lib/backuppc/pc/host/backups too.
In addition you should delete the corresponding XferLOG.backup-number.
Also check owner and file permissions in /etc/backuppc/ as well as in
/var/lib/backuppc/pc/PC.
If you deleted the host's .pl you can't add it to the backup schedule.
Probably you didn't tell us the whole story ;-)
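Sketched as commands, the by-hand cleanup for one backup number looks like
this. The snippet simulates the layout in a temp directory: the host name
("myhost") and backup number (3) are made up, and the real tree lives under
/var/lib/backuppc:

```shell
# Simulate the per-host layout used by a Debian BackupPC install.
top=$(mktemp -d)
mkdir -p "$top/pc/myhost/3"
touch "$top/pc/myhost/backups" "$top/pc/myhost/XferLOG.3.z"
# To drop backup #3 by hand: remove its tree and its transfer log...
rm -rf "$top/pc/myhost/3" "$top/pc/myhost/XferLOG.3.z"
# ...and remember to delete the matching line from the 'backups' metadata
# file, which is what the CGI browser reads.
ls "$top/pc/myhost"
```

The final `ls` shows only the `backups` metadata file remaining.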

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Issue with status page

2011-10-04 Thread Matthias Meyer
Urs Forster wrote:

 Hi all
 
 On the status-page I would expect two graphs in the lower part.
 However they do not show - there are only two placeholders.
 
 Trying to explicitly view the graphs reveals the following:
 
 The graphic http://birch/backuppc/index.cgi?image=4 contains
 errors and cannot be displayed
 
 Any hint why backuppc fails to create the graphs?
 
 Thanks
 Urs
 --
I assume you use Debian, because this feature is present only in the Debian
package of BackupPC.
The graphs are generated by rrdtool, so check whether rrdtool is installed.

Some time ago I found some hints on how to check the rrdtool setup. If I
remember correctly it needs an additional Perl library. Try a search engine
to find it.
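A quick check along those lines (hedged: on Debian the Perl binding is
typically the RRDs module from the librrds-perl package, but verify for your
release):

```shell
# Is the rrdtool binary on the PATH?
command -v rrdtool >/dev/null 2>&1 && echo "rrdtool found" || echo "rrdtool missing"
# Is the RRDs Perl binding available?
perl -MRRDs -e 'print "RRDs ok\n"' 2>/dev/null || echo "RRDs perl module missing"
```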

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] How to run BackupPC_copyPCPool.pl without a BackupPC installation

2011-10-04 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 Matthias Meyer wrote at about 00:04:50 +0200 on Saturday, October 1, 2011:
   Hi (Jeff ;-)
   
   I would like to try your BackupPC_copyPCPool.pl to backup my BackupPC
   storage to another server.
   
   Unfortunately this other server have no BackupPC installed.
   I've copied FileZIO.pm, Lib.pm, jLib.pm, Attrib.pm and Storage.pm from
   /usr/share/backuppc/lib/BackupPC as well as Text.pm from
   /usr/share/backuppc/lib/BackupPC/Storage onto this server.
   
   ~# sudo -u backuppc /usr/share/backuppc/bin/BackupPC_copyPCPool.pl
   No language setting
   BackupPC::Lib-new failed
   
   Is it possible to set the language without installing the whole
   BackupPC package?
   
 
 Well if you look in Lib.pm the call to set the language is in
 ConfigRead which is called from BackupPC::Lib-new
 
 I suppose one could hack Lib.pm but there are probably other hidden
 gotchas so I think a minimal install would be worthwhile (and is very
 easy).
 
 
OK - I have a workstation with BackupPC installed too :-) I tried to back up
this BackupPC data (from an LVM snapshot) to an external USB disk:
/usr/share/backuppc/bin/BackupPC_copyPCPool.pl -F -f --noverbose -t 
/snapshot -o /var/lib/backuppc/backup/files

But I get a lot of errors of type:
ERROR: pc/vistabase/0/fWINDOWS/fwinsxs/fx86_microsoft-windows-m..-
management-
console_31bf3856ad364e35_6.0.6001.18000_none_0f734b1075a23eba/attrib 
(inode=744813, nlinks=1) INVALID pc file and UNLINKED to pool

and of type:
Argument cpool/d/2/0/d20ba8a18da819b3005bb9418fb56e91 isn't numeric in 
modulus (%) at /usr/share/backuppc/bin/BackupPC_copyPCPool.pl line 666.
ERROR: Can't write to inode pool file: 
ipool/0/0/0/0/cpool/d/2/0/d20ba8a18da819b3005bb9418fb56e91

Any hint about what is going wrong?

Thanks
Matthias
-- 
Don't Panic




[BackupPC-users] How to run BackupPC_copyPCPool.pl without a BackupPC installation

2011-09-30 Thread Matthias Meyer
Hi (Jeff ;-)

I would like to try your BackupPC_copyPCPool.pl to backup my BackupPC 
storage to another server.

Unfortunately this other server has no BackupPC installed.
I've copied FileZIO.pm, Lib.pm, jLib.pm, Attrib.pm and Storage.pm from 
/usr/share/backuppc/lib/BackupPC as well as Text.pm from 
/usr/share/backuppc/lib/BackupPC/Storage onto this server.

~# sudo -u backuppc /usr/share/backuppc/bin/BackupPC_copyPCPool.pl
No language setting
BackupPC::Lib-new failed

Is it possible to set the language without installing the whole BackupPC 
package?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] md4 doesn't match even though VSS is used

2011-09-13 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 Matthias Meyer wrote at about 15:29:09 +0200 on Sunday, September 11,
 2011:
   Dear all,
   
   I've a problem by backing up a large file (9GB) because the internet
   connection of the client interrupts every 24 hours.
   
   BackupPC (V3.1.0) can rsync this file once with status:
   md4 doesn't match: will retry in phase 1; file removed
   With lsof /var/lib/backuppc I can see this phase 1 transfer some
   minutes later.
   But the internet connection will interrupt shortly before this second
   transfer were finished :-(
   
   I am sure that the source file (Windows client) is on a volume shadow
   copy and rsync is using this source because:
   - /etc/rsyncd.conf contains only one path = /cygdrive/Z/
   - ls /cygdrive/ shows only the drives C and D
   - ls /cygdrive/Z lists the same files as ls /cygdrive/C
   
   So it should not possible that the source was changed.
   
   Did /usr/share/backuppc/lib/BackupPC/Xfer/RsyncFileIO.pm compare the
   md4 diggest from the begin of a transfer with a recalculated md4
   diggest at the end of the transfer?
   
   Somebody else have a similiar problem?
   
   Is there any known solution to solving my problem?
   
   What happens if I patch the RsyncFileIO.pm so that it will ignore the
   md4 doesn't match?
   
   I know I should try it instead asking for it. But I'm not sure what the
   meaning of md4 is and hopefully someone can give me a hint.
   
 
 I would not ignore the md4 mismatch. md4sums are used on block sums
 and file sums both to ensure the transfer was errorless and also as
 part of rsync's delta matching algorithm that allows it to only
 transfer over changed blocks.
 
 I'm not sure what is the cause of your problem. But I would first try
 naked rsync (without BackupPC) and with protocol manually set to
 protocol 28 so that md4 sums are used rather than md5sums. See what
 happens when you try to transfer the file...
 
 
I fetched the file with native rsync --protocol=28 from the client without
an error.
In the next step I backed this file up with BackupPC onto a machine within
my LAN and used cp -al to copy the directory into the last backup set of the
original client. That works: the original client ran its next backup without
an error.

Maybe my previous mistake was that I copied the file into the original
client's backup set without the attrib file.

Thanks again
br
Matthias
-- 
Don't Panic




[BackupPC-users] md4 doesn't match even though VSS is used

2011-09-11 Thread Matthias Meyer
Dear all,

I have a problem backing up a large file (9GB), because the client's
internet connection is interrupted every 24 hours.

BackupPC (V3.1.0) can rsync this file once, with status:
md4 doesn't match: will retry in phase 1; file removed
With lsof /var/lib/backuppc I can see this phase 1 transfer some minutes
later.
But the internet connection drops shortly before this second transfer
finishes :-(

I am sure that the source file (on a Windows client) is on a volume shadow
copy and that rsync is using this source, because:
- /etc/rsyncd.conf contains only one path = /cygdrive/Z/
- ls /cygdrive/ shows only the drives C and D
- ls /cygdrive/Z lists the same files as ls /cygdrive/C

So it should not be possible that the source has changed.

Does /usr/share/backuppc/lib/BackupPC/Xfer/RsyncFileIO.pm compare the md4
digest from the beginning of a transfer with a recalculated md4 digest at
the end of the transfer?

Does somebody else have a similar problem?

Is there any known solution to my problem?

What happens if I patch RsyncFileIO.pm so that it ignores the md4 mismatch?

I know I should try it instead of asking, but I'm not sure what the meaning
of md4 is, and hopefully someone can give me a hint.

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] bakuppc is not running.

2011-09-11 Thread Matthias Meyer
Adi Spivak wrote:
 hi.
 backuppc has stopped working 5 days ago.
 it was due to lack of storage space.
 i used this script:
 http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups
 which should have released hundreds of GB just from deleting the
 backups of one PC, but it does not, even though in the GUI it shows
 that the backups are gone.
 now, backuppc is just stuck or something and has done nothing for the
 past 5 days: no reorganizing of the backups, no incrementals, nothing.
 how can i fix this?
 how can i force it to remove the backups that were deleted by the
 script, to free room?
 thanks.

First, please send plain-text mails.
Second:
What is the content of your BackupPC status page? BackupPC stops working if
the pool file system is more than (configurable) 95% full.
What is the output of df -h /var/lib/backuppc?
Do you still find the deleted backups on the device? Do you still find
/var/lib/backuppc/pc/deletedPC, or numeric directories in it?

For your understanding:
/var/lib/backuppc/pc/deletedPC/backups contains the information shown by
the GUI.
/var/lib/backuppc/pc/deletedPC/n contains the backups themselves.


br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Remote Site Backup though Backuppc

2011-06-03 Thread Matthias Meyer
Pravakar Kumar wrote:

 Hi,
 
 We are using BackupPC to backup Local Site desktops. It is working gr8.
 Now we want to backup desktop of remote sites connected via Internet VPN.
 What is the best procedure to back up desktop files (users' folders) of a
 remote site through BackupPC?
 
 Regards,
 
 
 Pravakar Kumar
 Varun Beverages Limited
 09837893004

Use rsync or rsyncd, because rsync minimizes network traffic.
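In config.pl terms this is just the transfer-method setting; a sketch using
the standard BackupPC options:

```perl
$Conf{XferMethod} = 'rsync';    # ssh-based rsync; use 'rsyncd' for a daemon
```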

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Question about the Server Status cgi page

2011-05-13 Thread Matthias Meyer
Peter Lavender wrote:

 Hi Everyone,
 
 I've been running backuppc for a while now, and have been wondering
 about the backuppc Server Status web page.
 
 It always shows the same values:
 
 The servers PID is 15721, on host rabbit, version 3.1.0, started at 4/4
 13:49.
 
   * This status was generated at 5/9 20:38.
   * The configuration was last loaded at 4/8 18:32.
   * PCs will be next queued at 5/9 21:00.
   * Other info:
   * 0 pending backup requests from last scheduled wakeup,
   * 0 pending user backup requests,
   * 0 pending command requests,
   * Pool is 0.00GB comprising 0 files and 1 directories (as
 of 5/9 01:00),
   * Pool hashing gives 0 repeated files with longest chain
 0,
   * Nightly cleanup removed 0 files of size 0.00GB (around
 5/9 01:00),
   * Pool file system was recently at 70% (5/9 20:31),
 today's max is 70% (5/9 01:00) and yesterday's max was
 70%.
 
 
 And the RDD graph shows nothing of the BackupPC Pool Size for both 4
 weeks and 52 weeks.
 
 I'm wondering what I'm missing here.. how come this information isn't
 updating?
 
 Just to make sure, the setup I have is non standard in that my $topDir
 points to the mounted NAS where all backups go.
 
 The file system under $topDir looks as you'd expect when looking in
 the /usr/lib/backuppc directory.
 
 Am I missing a configuration setting to tell it where the backups are
 stored?
 
 
 
 Thanks
 
 Peter.
Moving $TopDir is a critical task; there are a lot of threads about this.
Much easier is to mount your NAS onto /var/lib/backuppc.
However - are you sure the status page always displays
...
The servers PID is 15721, on host rabbit, version 3.1.0, started at 4/4
13:49.

  * This status was generated at 5/9 20:38.
...
If so, try restarting BackupPC or rebooting your server.
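The mount-instead-of-move approach could look like this in /etc/fstab
(server name, export path, and NFS options are placeholders for your NAS):

```
# Illustrative: mount the NAS directly onto the BackupPC pool directory.
nas:/export/backuppc  /var/lib/backuppc  nfs  rw,hard  0  0
```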

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Backup retention and deletion.

2011-05-13 Thread Matthias Meyer
Robin Lee Powell wrote:

 
 So, I had FullKeepCntMin set really high, and I lowered it, and it's
 been several days, and yet I still have a ton of old fulls lying
 around.
 
 My questions:
 
 1.  When and how do deletions of the excess occur?
 
  2.  Why aren't they occurring?  (in particular, is it possible that I
 need to completely restart the server for that to work? we *always*
 have backups running, and there's still no stop when all current
 backups are complete option, so I avoid that)
 
 Here's the current config:
 
 $Conf{FullPeriod} = '3.7';
 $Conf{IncrPeriod} = '0.9';
 $Conf{FullKeepCnt} = '25';
 $Conf{FullKeepCntMin} = '5';
 $Conf{FullAgeMax} = '60';
 $Conf{IncrKeepCnt} = '45';
 $Conf{IncrKeepCntMin} = '10';
 $Conf{IncrAgeMax} = '60';
 
 And here's the list of backups with types and dates:
 
 # awk -F' ' '{ print $1, $2, strftime(%F, $3); }' backups
 0 full 2011-02-16
 4 full 2011-02-21
 10 full 2011-02-25
 16 full 2011-03-01
 22 full 2011-03-05
 28 full 2011-03-10
 34 full 2011-03-14
 40 full 2011-03-18
 46 full 2011-03-22
 51 full 2011-03-26
 52 incr 2011-03-27
 53 incr 2011-03-28
 54 incr 2011-03-28
 55 incr 2011-03-29
 56 incr 2011-03-30
 57 full 2011-03-31
 58 incr 2011-03-31
 59 incr 2011-04-01
 60 incr 2011-04-02
 61 incr 2011-04-03
 62 full 2011-04-04
 63 incr 2011-04-06
 64 incr 2011-04-07
 65 incr 2011-04-07
 66 full 2011-04-08
 67 incr 2011-04-09
 68 incr 2011-04-09
 69 incr 2011-04-10
 70 incr 2011-04-11
 71 incr 2011-04-12
 72 full 2011-04-12
 73 incr 2011-04-13
 74 incr 2011-04-14
 75 incr 2011-04-15
 76 incr 2011-04-15
 77 full 2011-04-16
 78 incr 2011-04-17
 79 incr 2011-04-17
 80 incr 2011-04-18
 81 incr 2011-04-19
 82 full 2011-04-20
 83 incr 2011-04-22
 84 incr 2011-04-22
 85 incr 2011-04-23
 86 full 2011-04-24
 87 incr 2011-04-26
 88 incr 2011-04-27
 89 incr 2011-04-27
 90 incr 2011-04-28
 91 full 2011-04-29
 92 incr 2011-04-29
 93 incr 2011-04-30
 94 incr 2011-05-02
 95 full 2011-05-03
 96 incr 2011-05-04
 97 incr 2011-05-04
 98 incr 2011-05-05
 99 incr 2011-05-06
 100 full 2011-05-07
 101 incr 2011-05-08
 102 incr 2011-05-09
 103 incr 2011-05-10
 104 full 2011-05-11
 105 incr 2011-05-11
 106 incr 2011-05-12
 107 incr 2011-05-13
 108 full 2011-05-13
 
 
 Thanks for your help.
 
 -Robin
 
Where is the problem? You already have 22 full backups. So your server will
keep the next 3 fulls as well, and will remove #0 once the 4th further full
backup has completed.
The count of retained incrementals is 45 - exactly what you configured.
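As an aside, $Conf{FullKeepCnt} can also be given as a list for
exponentially spaced retention, where entry i keeps fulls at roughly 2^i
times FullPeriod; a sketch with illustrative counts:

```perl
# 4 fulls at 1x FullPeriod, 4 at 4x, 4 at 32x (list entries 0, 2 and 5).
$Conf{FullKeepCnt} = [4, 0, 4, 0, 0, 4];
```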

br
Matthias


-- 
Don't Panic




Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-04-25 Thread Matthias Meyer
Hi Jeffrey,

Thanks for sending your perl script.
Unfortunately I can't answer you because:

   - The following addresses had permanent fatal errors -
j...@kosowsky.org

   - Transcript of session follows -
... while talking to smtp.secureserver.net.:

 550 5.7.1 SPF unauthorized mail is prohibited.
554 5.0.0 Service unavailable


So I write it to the mailing list:
Within my Debian install I need
use lib "/usr/share/backuppc/lib";
instead of
use lib "/usr/share/BackupPC/lib";
(I suppose there is no way around that in Perl itself?) Because I don't want
a symlink from /usr/share/BackupPC to /usr/share/backuppc, I changed your
source.

I get an error:
Can't locate BackupPC/jLib.pm in @INC ...
and unfortunately I couldn't find enough information about this jLib on the
internet. What is it? I believe it is not http://jlib.sourceforge.net/index.html.


br
Matthias
-- 
Don't Panic




[BackupPC-users] How to disable mailings from BackupPC

2011-04-06 Thread Matthias Meyer
Hi,

I run a copy of my production BackupPC in my development environment.
Is it possible to disable mail notifications from BackupPC in the
development environment?
As a workaround I've set $Conf{SendmailPath} = '/usr/sbin/sendmail-no';

Thanks
Matthias
-- 
Don't Panic




Re: [BackupPC-users] bare metal restore?

2011-04-04 Thread Matthias Meyer
Neal Becker wrote:

 Carl Wilhelm Soderstrom wrote:
 
 On 04/04 07:40 , Neal Becker wrote:
 Are there instructions for using backuppc for bare metal restore?
 
 Probably somewhere. It's fairly straightforward tho.
 
 Boot the bare-metal machine with Knoppix (or your choice of rescue
 disks). Partition and format the drives.
 Mount the partitions in the arrangement you want. (you'll have to make
 some directories in order to have mount points).
 
 Set up a listening netcat process to pipe to tar. will look something
 like: netcat -l -p |tar -xpv -C /path/to/mounted/empty/filesystems
 
 on the BackupPC server, become the backuppc user
 (Presuming it's a Debian box) run
 '/usr/share/backuppc/bin/BackupPC_tarCreate -n backup number -h
 hostname -s sharename path to files to be restored | netcat
 bare-metal machine '
 
 the 'backup number' can be '-1' for the most recent version.
 
 An example of the BackupPC_tarCreate command might be:
 /usr/share/backuppc/bin/BackupPC_tarCreate -n -1 -h target.example.com -s
 / / | netcat target.example.com 
 
 
 Thanks.
 
 Would there be a similar procedure using rsync?
 
rsync wouldn't be a good solution in this scenario. You don't have any data
on the client yet, so rsync would find nothing to compare against. Because
of that, other solutions like tar are smarter here - they are simply faster.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Viewing detail of a backup in progress?

2011-04-04 Thread Matthias Meyer
Carl Wilhelm Soderstrom wrote:

 On 04/04 07:14 , Holger Parplies wrote:
 in particular, they are compressed, so the end of the file is in my
 experience usually a considerable amount behind the file currently
 copying. This is also the reason you can't simply switch off buffering
 for the log files (compression needs reasonably sized chunks to operate
 on for efficient results). It might make sense to think about
 (optionally) writing log files uncompressed and compressing them after
 the backup has finished. Wanting to follow backup progress seems to be a
 frequent enough requirement. Putting the log files on a disk separate
 from the pool FS should probably be encouraged in this case ;-).
 
 These are all terribly good points.
 
 Perhaps the current file can simply be stored in memory and presented via
 the web interface? Is there a variable that already exists and can be read
 by the web interface to present the current file being copied?
 
Not really, not yet. But the counters are collected during the backup, and 
BackupPC_dump picks them up at the end of a backup:
my @results = $xfer->run();
$tarErrs   += $results[0];
$nFilesExist   += $results[1];
$sizeExist += $results[2];
$sizeExistComp += $results[3];
$nFilesTotal   += $results[4];
$sizeTotal += $results[5];

Furthermore, BackupPC_dump installs signal handlers like:
$SIG{TTIN} = \&catch_signal;

So it should be no problem to add an additional handler
$SIG{IO} = \&write_status;

which would then collect the current transfer rates and write them to a 
file.
But it could be a problem to write to that file if the signal arrives 
while a write is already in progress.
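
In shell terms, the idea of a signal-driven status dump looks roughly like this sketch (the status file location and the choice of USR1 are arbitrary here; the real handler would be Perl inside BackupPC_dump):

```shell
# On a signal, dump the current counters into a status file that the web
# interface could read. The write is one short redirect, so the window in
# which a second signal could corrupt the file is small.
STATUS=$(mktemp)
bytes=0
trap 'echo "bytes=$bytes" > "$STATUS"' USR1

bytes=12345           # ...updated continuously during the transfer...
kill -USR1 $$         # ask the running process for its current status
cat "$STATUS"         # -> bytes=12345
```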

Any Ideas?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Keeping 1 golden backup for a year

2011-03-25 Thread Matthias Meyer
Brad Alexander wrote:

 Okay, this kind of confuses me. I would like to have the following backup
 strategy:
 
 6 incrementals (*2)
 1 weekly (*2)
 1 yearly (*1)
 
 Currently,
 
 $Conf{FullKeepCnt} = [ 1, 0, 1, 0, 0, 1 ];
 
 I know this isn't right, but it's not what I would anticipate seeing. For
 instance, one of my backup targets has:
 
 Backup# Type Filled Level Start Date
  626 full yes 02/21 23:00
  645 full yes 03/15 01:02
  649 incr  no 03/19 01:00
  650 incr  no 03/20 01:00
  651 incr  no 03/21 01:00
  652 full yes 03/22 01:00
  653 incr  no 03/23 01:00
  654 incr  no 03/24 01:00
  655 incr  no 03/25 01:00
 
 incrs between 3/16 and 3/18 are missing because of a power outage and the
 backuppc filesystem not mounting correctly.
 
 So how should I set up my FullKeepCnt to keep one backup for a year and
 two sets of fulls/incrementals for the past two weeks?
 
 Thanks,
 --b
There are a lot of threads about this. In my view it is one of the few 
remaining areas for improvement within BackupPC.

I've developed a patch for V3.1 which can be found in backuppc.devel. It 
will meet your requirements. But I haven't migrated the patch to V3.2 yet, 
and I don't know when I will do that.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] host resolution problem

2011-03-20 Thread Matthias Meyer
Gabriel Rossetti wrote:

 On 03/19/2011 09:49 PM, Papp Tamas wrote:
 On 03/19/2011 09:02 PM, Gabriel Rossetti wrote:
 Hello everyone,

 I read up on how BackupPC finds hosts, in my case nmblookup works. The
 problem is that when it tries to run the ssh command, it uses the host's
 name and not the IP it finds with nmblookup, so ssh exits complaining it
 can't find the host. How can I get BackupPC to use the IP found using
 nmblookup? I thought it would automatically do this since it can't
 resolve it directly, but I guess not.

What about configuring $Conf{SshPath} to use a self-made script that calls 
ssh by IP instead of by name?
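
Such a wrapper might look like the sketch below. The parsing assumes nmblookup's usual "<IP> <name>" output format, and the function name is made up here:

```shell
# Hypothetical wrapper for $Conf{SshPath}: resolve the client via
# nmblookup, then run ssh against the IP (falling back to the name).
ssh_by_ip() {
    host="$1"; shift
    # nmblookup output is assumed to look like "192.168.1.50 myhost<00>";
    # take the first line that starts with a digit.
    ip=$(nmblookup "$host" 2>/dev/null | awk '/^[0-9]/ {print $1; exit}')
    ssh "${ip:-$host}" "$@"
}
```

Installed as an executable script, BackupPC would then call it exactly as it calls ssh today.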

Another (in my view much better) way is to configure your clients to 
register their DNS names with the DNS service.
On Linux you can achieve this by editing /etc/dhcp3/dhclient.conf or 
/etc/dhcp/dhclient.conf (at least on Debian)
and adding a line like 'send host-name "hostname";'

br
Matthias
-- 
Don't Panic




[BackupPC-users] Weekly full with no incremental configuration leads to daily full backups

2011-03-15 Thread Matthias Meyer
Hi,

I'm trying to configure a client to make only weekly full backups and no 
incrementals:

$Conf{FullAgeMax} = '732';
$Conf{FullKeepCnt} = [
  '4',
  '0',
  '13'
];
$Conf{FullKeepCntMin} = '15';
$Conf{IncrAgeMax} = '0';
$Conf{IncrKeepCnt} = '0';
$Conf{IncrLevels} = [
  '0'
];

But the above configuration leads to daily full backups.

Does anyone have a configuration like mine?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-03-15 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 
 I think there is a 3rd camp:
   3. Scripts that understand the special structure of the pool and pc
   trees and efficiently create lists of all hard links in pc
   directory.
 a] BackupPC_tarPCCOPY
 Included in standard BackupPC installations. It uses a perl
  script to recurse through the pc directory, calculate (and
   cache if you have enough memory) the file name md5sums and
 then uses that to create a tar-formatted file of the hard
   links that need to be created. This routine has been
   well-tested at least on smaller systems.
 
 b] BackupPC_copyPcPool
 Perl script that I recently wrote that should be significantly
 faster than [a], particularly on machines with low memory
 and/or slower cpus. This script creates a new temporary
 inode-number indexed pool to allow direct lookup of links and
 avoid having to calculate and check file name md5sums.  The
 pool is then rsynced (without hard links -- i.e. no -H flag)
 and then the restore script is run to recreate the hard
 links. I recently used this to successfully copy over a pool of
 almost 1 million files and a pc tree of about 10 million files.
 See the recent archives to retrieve a copy.
  
Hi Jeffrey,

I can't find your BackupPC_copyPcPool. I looked for it on the wiki as well 
as in backuppc.general.
What/where are these recent archives?

Thanks in advance
Matthias
--
Don't Panic




Re: [BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30

2010-12-20 Thread Matthias Meyer
Saturn2888 wrote:

 Anyone know anything about this?
 
 r...@name:~# rsync r...@localhost::
 rsync: server sent "rsync version 3.0.7 protocol version 30" rather than
 greeting
 rsync error: error starting client-server protocol (code 5) at
 main.c(1524) [Receiver=3.0.7]
 
 I've been getting this same error for months now. I wish I knew what was
 causing it so I can fix it. I have two servers both with this issue. Both
 have BackupPC and Zentyal 2.0 installed.
 
What is the rsync version (rsync --version) on both machines?
What is Zentyal?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Can't call method getStats on an undefined value

2010-12-20 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 Whenever DumpPreShareCmd fails, I get the following errors in my log:
 DumpPreShareCmd returned error status 32256... exiting
 Can't call method getStats on an undefined value at
 /usr/share/BackupPC/bin/BackupPC_dump line 1160.
 
 I understand the first line, but don't understand why I'm getting the
 second error.
 BTW, getStats appears several times in BackupPC_dump, but none are on
 line 1160.

Hi Jeffrey,
I would guess that the line number refers to the following else
statement.
Did you try to debug it and set breakpoints at lines 1160 and 1162?

My condolences; it seems to be a hard-to-evaluate error/feature.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rrdtool patch didn't display actual graphics

2010-12-19 Thread Matthias Meyer
Tyler J. Wagner wrote:

 On Tue, 2010-12-14 at 04:02 +0100, Matthias Meyer wrote:
 Hi,
 
 After a system crash and restore, the famous rrdtool graphic (a patch in
 the debian package of BackupPC 3.1.0) is no longer shown with current
 data. It seems that the graphic from before the crash is still shown.
 Both /var/lib/backuppc/log/pool.rrd and /var/log/backuppc/pool.rrd
 seem to be updated every night.
 ~#ls -dl /var/lib/backuppc /var/lib/backuppc/log
 /var/lib/backuppc/log/pool.rrd /var/log/backuppc/
 /var/log/backuppc/pool.rrd
 drwxr-xr-x 9 backuppc backuppc  4096 11. Dez 14:55 /var/lib/backuppc
 drwxr-x--- 2 backuppc backuppc  4096 13. Dez 21:53 /var/lib/backuppc/log
 -rw-r- 1 backuppc backuppc 11744 13. Dez 02:57
 /var/lib/backuppc/log/pool.rrd
 drwxr-xr-x 2 backuppc backuppc  4096 14. Dez 03:00 /var/log/backuppc/
 -rw-r- 1 backuppc backuppc 11744 14. Dez 03:55
 /var/log/backuppc/pool.rrd
 
 Another rrdtool, running on my server within webmin, works right.
 Therefore I assume that all necessary components are available/installed.
 
 Did your CPU architecture change (i386 to amd64) after the crash? If so:
 

http://www.tolaris.com/2010/09/06/rrdtool-this-rrd-was-created-on-other-architecture/
 
 If not, use rrdtool info /var/lib/backuppc/log/pool.rrd to verify the
 file is good.
 
 Regards,
 Tyler
 
Thanks for the hint.
I don't know why, but from the next day on everything worked fine.
I didn't change anything.
Maybe the explanation is:
- rrdtool runs only after a successful run of BackupPC_nightly
- BackupPC_nightly didn't run successfully for some nights because of server
restarts or something like that.

Thanks nevertheless
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Different Backup schedule issue (hourly and daily)

2010-12-19 Thread Matthias Meyer
mailing.lists.wam mailing.lists.wam wrote:

 Hi,
 
 I'm having strange issues with BackupPC 3.1.0 on ubuntu 10.04.
 
 1) I would like to configure backup schedule for a specified host this
 way, but couldn't find suitable config:
 
 Backup every hour and keep 72 backups ( 3 days) == IncrPeriod 0.03 ?
 IncrKeepCNT 72 ? IncrKeepCntMin 72 ? IncrLevels ?
 Keep 32 Full backups (a bit more than a month) = FullPeriod = 0.9 ?
 FullKeepCntMin = 32 ?
 Keep 12 Months backups (1 year)

IncrPeriod = 0.04 (because one day = 24 hours, 1/24 = 0.0416 ~ 0.04)
IncrKeepCnt = 72
IncrLevels = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, ... , 24]
FullPeriod = 0.97 (a full backup every day)
FullKeepCnt = [32, 0, 0, 0, 0, 12]
    (32 backups at  1 * $Conf{FullPeriod} =  1 day,
      0 backups at  2 * $Conf{FullPeriod} =  2 days,
      0 backups at  4 * $Conf{FullPeriod} =  4 days,
      0 backups at  8 * $Conf{FullPeriod} =  8 days,
      0 backups at 16 * $Conf{FullPeriod} = 16 days,
     12 backups at 32 * $Conf{FullPeriod} = 32 days)
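
The exponential spacing behind such a FullKeepCnt list can be spelled out in a few lines; this is a sketch of the documented bucket scheme (entry i keeps that many fulls, spaced 2^i * FullPeriod apart), not BackupPC code:

```shell
# [32, 0, 0, 0, 0, 12]: 32 daily fulls plus 12 fulls roughly 32 days
# apart, assuming FullPeriod is about one day.
i=0
for count in 32 0 0 0 0 12; do
    echo "bucket $i: keep $count full(s), one every $((1 << i)) x FullPeriod"
    i=$((i + 1))
done
```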

 
 In this case does the deduplication take care of hard disk drive space ?
Deduplication will also work in the above case and will save hard disk
space.
 
 2) For the other hosts i would like to setup schedule this way:
 Incremental each day
 Full on Sunday afternoon
 Keep 2 Fulls backups and 6 incremental for each full backup,
 I setup the schedule like this :
 FullPeriod = 6.9
 FullKeepCNT = 2
 IncrPeriod = 0.9
 IncrKeepCNT = 6
 IncrKeepCntMin = 1
 IncrLevels = 1,2,3,4,5,6
 Blackout Periods hourbegin=8, hourEnd=22, weekDays=1,2,3,4,5
 
 Is this setup correct? Sometimes BackupPC doesn't make a backup; many
 hosts have the Nothing to do state and an incremental backup age > 1.

I don't know if BackupPC supports backups this frequent.

If you want to keep 6 incrementals for each full you need IncrKeepCnt = 12.
Nobody can guarantee that the full backup runs on Sunday.
e.g.:
Incrementals run fine from Monday until Saturday.
The client is down on Sunday.
The full backup will then run on the following Monday. The next 6 incrementals
will run from Tuesday until Sunday, and the next full will run on Monday
again.

 Thanks for all your advices.

br
Matthias
-- 
Don't Panic




[BackupPC-users] rrdtool patch didn't display actual graphics

2010-12-13 Thread Matthias Meyer
Hi,

After a system crash and restore, the famous rrdtool graphic (a patch in the
debian package of BackupPC 3.1.0) is no longer shown with current data.
It seems that the graphic from before the crash is still being shown.
Both /var/lib/backuppc/log/pool.rrd and /var/log/backuppc/pool.rrd seem
to be updated every night.
~#ls -dl /var/lib/backuppc /var/lib/backuppc/log /var/lib/backuppc/log/pool.rrd 
/var/log/backuppc/ /var/log/backuppc/pool.rrd
drwxr-xr-x 9 backuppc backuppc  4096 11. Dez 14:55 /var/lib/backuppc
drwxr-x--- 2 backuppc backuppc  4096 13. Dez 21:53 /var/lib/backuppc/log
-rw-r- 1 backuppc backuppc 11744 13. Dez 02:57 
/var/lib/backuppc/log/pool.rrd
drwxr-xr-x 2 backuppc backuppc  4096 14. Dez 03:00 /var/log/backuppc/
-rw-r- 1 backuppc backuppc 11744 14. Dez 03:55 /var/log/backuppc/pool.rrd

Another rrdtool, running on my server within webmin, works right. Therefore
I assume that all necessary components are available/installed.

http://localhost/backuppc/index.cgi?image=5 shows the same graphic as 
BackupPC.

Any hints?
Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Generating folders to backup from script

2010-11-30 Thread Matthias Meyer
Cyril Lavier wrote:

 Matthias Meyer wrote:
 Cyril Lavier wrote:

   
 Hi.

 I need to backup some folders on a server, which are not static, and I
 usually generate the folder list using a command (find).

 On the documentation, I can't find anything on using a command to
 generate the folder list.

 I tried some things, like putting a command which edits the .pl file in
 $Conf{DumpPreUserCmd}, but it doesn't seems to work.

 If you have any idea, this could help me.

 Thanks.

 
 What about backing up the root folder and excluding the folders that
 should not be backed up?

 If you use rsync you can also modify the rsyncd.conf on client-side (your
 server) during $Conf{DumpPreUserCmd}.

 rsyncd.conf:
 [DynamicFolders]
  path = /whereverTheRoot
  exclude = *
  include = /whereverTheRoot/dynFolder1 /whereverTheRoot/dynFolder2 ...

 br
 Matthias
   
 Hi Matthias, that's a good idea.
 
 I was only thinking on rsync+ssh, and rsyncd also exists.
 
 For information, I tried to rsync the complete root folder. And it took
 hours only to generate the file list to transfer.
 
 This is why I had to use a find command to divide the file tree into
 smaller ones that are faster to back up.
 
 I will try your solution and reply to the list to inform about the
 results.
 
 Thanks.
 
Interesting idea.
Changing rsyncd.conf will definitely work. I do that within
Windows/cygwin to map the share to a VSS-mounted drive.

Can you explain your idea a little bit?
Do you back up all files, but in another way than rsync does?
Why do you believe find is faster than rsync?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Generating folders to backup from script

2010-11-28 Thread Matthias Meyer
Cyril Lavier wrote:

 Hi.
 
 I need to backup some folders on a server, which are not static, and I
 usually generate the folder list using a command (find).
 
 On the documentation, I can't find anything on using a command to
 generate the folder list.
 
 I tried some things, like putting a command which edits the .pl file in
 $Conf{DumpPreUserCmd}, but it doesn't seem to work.
 
 If you have any idea, this could help me.
 
 Thanks.
 
What about backing up the root folder and excluding the folders that should
not be backed up?

If you use rsync you can also modify the rsyncd.conf on the client side (your
server) during $Conf{DumpPreUserCmd}.

rsyncd.conf:
[DynamicFolders]
 path = /whereverTheRoot
 exclude = *
 include = /whereverTheRoot/dynFolder1 /whereverTheRoot/dynFolder2 ...
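
A $Conf{DumpPreUserCmd} helper that regenerates that include line from the folders currently on disk might be sketched like this (function name, module layout, and paths are hypothetical):

```shell
# Rewrite the "include" line of an rsyncd.conf module so that it lists
# whatever subfolders currently exist under the module root.
update_includes() {
    root="$1"; conf="$2"
    dirs=$(find "$root" -mindepth 1 -maxdepth 1 -type d | sort | tr '\n' ' ')
    sed -i "s|^ include = .*| include = ${dirs% }|" "$conf"
}
```

Run against the client's rsyncd.conf before each dump, the module then tracks the dynamic folder set without editing any BackupPC .pl file.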

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] New release of BackupPC_deleteBackup - who can put it into the wiki?

2010-10-31 Thread Matthias Meyer
Matthias Meyer wrote:
 
 Where is the problem?
 More than a year ago I asked for an update of the mediawiki entry.
 The entry is really outdated!
 
 Maybe someone can add me to the editor group so that I can update the
 entry.
 Send me a mail and ask for the necessary account information.
 
 Thanks in advance
 Matthias

Access granted. Thanks Craig :-)
How_to_delete_backups is up2date now:
https://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] New release of BackupPC_deleteBackup - who can put it into the wiki?

2010-10-30 Thread Matthias Meyer
Matthias Meyer wrote:

 Jeffrey J. Kosowsky wrote:
 
 Matthias Meyer wrote at about 22:04:42 +0200 on Sunday, October 10, 2010:
   Robin Lee Powell wrote:
   
On Sun, Dec 06, 2009 at 01:13:57AM +0100, Matthias Meyer wrote:
Hi,

I have a new release of the BackupPC_deleteBackup script.
Unfortunately I can't put it into the wiki
   
  

(http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups).
Jeffrey would do that but I didn't reach him via email :-(
 
 Sorry - I didn't receive any such email. Probably the best way to
 reach me is via the mailing list if email fails.
 
 Of course, Matthias is the original owner -- I just added a few tweaks
 ;)
 
 For my own internal version control, can you tell me if any behaviors
 changed other than adding --remove?
 Note I did a diff and noticed a fair number of changes but wasn't sure
 which are cosmetic, which are bug fixes, which are polish and
 which are significant. The reason I am asking is that I notice at
 least one change that would break the script on my
 install. Specifically, the Fedora package version stores the config in
 /etc/BackupPC while you changed it to /etc/backuppc. For this change
 maybe the path could look both places or maybe there should be a
 user-defined variable at the top of the script.
 
 Beyond that, I was wondering whether any of the fixes or changes
 may similarly work on some systems but cause breaks on others.
 
 Hi Jeffrey,
 
 All improvements which were provided in the wiki are included in this new
 release.
 
 In addition I've added:
 - remove XferLOG.number[.z] together with backup number
 - --remove will remove a host from /etc/backuppc/hosts as well as all
 of its backups
 - (today) look for /etc/BackupPC as well as for /etc/backuppc
 
 All the rest visible in a diff should be cosmetic, or changes not
 implemented by me.
 
 
 So I believe you can change the wiki text to:
 
 How to delete backups from the archive:
 Put this script somewhere in your path. It includes usage information when
 run without arguments.
 
 It will also remove incremental backups if a full backup is removed that
 the incrementals are based on. The incrementals will also be removed if
 they are filled. Here is room for improvement :-)
 
 FreeBSD
 if you want to use BackupPC on a FreeBSD system, you have to change
 the 'stat' calls. Change the two occurrences of the line:
 
 BackupTime=`stat -c %y
 $TopDir/pc/$client/$BackupNumber/backupInfo | awk '{print $1}'`
 
 into
 
 BackupTime=`stat -f %Sm -t %Y-%m-%d
 $TopDir/pc/$client/$BackupNumber/backupInfo`
 
 br
 Matthias

Where is the problem?
More than a year ago I asked for an update of the mediawiki entry.
The entry is really outdated!

Maybe someone can add me to the editor group so that I can update the
entry.
Send me a mail and ask for the necessary account information.

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Freeing up disk space (maybe with IncrFill)

2010-10-19 Thread Matthias Meyer
Jason A. Spiro wrote:

 Hi all,
 
 To all developers and other contributors on this list:  Thank you for
 contributing to BackupPC.
 
 My client's BackupPC server's 1TB backup-storage RAID array is almost
 out of space (96% full).  He's running BackupPC 3.1.0 on Ubuntu.  The
 server backs up perhaps a few dozen hosts.  It's set to keep at least
 one full backup and at least sixty incremental backups per host.
 
 1.  I guessed that IncrFill was a good way to save space:  this way
 only one full would be stored per machine.  Was I right?  Is IncrFill
 the best way to save space?
 
 2.  I enabled this a few hours ago and manually ran BackupPC_nightly
 but the server hasn't filled any incrementals.  Is there a reasonably
 easy way to force it to fill the incrementals?  (Otherwise I may just
 set it not to keep any incrementals:  I figure that should make
 BackupPC discard all of them, perhaps right after I reload the config
 file.)
 
 Thanks in advance,
 Jason Spiro
 

You will find some hints in the documentation:
- BackupPC's CGI interface automatically fills incremental backups
- The default for $Conf{IncrFill} is off, since there is no need to fill
incremental backups.

Every time you save a backup you need space for the new files. BackupPC
saves space by storing identical files only once.
But if you fill a backup, you only create hardlinks within that backup to
unchanged files in prior backups.
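
That hardlink point can be seen with plain filesystem tools. This is a generic demo (throwaway temp paths, not BackupPC's pool layout):

```shell
# A hard link adds a directory entry, not a copy: after "filling", both
# names point at the same inode, so the data exists on disk only once.
tmp=$(mktemp -d)
mkdir "$tmp/full.0" "$tmp/incr.1"
echo "unchanged file" > "$tmp/full.0/f"
ln "$tmp/full.0/f" "$tmp/incr.1/f"    # "fill" the incremental with a link

stat -c %h "$tmp/full.0/f"            # link count is now 2
```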

br
Matthias

-- 
Don't Panic




Re: [BackupPC-users] New release of BackupPC_deleteBackup - who can put it into the wiki?

2010-10-18 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 Matthias Meyer wrote at about 22:04:42 +0200 on Sunday, October 10, 2010:
   Robin Lee Powell wrote:
   
On Sun, Dec 06, 2009 at 01:13:57AM +0100, Matthias Meyer wrote:
Hi,

I have a new release of the BackupPC_deleteBackup script.
Unfortunately I can't put it into the wiki
   
  
(http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups).
Jeffrey would do that but I didn't reach him via email :-(
 
 Sorry - I didn't receive any such email. Probably the best way to
 reach me is via the mailing list if email fails.
 
 Of course, Matthias is the original owner -- I just added a few tweaks
 ;)
 
 For my own internal version control, can you tell me if any behaviors
 changed other than adding --remove?
 Note I did a diff and noticed a fair number of changes but wasn't sure
 which are cosmetic, which are bug fixes, which are polish and
 which are significant. The reason I am asking is that I notice at
 least one change that would break the script on my
 install. Specifically, the Fedora package version stores the config in
 /etc/BackupPC while you changed it to /etc/backuppc. For this change
 maybe the path could look both places or maybe there should be a
 user-defined variable at the top of the script.
 
 Beyond that, I was wondering whether any of the fixes or changes
 may similarly work on some systems but cause breaks on others.
 
Hi Jeffrey,

All improvements which were provided in the wiki are included in this new
release.

In addition I've added:
- remove XferLOG.number[.z] together with backup number
- --remove will remove a host from /etc/backuppc/hosts as well as all of
its backups
- (today) look for /etc/BackupPC as well as for /etc/backuppc

All the rest visible in a diff should be cosmetic, or changes not
implemented by me.


So I believe you can change the wiki text to:

How to delete backups from the archive:
Put this script somewhere in your path. It includes usage information when 
run without arguments.

It will also remove incremental backups if a full backup is removed that 
the incrementals are based on. The incrementals will also be removed if 
they are filled. Here is room for improvement :-)

FreeBSD
If you want to use BackupPC on a FreeBSD system, you have to change 
the 'stat' calls. Change the two occurrences of the line: 

BackupTime=`stat -c %y 
$TopDir/pc/$client/$BackupNumber/backupInfo | awk '{print $1}'`

into 

BackupTime=`stat -f %Sm -t %Y-%m-%d 
$TopDir/pc/$client/$BackupNumber/backupInfo`
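
Instead of editing the script per platform, the two variants could be folded into one helper. A sketch (the function name is made up; it probes for GNU stat's -c option and falls back to BSD syntax):

```shell
# Return the backup date (YYYY-MM-DD) of a backupInfo file with either
# GNU stat (Linux) or BSD stat (FreeBSD).
backup_time() {
    f="$1"
    if stat -c %y "$f" >/dev/null 2>&1; then
        stat -c %y "$f" | awk '{print $1}'    # GNU coreutils syntax
    else
        stat -f %Sm -t %Y-%m-%d "$f"          # BSD stat syntax
    fi
}
```

The script would then call backup_time "$TopDir/pc/$client/$BackupNumber/backupInfo" on both systems unchanged.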

br
Matthias
-- 
Don't Panic


BackupPC_deleteBackup.sh
Description: application/shellscript


Re: [BackupPC-users] New release of BackupPC_deleteBackup - who can put it into the wiki?

2010-10-10 Thread Matthias Meyer
Robin Lee Powell wrote:

 On Sun, Dec 06, 2009 at 01:13:57AM +0100, Matthias Meyer wrote:
 Hi,
 
 I have a new release of the BackupPC_deleteBackup script.
 Unfortunately I can't put it into the wiki

(http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups).
 Jeffrey would do that but I didn't reach him via email :-(
 
 Anybody else here would put it into the wiki?
 
 Did this ever get done?  I'm guessing not, given the last-modified
 date.  Could you send the new version to the list in the meantime?
 
 -Robin
 
Unfortunately, no :-(

I will attach the current version.

br
Matthias

-- 
Don't Panic


BackupPC_deleteBackup.sh
Description: application/shellscript


Re: [BackupPC-users] recurrent backup failed' messages

2010-07-30 Thread Matthias Meyer
Jack M. Nilles wrote:

 
 *Re: [BackupPC-users] recurrent backup failed' messages*
 From: Matthias Meyer matthias.me...@gm... - 2010-07-29 20:12
 
 Jack M. Nilles wrote:
 
  I'm moving BackupPC to a new machine, running SUSE 11.2. After
  installation I get nothing but the dreaded pink background and:
  'backup failed (Unable to read 4 bytes)'
  messages for each host being backed up. Yet, if I start a backup via:

  sudo -u backuppc /usr/local/BackupPC/bin/BackupPC_dump -v -fhost

  I do indeed get the backup, along with verbose output, provided I keep
  tabs on each backup section, since a password is required at each
  section.

  What little detail am I missing here?
 
 Please tell us something about your configuration:
   - connection to client (smb-share, ssh, rsync, ... )
   - xfermethod (smb, rsync, rsyncd, ... )
   - OS of client
 
 The connections to clients are all ssh
 Xfer method is rsync
 Client OSes are OS X and Linux
 
 Currently I'm testing this by putting the backed up files on the server
 but plan to move them to a USB hard drive when all is in operating order.
 
 The $Conf{RsyncShareName} entries usually have multiple entries; the
 BackupPC_dump fails during the transition from entry to entry if I'm not
 immediately there to input the root password. The files are being backed
 up but it sure isn't automatic.
 
 It almost seems like I'm getting timeouts and need to change some
 configuration setting to avoid it.

If so, you should have some log files, both from BackupPC and from the
client side. Try increasing the log verbosity of rsync.
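If it helps, increasing the verbosity from the BackupPC side can be sketched as a config.pl fragment (the argument list below mirrors the defaults quoted earlier in this thread; the extra `--verbose` is the only addition, and is illustrative):

```perl
# config.pl sketch: make rsync chattier so failures between share
# transitions show up in the XferLOG. Arguments are illustrative.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group', '-D',
    '--links', '--hard-links', '--times', '--block-size=2048',
    '--recursive',
    '--verbose',          # added: more detail in the transfer log
];
```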

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] recurrent 'backup failed' messages

2010-07-29 Thread Matthias Meyer
Jack M. Nilles wrote:

 I'm moving BackupPC to a new machine, running SUSE 11.2. After
 installation I get nothing but the dreaded pink background and:
 'backup failed (Unable to read 4 bytes)'
 messages for each host being backed up. Yet, if I start a backup via:
 
 sudo -u backuppc /usr/local/BackupPC/bin/BackupPC_dump -v -f host
 
 I do indeed get the backup, along with verbose output, provided I keep
 tabs on each backup section, since a password is required at each section.
 
 What little detail am I missing here?

Please tell us something about your configuration:
 - connection to client (smb-share, ssh, rsync, ... )
 - xfermethod (smb, rsync, rsyncd, ... )
 - OS of client

 I never had this problem with the 
 old machine running SUSE 11.1.

First of all -  
 
 jackn

-- 
Don't Panic




Re: [BackupPC-users] Trying to work around Windows filename too long problem

2010-07-27 Thread Matthias Meyer
James Ward wrote:

 full backup started for directory cDrive; updating partial #305
 Connected to tartarus:873, remote version 29
 Negotiated protocol version 28
 Connected to module cDrive
 Sending args: --server --sender --numeric-ids --perms --owner --group -D
 --links --hard-links --times --block-size=2048 --recursive --ignore-times
 . . Sent exclude: /Documents and Settings/*/Cookies Sent exclude:
 /Documents and Settings/*/Local Settings/Temporary Internet Files Sent
 exclude: /Documents and Settings/*/Local Settings/Temp Sent exclude:
 /Documents and Settings/*/NTUSER.DAT* Sent exclude: /Documents and
 Settings/*/ntuser.dat* Sent exclude: /Documents and Settings/*/Local
 Settings/Application Data/Microsoft/Windows/UsrClass.dat* Sent exclude:
 /Documents and Settings/*/Local Settings/Application
 Data/Mozilla/Firefox/Profiles/*/Cache Sent exclude: /Documents and
 Settings/*/Local Settings/Application
 Data/Mozilla/Firefox/Profiles/*/OfflineCache Sent exclude: /Documents and
 Settings/*/Recent Sent exclude: *.lock Sent exclude: Thumbs.db Sent
 exclude: IconCache.db Sent exclude: Cache*
 Sent exclude: cache*
 Sent exclude: /WINDOWS
 Sent exclude: /RECYCLER
 Sent exclude: /MSOCache
 Sent exclude: /System Volume Information
 Sent exclude: /AUTOEXEC.BAT
 Sent exclude: /BOOTSECT.BAK
 Sent exclude: /CONFIG.SYS
 Sent exclude: /hiberfil.sys
 Sent exclude: /pagefile.sys
 Sent exclude: .Trash
 Sent exclude: /Trash
 Sent exclude: /automount
 Sent exclude: /Network
 Sent exclude: /private/var/automount
 Sent exclude: /private/var/run
 Sent exclude: /private/var/vm
 Sent exclude: /private/var/tmp
 Sent exclude: /private/tmp
 Sent exclude: Caches
 Sent exclude: CachedMessages
 Sent exclude: /dev/fd
 Remote[1]: rsync: readlink ygdrive/c/WINDOWS/system32/c:/Program
 Files/Symantec/Symantec Endpoint Protection

Manager/Inetpub/ClientPackages/75a504099329ffbe483107be4c78de0c/full/Symantec
 Endpoint

Protection.mpkg/Contents/plugins/ClientType.bundle/Contents/Resources/English.lproj/ClientType.nib/classes.nib
 (in cDrive) failed: File name too long (91)

 [additional similar messages snipped by Matthias]

 Xfer PIDs are now 14475 
 [ skipped 482 lines ]
 Can't write 34046 bytes to socket
 Read EOF:
 Tried again: got 0 bytes
 Child is aborting
 Parent read EOF from child: fatal error!
 Done: 380 files, 319900497 bytes
 Got fatal error during xfer (Child exited prematurely)
 Backup aborted (Child exited prematurely)
 Not saving this as a partial backup since it has fewer files than the
 prior one (got 380 and 380 files versus 12806)
 


Hi James,

Please write your answer below the quoted text, so the thread is easier to
read. Also remove lines of no interest, to keep the message short and clear.
:-)

Both cases, with and without the exclude, result in a "Read EOF":
one with the error "Connection reset by peer" and the other with
"Child is aborting".

I would believe "Connection reset by peer" is a network interruption
and "Child is aborting" means the client program died.
But I am not sure. A client logfile would be helpful.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Trying to work around Windows filename too long problem

2010-07-26 Thread Matthias Meyer
James Ward wrote:

 Hi,
 
 Last week, Symantec Antivirus server software was upgraded on a couple of
 servers.  BackupPC has been faling to back them up since with filename too
 long errors.  I'm trying to exclude the directory containing these problem
 files, but now I'm getting:
 
 
 full backup started for directory cDrive; updating partial #305
 Connected to tartarus:873, remote version 29
 Negotiated protocol version 28
 Connected to module cDrive
 Sending args: --server --sender --numeric-ids --perms --owner --group -D
 --links --hard-links --times --block-size=2048 --recursive --ignore-times
 . . Sent exclude: /Documents and Settings/*/Cookies Sent exclude:
 /Documents and Settings/*/Local Settings/Temporary Internet Files Sent
 exclude: /Documents and Settings/*/Local Settings/Temp Sent exclude:
 /Documents and Settings/*/NTUSER.DAT* Sent exclude: /Documents and
 Settings/*/ntuser.dat* Sent exclude: /Documents and Settings/*/Local
 Settings/Application Data/Microsoft/Windows/UsrClass.dat* Sent exclude:
 /Documents and Settings/*/Local Settings/Application
 Data/Mozilla/Firefox/Profiles/*/Cache Sent exclude: /Documents and
 Settings/*/Local Settings/Application
 Data/Mozilla/Firefox/Profiles/*/OfflineCache Sent exclude: /Documents and
 Settings/*/Recent Sent exclude: *.lock Sent exclude: Thumbs.db Sent
 exclude: IconCache.db Sent exclude: Cache*
 Sent exclude: cache*
 Sent exclude: /WINDOWS
 Sent exclude: /RECYCLER
 Sent exclude: /MSOCache
 Sent exclude: /System Volume Information
 Sent exclude: /AUTOEXEC.BAT
 Sent exclude: /BOOTSECT.BAK
 Sent exclude: /CONFIG.SYS
 Sent exclude: /hiberfil.sys
 Sent exclude: /pagefile.sys
 Sent exclude: .Trash
 Sent exclude: /Trash
 Sent exclude: /automount
 Sent exclude: /Network
 Sent exclude: /private/var/automount
 Sent exclude: /private/var/run
 Sent exclude: /private/var/vm
 Sent exclude: /private/var/tmp
 Sent exclude: /private/tmp
 Sent exclude: Caches
 Sent exclude: CachedMessages
 Sent exclude: /dev/fd
 Sent exclude: /Program Files/Symantec/Symantec Endpoint Protection
 Manager/Inetpub/ClientPackages Read EOF: Connection reset by peer
 Tried again: got 0 bytes
 fileListReceive() failed
 Done: 0 files, 0 bytes
 Got fatal error during xfer (fileListReceive failed)
 Backup aborted (fileListReceive failed)
 Not saving this as a partial backup since it has fewer files than the
 prior one (got 0 and 0 files versus 12806)
 
 These servers have backed up fine for years until now.  Any ideas?
 
 Thanks in advance,
 
 James
 
"Read EOF: Connection reset by peer" indicates that your client interrupted
the connection.
Do you have a log from the client side?

What happens if you remove the exclude of "/Program Files/Symantec/Symantec
Endpoint Protection Manager/Inetpub/ClientPackages" -
does the backup then work (with the filename-too-long errors)?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] backuppc restore

2010-07-24 Thread Matthias Meyer
Huba Zsolt wrote:

 Hi
 
 I'm trying to configure a simple backup solution for a development
 server, backuppc runs on the server. I use Ubuntu 8.04, backuppc
 3.0.0.
 It seems that creating backups working fine (0 errors on summary page)
 but restore doesn't work. I tried to restore more files with different
 permissions but neither of them were restored. I always get this error
 message:
 
 Running: /usr/bin/ssh -q -x -l root localhost env LC_ALL=C /bin/tar -x
 -p --numeric-owner --same-owner -v -f - -C /tmp
 Running: /usr/share/backuppc/bin/BackupPC_tarCreate -h localhost -n 6
 -s /tmp -t /teszt1
 Xfer PIDs are now 7668,7669
 Tar exited with error 65280 () status
 restore failed: BackupPC_tarCreate failed
 
 I also would like to ask what is the preferred method to copy the
 created backup to offline media? I would like to make the backup to
 the server itself then I would like to copy/sync to an external hdd.
 Now I sync the backup with this command: rsync -a /backupdir
 /externalHddDir
 
 thanks for help.
 
 Hubi

I don't know exactly, because I am using rsyncd instead of tar.
But Google is your friend, e.g.:
http://www.backupcentral.com/phpBB2/two-way-mirrors-of-external-mailing-lists-3/backuppc-21/tar-exited-with-error-65280-69620/

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] generate first backup

2010-07-04 Thread Matthias Meyer
Bob Wooden wrote:

 I have been using BackupPC for six months or more now to backup one
 computer here at my home. This gave myself a little time to learn long
 term ramifications of various settings I tried. (My BackupPC server runs
 on a Ubuntu 10.04LTS (lucid) server, if that is important to anyone.)
 
 Now, adding additional machine to the BackupPC host list, I had issues
 getting the first backup to run. I checked and re-checked my
 passwordless ssh settings. Confirmed that 'ssh -l root
 [client_ip_address] whoami' returned correct answer as 'su backuppc'
 user (answer came back as 'root'). But, when I returned to the home page
 of the client, clicking the 'Start Full Backup' would fail.
 
 Then, I remembered experiencing this during my initial setup six months
 or so ago.
 
 On the server, I would run the troubleshoot command:
 
 'sudo -u backuppc /usr/share/backuppc/bin/BackupPC_dump -v -f
 [backupclient]'
 
 
 After the command line completed, clients began working properly.
 
 Now, I have completed adding three other clients to my backup list. All
 three had to have the troubleshoot command run once to complete the
 setup process.
 
 The bottom line here is that BackupPC would not backup my clients before
 I ran the troubleshoot command line instruction. I did not change any
 client setting or configuration. And after running the command, BackupPC
 functions properly, proceeding with automatic backups, etc. like it
 should.
 
 I am not complaining, but is there an easier way? Does everyone have to
 run their first backup by the command troubleshoot line, like I did?
 
Hello Bob,

I have some Linux as well as Windows clients using public/private keys over
ssh, and I have never had this problem with any of them.
Have you established the host key on the client?
What is the error message when 'Start Full Backup' fails?

BackupPC_dump isn't a troubleshooting command but the normal backup
command, normally called by the BackupPC daemon.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Exclude a day

2010-06-27 Thread Matthias Meyer
Gerald Brandt wrote:

 Hi,
 
 I'd like to exclude a day in the backup cycle.  Currently, backups occur
 on every day of the week.  However, on Saturdays, the BackupPC server is
 doing automated archives and I'm copying VM snapshots to it as well.  That
 makes the system quite busy and backups slow way down.  For example, and
 incremental that takes 250 minutes normally now takes 700 minutes.
 
 Is there a way to stop backups on Saturdays?
 
 Thanks,
 Gerald
 

Yes. See $Conf{BlackoutPeriods}.
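A sketch of such a configuration (the hours are only an example; BackupPC counts weekdays with 0 = Sunday):

```perl
# config.pl sketch: no automatic backups on Saturdays.
$Conf{BlackoutPeriods} = [
    {
        hourBegin =>  0.0,
        hourEnd   => 23.5,
        weekDays  => [6],   # 6 = Saturday
    },
];
```

Note that blackout only takes effect for a host once $Conf{BlackoutGoodCnt} consecutive pings have succeeded.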

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Backup never completes

2010-06-26 Thread Matthias Meyer
 Child is aborting
 Got exit from child

indicates a problem on the client side. Try increasing the log verbosity on
the client side (see man rsync).

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Backup never completes

2010-06-17 Thread Matthias Meyer
B. Alexander wrote:

 Hey,
 
 I have a single virtual host (OpenVZ) that never completes a backup.
 Neither incrementals nor fulls complete. I don't see any errors in the
 logs:
 
 2010-06-15 18:14:19 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /lib/modules
 2010-06-15 18:14:19 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /home
 2010-06-15 18:14:19 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /etc
 2010-06-15 18:14:21 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /var/backups
 2010-06-15 18:14:22 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /var/cache/apt
 2010-06-15 18:14:34 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /var/lib/apt
 2010-06-15 18:14:43 incr backup started back to 2010-06-12 07:00:02
 (backup #624) for directory /var/lib/dpkg
 2010-06-16 07:04:24 Aborting backup up after signal INT
 2010-06-16 07:04:25 Got fatal error during xfer (aborted by user
 (signal=INT)) 2010-06-16 07:04:34 full backup started for directory
 /lib/modules (baseline backup #624)
 2010-06-16 07:04:35 full backup started for directory /home (baseline
 backup #624)
 2010-06-16 07:04:41 full backup started for directory /etc (baseline
 backup #624)
 2010-06-16 07:04:50 full backup started for directory /var/backups
 (baseline backup #624)
 2010-06-16 07:04:51 full backup started for directory /var/cache/apt
 (baseline backup #624)
 2010-06-16 07:05:02 full backup started for directory /var/lib/apt
 (baseline backup #624)
 2010-06-16 07:05:15 full backup started for directory /var/lib/dpkg
 (baseline backup #624)
 
 /var/lib/dpkg is not empty, and this host has backed up successfully for
 well over a year. The only difference is that I moved it to another
 physical host, however, the other 5 VMs on that machine back up fine.
 
 Any ideas on where to look for the problem? I have rebooted the VM, and
 the backup still stops at the same point.
 
 Thanks,
 --b

Maybe you could tell us which backup method you use!?
- You could increase log verbosity on the server side.
- You could try: watch lsof /var/lib/backuppc

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Retrieve the proper incremental level

2010-06-17 Thread Matthias Meyer
Inno wrote:

 Hello,
 
 I use incremental level (1,2,3,4 correspond to Monday, Tuesday, Wednesday,
 Thursday). But Wednesday and Thursday have bugged last week.It stopped
 at level 2. Am I required to reactivate two incremental to retrieve the
 proper level?
 
 Thanks.
 

The incremental levels don't correspond to weekdays!?
What do you mean by "bugged" - buggy, failed, ...?
BackupPC does not stop at any level. If a backup fails it will be retried
at the next WakeupSchedule.
But you will then get a backup of level 3 for Friday.
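For reference, a level scheme like the one described (Monday to Thursday as levels 1 to 4, with a weekly full) can be sketched as follows; the exact weekday alignment depends on $Conf{WakeupSchedule} and when the full actually lands:

```perl
# config.pl sketch: weekly full plus deepening incrementals.
$Conf{FullPeriod} = 6.97;          # roughly weekly fulls
$Conf{IncrPeriod} = 0.97;          # roughly daily incrementals
$Conf{IncrLevels} = [1, 2, 3, 4];  # Mon..Thu if the full runs Friday
```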

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rsyncd, Parent read EOF from child

2010-06-17 Thread Matthias Meyer
Alexander Moisseev wrote:

 I have BackupPC configured to backup several directories on the same
 server as different BackupPC hosts. It works without a problem more than 2
 years. But now backup of one directory interrupts during transfer when it
 still works normally for other ones.
 

There are a lot of problems with rsyncd on Windows, at least with Cygwin
prior to V1.7.
I found a lot of information (but no solution) about this in different
mailing lists within the last two years :-(
So it is very interesting that it does work with cygwin-rsyncd-2.6.8_0 on
Windows Server 2003 Std R2.

I use rsync instead of rsyncd and am happy with that.

"Got fatal error during xfer (Child exited prematurely)" means that your
cygwin-rsyncd died.
You should increase the log verbosity of cygwin-rsyncd and check against
this log.
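Raising the daemon-side verbosity is a matter of rsyncd.conf, e.g. (a sketch; the log path is illustrative for cygwin-rsyncd):

```ini
# rsyncd.conf fragment (sketch)
log file = c:/rsyncd/rsyncd.log
transfer logging = yes
max verbosity = 3
```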

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Host groups

2010-06-09 Thread Matthias Meyer
Tóth Andor wrote:

 Hello,
 
 I?d like to make host groups, and set configuration according to group
 membership (e.g.: webservers, databases). I have made a workaround by
 creating symlinks for host configurations, but does BackupPC have some
 built in solutions for this?
 
 Regards,
 Andor

Unfortunately, no.

-- 
Don't Panic




Re: [BackupPC-users] Complex include/excludes with rsync?

2010-06-06 Thread Matthias Meyer
Brian Mathis wrote:

 I am running backups using the rsync transport method.  The docs on
 BackupFilesOnly indicate if this is used at all, then only the files
 listed there are backed up.  To me this implies that excludes are then
 ignored.  The docs on BackupFilesExclude warn that one can only
 specify BackupFilesOnly and BackupFilesExclude once for smb shares,
 which seems to contradict this.
 
 I am trying to perform something like the following:
 exclude /home
 include /home/user/data
 
 Is this possible without resorting to a custom rsync command line?  Am
 I missing something here where includes and excludes are applied in a
 certain order?
 
 Thanks.
 
 --
 
I include directories via BackupFilesOnly and exclude other directories
via "exclude from =" within rsyncd.conf.
But nevertheless your example above would not work: if you exclude /home
you can't include anything beneath it.
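(That is true for BackupFilesOnly/BackupFilesExclude. Plain rsync filter rules, passed via $Conf{RsyncArgs}, are evaluated in order, so includes listed before the exclude can in principle keep a subtree - a hypothetical sketch, untested against BackupPC's argument handling:)

```perl
# config.pl sketch (hypothetical): rsync uses the first matching filter
# rule, so the includes for the wanted subtree must precede the exclude
# of everything else under /home.
$Conf{RsyncArgs} = [
    # ... BackupPC's default rsync arguments ...
    '--include=/home/',
    '--include=/home/user/',
    '--include=/home/user/data/***',
    '--exclude=/home/**',
];
```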

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rsync restore to Vista - chown fail

2010-05-16 Thread Matthias Meyer

I switched successfully to Cygwin V1.7.5 and it seems to work great :-)
Four things are worth emphasizing:
1) Usernames in V1.5 are case-insensitive, but not in V1.7.5;
the same login with different capitalization is treated as a different user.
  http://cygwin.com/ml/cygwin-announce/2010-03/msg7.html

2) sshd_config:
"AuthorizedKeysFile .ssh/authorized_keys" is not the same
as "#AuthorizedKeysFile .ssh/authorized_keys".
I had to comment out the line in V1.7.5 but not in V1.5.
  http://cygwin.com/ml/cygwin-announce/2010-04/msg00026.html

3) I installed Cygwin V1.7.5 in the same directory as the V1.5 before. In
this case all the Cygnus keys within the registry have to be removed. If
you install V1.7 in a different directory than V1.5, they can work in
parallel.

4) Within Vista some Windows programs (e.g. attrib) have problems with
junction points if they run under a user with the SeBackupPrivilege
privilege. This can be solved by: cygdrop -p SeBackupPrivilege attrib /S /D ...

Thanks to the newsgroup cyg...@cygwin.com for their help with the solutions :-)

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Vista client backup problem

2010-05-14 Thread Matthias Meyer
Guillaume JAOUEN wrote:

 Child is aborting

means that your Vista client aborted the connection.

Do you have the rsync logfile from your vista client?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rsync restore to Vista - chown fail

2010-05-10 Thread Matthias Meyer
ALSto wrote:

 Matthias Meyer wrote:
 Hello,

 I backup some Vista clients with backuppc 3.1.0 and rsyncd.
 The backup works perfect but the restore not.
 I get an error:
 2010/05/02 23:44:46 [2756] receiving file list
 2010/05/02 23:44:46 [2756] rsync:
 chown /Users/matthias/Desktop/.logfile.lnk.Ud0Zgp (in C) failed:
 Permission denied (13)

 I use the same configuration with Windows XP, where it works.

 Does someone have a working restore to a Vista client?

 Thanks in advance
 Matthias
   
 Hi Matthias;
 
 I have issues with two Vista clients also. I get the files restored to
 the right folder but the target folder is strangely now shared.
 Permissions are reset to a mess with Everyone Full Control and have no
 previous Owner. I have not been able to figure this out for sometime now
 so I have resorted to restoring files to .tar or .zip and expanding to
 copy them back to the necessary folders so perms get reset by parent.
 
 It appears either Cygwin rsync or rsyncd is mangling the original ACL or
 cannot read it upon restore. The XFer log indicates a correct permission
  group ownership when it is backed up.
 
   Allen...
 
It is a problem with Cygwin V1.5. I will switch to V1.7.
Hopefully I won't run into trouble with the rsyncd ;-)

br
Matthias
-- 
Don't Panic




[BackupPC-users] rsync restore to Vista - chown fail

2010-05-02 Thread Matthias Meyer
Hello,

I back up some Vista clients with backuppc 3.1.0 and rsyncd.
The backup works perfectly but the restore does not.
I get an error:
2010/05/02 23:44:46 [2756] receiving file list
2010/05/02 23:44:46 [2756] rsync:
chown /Users/matthias/Desktop/.logfile.lnk.Ud0Zgp (in C) failed:
Permission denied (13)

I use the same configuration with Windows XP, where it works.

Does someone have a working restore to a Vista client?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] NFS mount filled backuppc partition, need help recovering

2010-04-18 Thread Matthias Meyer
B. Alexander wrote:

 Hi all,
 
 I shot myself in the foot, and need to pick your brains about how to
 recover. My backup machine has a 500GB drive using LVM for my backup
 partition. I managed to fill it up. Currently, backups are not running
 (which makes me nervous), and the backuppc partition on this machine is
 at:
 
 /dev/mapper/vg01-backuppc
  488367552 485798240   2569312 100% /media/backuppc
 
 I have a machine, my workstation (defiant), that has a bunch of stuff on
 it in /media/archive. The backup is about 250 GB before pooling and
 compression. Well, I have been slowly migrating data from /media/archive
 to another machine which now has a 1.5 TB drive. All well and good, except
 that without thinking about backups, I nfs mounted the filesystem to which
 I am migrating stuff from defiant. I mounted it under /media/archive, so
 when the last full backup ran on defiant, it filled the filesystem on the
 backup host.
 
 I got an email from the Backuppc Genie, and started clearing some old
 backups. After (I assume) Backuppc_nightly ran, I am still at 100%. I
 started digging in to see why the filesystem was still full. In the
 /media/backuppc/pc/defiant directory, there were two large directories,
 
 # du -sh *
 219G   869
 210G   944
 
 However, BackupPC_delete only shows backup number 869
 in /media/backuppc/pc/defiant:
 
 # BackupPC_delete -c defiant -l
 BackupNumber 869 - full-Backup from 2010-01-23
 
 and the web interface shows the same. Only 869.
 
 Can I delete the 944 directory without adverse effects? If not, what is
 the best way to free up this drive space?

Try: cat /media/backuppc/pc/defiant/backups
If there is no backup number 944 in it, you can remove the directory.
You have to remove /media/backuppc/pc/defiant/XferLOG.944
or /media/backuppc/pc/defiant/XferLOG.944.z too.
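That check can be scripted; a cautious sketch (host name, number, and top directory taken from the example above):

```shell
# Sketch: remove a backup directory only if its number is absent from the
# host's "backups" index (first tab-separated field of each line).
host=defiant
num=944
top=/media/backuppc
if ! cut -f1 "$top/pc/$host/backups" 2>/dev/null | grep -qx "$num"; then
    rm -rf "$top/pc/$host/$num" \
           "$top/pc/$host/XferLOG.$num" \
           "$top/pc/$host/XferLOG.$num.z"
fi
```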

 
 Also, I have another 120GB of free space in another volume group. Is there
 a way to integrate that into the backup filesystem? With all the hard
 links, I wasn't sure how best to allocate the space.

I use LVM too and I am able to add and remove space on my backup volume.
The hardlinks are no problem.

 
 /dev/mapper/vg00-archive
  156962116 32840 156929276   1% /media/archive
 
 Unfortunately, since I use LUKS encryption on the drives, spanning a VG
 across two drives is contraindicated. (It decrypts the first drive, but
 not the second, so the volume group can't open.) Can I still use the 150GB
 on vg00-archive in /media/backuppc?

But I don't use LUKS encryption, so I can't say if that will work with
resized volumes.

You should test it ;-)

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Incremental backup: What files changed?

2010-04-05 Thread Matthias Meyer
Luis Paulo wrote:

 I was trying to get a way to find what files have changed in an
 incremental backup.
 Does any one has a solution for it already?
 
 I've look at /var/lib/backuppc/pc/host/backups and to the XferLOG, but
 i'm getting nowhere.
 I did a manual incremental a few minutes after another, and I get:
 
 echo -e "n   type\tFiles\tExist\tNew\tlevel"; \
  awk '{print $1," ",$2,"\t",$5,"\t",$7,"\t",$9,"\t",$21}' \
  /var/lib/backuppc/pc/portatil/backups
  n  type   Files  Exist  New  level
 35  incr   36     27     43   4
 
 Those are the numbers that appears on the GUI. How they relate (or if)
 with each other I don't know. (btw, It's a level 4 incremental following a
 level 3)
 
 But if I count the create files on XferLOG.35.z I get *19491* files (15
 pool, 11 same, 0 skip, 0 delete).
 /usr/share/backuppc/bin/BackupPC_zcat \
 /var/lib/backuppc/pc/portatil/XferLOG.35.z | grep -c "^  create"
 
 Don't know where to go from here.
 Really appreciate any help.
 
 EDIT: I look better to the log and its almost all directories
 
 If I exclude directories, I have 10 create (9 regular, 1 p), 15 pool (10
 reg, 3 c, 2 l), 11 same. Create files are logs and pid, as expected.
 
 /usr/share/backuppc/bin/BackupPC_zcat \
 /var/lib/backuppc/pc/portatil/XferLOG.31.z | grep "^  create" | grep -v "^  create d"
 
 Did some tests with other log files and it looks reasonable. Although it
 don't show what directories were created, its a start. How it relates with
 backups file values beats me.
 
 Any one as a better solution, please? Thanks.

The count includes not only directories but also the attrib files, one in
each directory.
I believe your count (19491 total files, 10 regular files) can't be
correct: the number of regular files must be odd, because the combined
count of directories and attrib files is even.

You can do an ls -R /var/lib/backuppc/pc/yourhost/backupnumber to check.
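As a sketch of that filtering idea, directories (and with them the per-directory attrib files) can be dropped from the "create" lines of a decompressed XferLOG. This is demonstrated on canned log lines; the sample file name and field values are made up, and the field layout is abbreviated from the excerpts in this thread:

```shell
# Canned sample of decompressed XferLOG lines ("create d" = directory).
cat > /tmp/xferlog_sample.txt <<'EOF'
  create d 755 500/500      0 somedir
  create   644 500/500   1024 somedir/file1.txt
  create   644 500/500   2048 somedir/file2.txt
  pool     644 500/500    512 somedir/file3.txt
EOF
# Count "create" entries that are not directories:
grep '^  create ' /tmp/xferlog_sample.txt | grep -cv ' create d '
```

On a real system you would pipe BackupPC_zcat XferLOG.NN.z into the same filter instead of reading a sample file.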

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Check-if-alive-pings alternatives

2010-02-18 Thread Matthias Meyer
Sorin Srbu wrote:

-Original Message-
From: Matthias Meyer [mailto:matthias.me...@gmx.li]
Sent: Thursday, February 11, 2010 11:35 PM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Check-if-alive-pings alternatives

Sorin Srbu wrote:
 Short of making the router visible on the network for pings, is there
 any other way to circumvent this problem? Maybe connecting to the
 ssh-port or something? Ideas and pointers are greatly appreciated!

BackupPC doesn't have to use ping; the check is configurable via
$Conf{PingCmd}.
My clients start an ssh tunnel to my server, and my PingCmd checks for the
established connection with netstat.
 
 Do you use any particular switches with that then?
 
Yes, within my ping-command.
br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] BackupPC not cleaning out cpool.

2010-02-14 Thread Matthias Meyer
Joseph Holland wrote:

 # du -xsmh /var/data/backuppc/
 544G/var/data/backuppc/
 
 
 Joe.
 
 On 11/02/2010 22:45, Matthias Meyer wrote:
 Joseph Holland wrote:


 We are having a problem with many of our BackupPC servers in our company
 at the moment.  We are running Debian Lenny and BackupPC 3.1.0.  The
 data volumes since we upgraded to this version have seemed to just be
 filling up at a constant rate.

 The web interface is saying that the pool is 136.98GB but when you do a
 du -smh on the topdir /var/data/backuppc it returns 543GB used.
  
 Try du -xsmh

 br
 Matthias

 
Strange indeed.
du delivers a rough estimate, not an exact value. But we are talking about
140GB vs. 540GB.

1) Do du and the GUI use the same directory? I have /var/lib/backuppc,
not /var/data/backuppc.

2) Are there extra files of yours in /var/data/backuppc? I don't know how
the GUI calculates the space usage.

Sorry
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Check-if-alive-pings alternatives

2010-02-11 Thread Matthias Meyer
Sorin Srbu wrote:

 Hi all,
 
 I have a server that sits behind a router (server is NAT:ed) and allows
 ssh connections in. That is to say, the *only* thing it allows in is ssh
 connections.
 
 Now, BackupPC uses pings to check if the machine to be backed up is alive.
 Since the router in question doesn’t respond to any pings, it's in a
 pseudo-stealth mode, then BackupPC thinks the machine is down and doesn't
 initiate any backups even though the machine is actually alive and
 responding otherwise.
 
 Short of making the router visible on the network for pings, is there any
 other way to circumvent this problem? Maybe connecting to the ssh-port or
 something? Ideas and pointers are greatly appreciated!

BackupPC doesn't have to use ping; the check is configurable via
$Conf{PingCmd}.
My clients start an ssh tunnel to my server, and my PingCmd checks for the
established connection with netstat.
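A sketch of such a netstat-based aliveness check, demonstrated on a canned netstat line; the port 9035, the script name tunnel-alive, and the sample output line are assumptions, not from the original post:

```shell
# One ESTABLISHED line on the tunnel port means the client is reachable.
sample='tcp 0 0 127.0.0.1:9035 127.0.0.1:52114 ESTABLISHED'
echo "$sample" | grep -q '127\.0\.0\.1:9035 .*ESTABLISHED' && echo alive
# Wrapped in a script, this could be hooked in via something like:
#   $Conf{PingCmd} = '/usr/local/bin/tunnel-alive';
# where tunnel-alive runs: netstat -tn | grep -q ':9035 .*ESTABLISHED'
```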

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] BackupPC not cleaning out cpool.

2010-02-11 Thread Matthias Meyer
Joseph Holland wrote:

 We are having a problem with many of our BackupPC servers in our company
 at the moment.  We are running Debian Lenny and BackupPC 3.1.0.  The
 data volumes since we upgraded to this version have seemed to just be
 filling up at a constant rate.
 
 The web interface is saying that the pool is 136.98GB but when you do a
 du -smh on the topdir /var/data/backuppc it returns 543GB used.

Try du -xsmh
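For reference, the -x flag restricts du to the filesystem of its argument, so other mounts (or bind mounts) under the tree are not counted; without it, du descends into them and can report far more than the pool itself uses. A minimal invocation on a throwaway directory (paths and sizes are illustrative):

```shell
mkdir -p /tmp/du_demo
# Give du something to measure: a 3MB file.
dd if=/dev/zero of=/tmp/du_demo/blob bs=1M count=3 status=none
# -x: stay on one filesystem; -s: summary only; -m: report in megabytes
du -xsm /tmp/du_demo | awk '{print $1}'
```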

br
Matthias
-- 
Don't Panic




[BackupPC-users] rsyncd restore issue

2010-02-07 Thread Matthias Meyer
Hello,

I have two files (both Access DBs), one of 2MB and the other of 200MB.
Both are backed up with BackupPC 3.1.0 using rsyncd from my Windows PC.
If I delete the 200MB file on my Windows PC, its restore needs only a few
minutes.
If I restore both, only the first one (2MB) is restored.
After hours I aborted the restore process.

I've also tried to chmod ugo=rwx the files before a restore, but also
without success.

I've tried loglevel=9 and checked the RestoreLOG.1.z (208MB).
Unfortunately I didn't find any error messages.

Any hint on how to find the reason?

Thanks in advance
Matthias
-- 
Don't Panic




Re: [BackupPC-users] where the files come from, for rsync comparison?

2010-02-03 Thread Matthias Meyer
Craig Barratt wrote:

 Matthias writes:
 
 I believe that RsyncFileIO.pm decompress a file into share/RStmp and
 than compare it (rsync) with the file from the client.
 
 Yes, but only if the file is bigger than 16MB (otherwise
 it just does it in memory) and delays doing it until it
 knows there is a difference in the file.
 
 But I don't understand why lsof sometimes list a file from cpool and
 sometimes a file from an already stored backup:
 
 BackupPC_ 21822 backuppc6r   REG  254,1 16968527 13049275
 /var/lib/backuppc/cpool/5/1/1/5114e50f30d80556ce4c51cc1cdd7fa7
 BackupPC_ 21822 backuppc9u   REG  254,1 40595312  6493166
 /var/lib/backuppc/pc/st-srv-xp/new/fPROGRAMS/RStmp
 
 BackupPC_ 21822 backuppc6r   REG  254,1 188638976 60093825
 /var/lib/backuppc/pc/st-srv-xp/288/fPROGRAMS/fmicrotech/fAusgabeverzeichnisHis.MBD
 BackupPC_ 21822 backuppc7r   REG  254,1 190840832  6493151
 /var/lib/backuppc/pc/st-srv-xp/new/fPROGRAMS/RStmp
 
 I think what you are seeing is two different steps: the latter is
 the file being decompressed into RStmp (although I would expect the
 FD status to be 7u, not 7r; not sure why).  The second step is
 probably comparing the RStmp file to a candidate file in the pool.
 
 Craig
 
Thanks
Matthias
-- 
Don't Panic




[BackupPC-users] where the files come from, for rsync comparison?

2010-02-01 Thread Matthias Meyer
I believe that RsyncFileIO.pm decompresses a file into share/RStmp and then
compares it (via the rsync algorithm) with the file from the client.
But I don't understand why lsof sometimes lists a file from the cpool and
sometimes a file from an already stored backup:

BackupPC_ 21822 backuppc6r   REG  254,1 16968527 13049275 
/var/lib/backuppc/cpool/5/1/1/5114e50f30d80556ce4c51cc1cdd7fa7
BackupPC_ 21822 backuppc9u   REG  254,1 40595312  6493166 
/var/lib/backuppc/pc/st-srv-xp/new/fPROGRAMS/RStmp

BackupPC_ 21822 backuppc6r   REG  254,1 188638976 60093825 
/var/lib/backuppc/pc/st-srv-xp/288/fPROGRAMS/fmicrotech/fAusgabeverzeichnisHis.MBD
BackupPC_ 21822 backuppc7r   REG  254,1 190840832  6493151 
/var/lib/backuppc/pc/st-srv-xp/new/fPROGRAMS/RStmp

Is that by accident or by algorithm?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] md4 doesn't match - how to solve?

2010-02-01 Thread Matthias Meyer
Michael Stowe wrote:

 Hello,

 I'm wondering about not finishing a backup from a windows client the last
 couple of days.

 The XferLOG stats:
 :
   create   770544/18  196608
 microtech/Daten/NT/Mand.MESSE-2009/DTAHistory.MBD
 microtech/Daten/NT/Mand.MESSE-2009/Dokumente.MBD: md4 doesn't match: will
 retry in phase 1; file removed
   create   770544/18  196608
 microtech/Daten/NT/Mand.MESSE-2009/Einsatz.MBD
 :
   same 700  1003/513   69912
 nexcelent.bpxchange_old2/bpXchange.sfs
   create d 770544/18   0 xerox
   create d 770544/18   0 xerox/nwwia
 Read EOF:
 Tried again: got 0 bytes
 finish: removing in-process file
 microtech/Daten/NT/Mand.MESSE-2009/Dokumente.MBD
 Child is aborting
 Done: 15109 files, 25829421376 bytes

 Unfortunately the internet connection of my client interrupts each night
 :-(
 In this case obviously during the retry in phase 1?

 What the reason for the md4 doesn't match could be?
 
 Most likely, the file is being modified.
 
 Can I help BackupPC to find the correct md4?
 
 No.
 
 e.g. delete the file from backup or exclude it for one full run?
 
 You might need to exclude it, or use a VSS technique to guarantee
 consistency.

I use VSS. So I will try to exclude it.

Thanks
Matthias
-- 
Don't Panic




Re: [BackupPC-users] autossh issues

2010-01-31 Thread Matthias Meyer
Trey Nolen wrote:

 We've been using Backuppc for years to backup machines over the WAN.  It
 has worked nearly flawlessly for us in the past, but we are starting to
 see issues with new machines that we bring online.  I think the trouble
 may have to do with the recent updates to Cygwin.
 
 When backing up remote Windows machines over a WAN, we setup a
 persistent tunnel using autossh and then backup to localhost on some
 port which is redirected to rsync on the remote server.   Here is a
 typical example:
 /usr/lib/autossh/autossh -2 -N -M 38022 -L 9035:localhost:873 -C
 administra...@remotehost.domain.com 
 
 In our config.pl, we will have some lines like these:
 $Conf{XferMethod}='rsyncd';
 $Conf{ClientNameAlias} = 'localhost';
 $Conf{RsyncdClientPort} = 9035;
 
 
 
 Now, our issue lately is that the tunnels always seem to be dropping.
 We get errors like this:
 Warning: remote port forwarding failed for listen port 38022
 
 That invariably restarts the ssh tunnel, which drops any active rsync.
 We've tried with other commands like this:
 /usr/lib/autossh/autossh -o ServerAliveInterval 59 -o
 ServerAliveCountMax 20 -2 -N -M 0 -L 9035:localhost:873 -C
 administra...@remotehost.domain.com 
 
 
 But these fail as well.   We are also seeing a build up of 10's to 100's
 of sshd.exe and rsync.exe processes on the remote machines which is
 bringing them to a crawl.
 
 Does anyone have any ideas of ways we can change what we are doing with
 the new Cygwin that will help?   Is there a way to install the old
 Cygwin?
 
 One other thing: we have been replacing the rsync.exe that comes with
 the new Cygwin with an older one that uses protocol 28.  We did that
 because 1) that is the highest protocol the rsync perl module supports,
 and 2) the newer Cygwin rsync seems to leave .pid files laying around
 when the server is rebooted, and subsequently won't restart until you
 delete the .pid.
 
 Thanks.
 
 Trey Nolen

Try netstat --tcp -pl --numeric-ports | grep 38022 on your BackupPC
server.
If there is output like:
tcp0  0 127.0.0.1:38022 0.0.0.0:*   LISTEN 
32284/sshd: backupp
tcp6   0  0 ::1:38022   :::*LISTEN 
32284/sshd: backupp

then just kill process 32284 and your client should reconnect immediately.
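Extracting the PID can be done mechanically; a sketch on a canned netstat line (the PID 32284 and port 38022 are the example values from this reply):

```shell
line='tcp    0      0 127.0.0.1:38022 0.0.0.0:*   LISTEN 32284/sshd: backupp'
# Pull the PID out of the "PID/program" column:
pid=$(echo "$line" | sed -n 's/.*LISTEN[[:space:]]*\([0-9][0-9]*\)\/sshd.*/\1/p')
echo "$pid"
# Then: kill "$pid" -- autossh on the client re-establishes the tunnel.
```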

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Bakuppc shutdown clinets

2010-01-31 Thread Matthias Meyer
egrimisu wrote:

 
 hi and thanks for the repley, unfortunetly that does not work. i put the
 command there and after the pc's backup the pc does not shut down.
 
 http://img177.imageshack.us/img177/65/66670236.jpg
 

DumpPostUserCmd is executed on your BackupPC server.
Try /sbin/halt and your server will shut down after the backup is done.
Probably not what you want ;-)

You have to establish a way to execute commands on your client PC.
I recommend ssh, but there are other possibilities too (e.g. winexe).
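A config.pl sketch of the ssh approach, assuming passwordless ssh from the backuppc user to an admin account on the client ($sshPath and $host are BackupPC's standard command substitutions; the account names and shutdown flags are illustrative):

```perl
# Runs on the server after the dump, reaches over to the client.
# For a Linux client:
$Conf{DumpPostUserCmd} = '$sshPath -q -x -l root $host /sbin/halt';
# For a Windows client with Cygwin sshd, something like:
#   $Conf{DumpPostUserCmd} = '$sshPath -q -x -l admin $host shutdown /s /f /t 0';
```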

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Backuppc - not backing up files, only folder structure

2010-01-26 Thread Matthias Meyer
stratisphere wrote:

 
 We've been using backup pc for quite some time, using the old version
 without issues. After about 6 months ago and some hardware issues, we
 decided to upgrade.
 
 Yesterday I built a box up running backuppc on debian and used apt-get to
 install the lot... nice and simple. I've also spent time configuring the
 config.pl file to our needs.
 
 It all backs up, however it doesnt backup files, only the folder
 structure. At the moment i'm only testing it with one host which is a
 Windows 2008 server. If needed I can test it with something else?
 
 (P.s. i've had a quick search of the net/forums/mailing list but cant find
 anything... if there is anything which is relevant to this issue, i'd be
 greatful for a link!)
 
 Thanks,
 Adrian
 

Which transfer method do you use (smb, rsync, ...)?
What is the configuration of your shares, includes as well as excludes?
Did you get any hints from the XferLOG?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Comments on this backup plan please

2010-01-26 Thread Matthias Meyer
PD Support wrote:

 We are going to be backing up around 30 MS-SQL server databases via ADSL
 to a number of regional servers running CentOS (about 6 databases per
 backup server). 10 sites are 'live' as of now and this is how we have
 started...
 
 The backups are between about 800MB and 5GB at the moment and are made as
 follows:
 
 1) A stored procedure dumps the database to SiteName_DayofWeek.bak eg:
 SHR_Mon.bak
 
 2) We create a local ZIP copy eg: !BSHR_Mon.zip. The !B means the file is
 EXCLUDED from backing up and is just kept as a local copy, cycled on a
 weekly basis.
 
 3) We rename SHR_DayofWeek.bak to SiteName.bak
 
 4) We split the .bak file into 200MB parts (.part1 .part2 etc.) and these
 are synced to the backup server via backuppc
 
 This gives us a generically-named daily backup that we sync
 (backupPC/rsyncd) up to the backup server nightly.
 
 We split the files so that if there is a comms glitch during the backing
 up of the large database file and we end up with a part backup, the next
 triggering of the backup doesn't have to start the large file again - only
 the missing/incomplete bits.

Adding --partial to $Conf{RsyncArgs} should do this job.
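In config.pl that looks roughly like this (a sketch; the surrounding argument list is abbreviated and your existing $Conf{RsyncArgs} will differ):

```perl
$Conf{RsyncArgs} = [
    # ... your existing rsync arguments ...
    '--partial',   # keep partially transferred files so a retry can resume
];
```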

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] unable to restore via zip archives

2010-01-19 Thread Matthias Meyer
Johan Cwiklinski wrote:

 Hello,
 
 Le 02/10/2009 18:15, Boyko Yordanov - Exsisto Ltd. a écrit :
 Hello list!

 I've been playing w/ BackupPC version 3.2.0beta0 for a couple of days
 and I am facing the following issue:

 I am unable to restore files via zip archives with compression (1-9).
 The web interface prompts me to save the file, but then the file is
 usually just a few kilobytes in size (no matter how many files I am
 selecting for restoration) and I am unable to extract anything from
 it. I am able to download a zero compression zip archive, but
 nothing if I set the compression to higher values. The other restore
 methods are working fine. I have all the necessary perl modules
 installed including Archive::Zip and the ones it depends on. There are
 no errors in the BackupPC log file, actually it just logs a normal
 entry like I've successfully downloaded the zip archive - but I did
 not.. at least the file I get seems somehow broken.

 Could it be a bug or a known issue? Anyone noticing the same behavior
 w/ his BackupPC setup? I have two boxes running this version of
 BackupPC - I can't use zip archives for restoration on both of them,
 its just not working for me.

 Thanks!

 Boyko
   

I'm running BackupPC on Debian and I can restore via zip files with
compression 9.
Did you also try a restore via a .tar file?
Can you restore your files with a normal restore via your transfer
method?

br
Matthias

-- 
Don't Panic




[BackupPC-users] where does BackupPC_zipCreate write the file before transmission

2009-12-23 Thread Matthias Meyer
Hi,

I assume BackupPC_zipCreate reads the files from the numbered dump and
writes them locally into a .zip file, and that this local .zip file is then
transferred to the destination.
I can't find out where this local .zip file is located.
Is my assumption wrong?

Or does BackupPC_zipCreate write the .zip to stdout, so that it is
transferred by BackupPC directly?
In that case, is it possible to run two different BackupPC_zipCreate
processes in parallel?

Thanks
Matthias
-- 
Don't Panic




Re: [BackupPC-users] ssh don't work to backup localhost

2009-12-23 Thread Matthias Meyer
Claude Gélinas wrote:

 I'm trying to setup the backup of the localhost with backuppc. I already
 backup several other linux machine via ssh. I've setuped all them via
 running the following command as backuppc user:
 
 ssh-keygen -t dsa
 cd .ssh
 ssh-copy-id -i id_dsa.pub r...@oligoextra.phyto.qc.ca

I believe you must run:
ssh-copy-id -i id_dsa.pub backu...@oligoextra.phyto.qc.ca
because you need the public key in /var/lib/backuppc/.ssh/authorized_keys
and not in /root/.ssh/authorized_keys.

cat id_dsa.pub >> /var/lib/backuppc/.ssh/authorized_keys
should also do the job.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rsyncd via ssh-redirected port

2009-12-21 Thread Matthias Meyer
Guido Schmidt wrote:

 Matthias Meyer wrote:
 Guido Schmidt wrote:
 What works? The opening and closing of the tunnel.
 What does not? The connection to it. Nothing in the rsyncd-logs on
 host.example.com.

 If I leave DumpPostUserCmd empty the tunnel stays open and I can use it
 with rsync as user backuppc on a shell providing the password by hand:

   rsync -av --list-only --port=32323 backu...@localhost::Alles
   /home/backuppc/test/

 Do you provide the password during your script?
 
 The ssh-connection works (authenticated via public key). The password I
 refered to is for connecting to rsyncd and that is stored in
 $Conf{RsyncdPasswd}.
 
 It seems that backuppc does not reach the point where it actually tries
 to connect to rsync daemon. There are no entries in the rsyncd-log
 (there are when I use the rsync-command above). How can I find out more
 what happens and what not?
 
I don't really know what the problem is :-(
You can increase the log level with $Conf{XferLogLevel}.
What happens if you start your tunnel interactively and leave DumpPreUserCmd
as well as DumpPostUserCmd empty?
Then try interactively:
  rsync -av --list-only --port=32323 backu...@localhost::Alles 
/home/backuppc/test/

If it works, start a backup via BackupPC.

Why do you need the authentication by rsync? I would believe you can trust
your ssh tunnel and don't need an additional authentication.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] SMB Restore Issues - Trailing slashes reversed

2009-12-20 Thread Matthias Meyer
Craig Connoll wrote:

 Hi all.
 
 I have backuppc installed and working as I want it too but I have an
 issue when trying to restore a windows backup.
 
 All permissions are correct and no failures except when restoring to a
 window machines.
 
 I think the problem is the trailing slashes. They are forward slashed
 instead of back slashed which windows uses.
 
 Original file/dir
 172.16.10.222:/c$/Program Files/prog1/DataRetrieval 22_03_07.zip
 
 Will be restored to
 172.16.10.222:/c$/Program Files/prog1/DataRetrieval 22_03_07.zip
 
 
 Now when I try to change the slashes during the restore procedure it
 produces this:
 172.16.10.222:/c$/\Program Files\AES Energy Tracker/DataRetrieval
 22_03_07.zip
 
 Is there some way to fix this?
 Any help will be much appreciated
 
 Regards,
 
 Crashinit6

Do you use smb, tar, or rsync?
What exactly is your problem on the Windows machine?
Are the files restored or not?
Are they in the correct directories or not?
How do you see that the permissions are correct?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] rsyncd via ssh-redirected port

2009-12-19 Thread Matthias Meyer
Guido Schmidt wrote:

 Dear backuppc-users,
 
 I'm happily using BackupPC 3.1.0 for quite a while.
 
 I'm now trying to backup a public host (host.example.com) via an
 ssh-redirected port. I don't allow any command execution on that host (and
 therefore cannot use the wait command), so I wrote a script
 (sshtunnelcontrol, see below) to open and close the tunnel when needed. It
 is called as DumpPreUserCmd and DumpPostUserCmd.
 
 What works? The opening and closing of the tunnel.
 What does not? The connection to it. Nothing in the rsyncd-logs on
 host.example.com.
 
 If I leave DumpPostUserCmd empty the tunnel stays open and I can use it
 with rsync as user backuppc on a shell providing the password by hand:
 
   rsync -av --list-only --port=32323 backu...@localhost::Alles
   /home/backuppc/test/
 
Do you provide the password in your script?
I don't know how BackupPC would know the password for the ssh connection.
I believe $Conf{RsyncdUserName} and $Conf{RsyncdPasswd} refer to the rsync
secrets and not to the ssh connection.

I would suggest using public/private keys for ssh.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Cannot start backuppc

2009-12-13 Thread Matthias Meyer
Robert J. Phillips wrote:

 My raid drive failed that stores all the data.  I have fixed this
 problem (rebuilt the raid and had to re-install the xfs file system).
 All the data is lost that was on the array.
 
  
 
 I am running the Beta 3.2.0 version of backuppc and I ran sudo perl
 configure.pl to let it rebuild the data folders.  When I try to start
 backuppc I get an error that it cannot create a test hardlink between a
 file in /mnt/backup/pc and /mnt/backup/cpool.
 
  
 
 How do I fix it??
It is a problem with your filesystem. BackupPC must be able to make
hardlinks between these two directories.
Are you sure the two directories are on the same drive/volume?
Try to make a hardlink between the two directories by hand.
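A quick by-hand check, shown here on throwaway paths (substitute your real /mnt/backup/pc and /mnt/backup/cpool directories):

```shell
rm -rf /tmp/hl_test
mkdir -p /tmp/hl_test/cpool /tmp/hl_test/pc
echo probe > /tmp/hl_test/cpool/probe
# ln without -s makes a hardlink; it fails if the two directories sit on
# different filesystems, which is what BackupPC's startup test runs into.
ln /tmp/hl_test/cpool/probe /tmp/hl_test/pc/probe && echo "hardlink OK"
stat -c %h /tmp/hl_test/cpool/probe   # link count; 2 means the link worked
```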

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] convert from rsync to rsyncd

2009-12-08 Thread Matthias Meyer
Andy Thompson wrote:

 I want to convert a host from rsync to rsyncd in order to test speed
 differences between the two.  My concern though is that currently with
 rsync my share name is /, which don't believe is valid using rsyncd.  If I
 do this, will it duplicate all of the data for this server since they are
 different share/path names?  It's a couple hundred gig of data and I don't
 really want it duplicated for any amount of time
 
 thanks
 
 -andy
Don't worry. BackupPC doesn't store duplicate data. It checks each file, and
if it is identical to another file already stored (by the same or another
PC), it only creates a hardlink.
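The pooling idea can be illustrated with scratch files (this is not BackupPC's real pool layout, just the mechanism): identical content is stored once, and every backup that contains it merely adds another hardlink, so a renamed share does not duplicate the data.

```shell
d=$(mktemp -d)
echo "same content" > "$d/pool_file"
ln "$d/pool_file" "$d/host1_copy"   # first backup's reference
ln "$d/pool_file" "$d/host2_copy"   # second backup's reference
links=$(stat -c %h "$d/pool_file")  # GNU stat: hardlink count of the inode
echo "one inode, $links names"
rm -rf "$d"
```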

br
Matthias
-- 
Don't Panic




[BackupPC-users] New release of BackupPC_deleteBackup - who can put it into the wiki?

2009-12-05 Thread Matthias Meyer
Hi,

I have a new release of the BackupPC_deleteBackup script.
Unfortunately I can't put it into the wiki
(http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups).
Jeffrey would do that but I didn't reach him via email :-(

Anybody else here would put it into the wiki?

Thanks
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Bakuppc shutdown clinets

2009-11-28 Thread Matthias Meyer
egrimisu wrote:

 
 Thanks Matthias,
 
 I looked over it but i don't know how to call these commands, the only
 thing that i figured out is that the commands are:
 
 if called from windows
 shutdown /s /f /m \\clientcomputername
 
 and called from linux:
 net rpc shutdown -t 10 -f -C "Shut Down message" -W DOMAINNAME -U
 adminusername%adminpass -I 192.168.0.x
 
 I hope you can help further. Thanks again
 
 
 
I use cygwin/ssh:

From the BackupPC box you can call: ssh user@host "shutdown /s /f /t 0"
If you already have a script running on your client (see
$Conf{DumpPostUserCmd}), you can easily add shutdown /s /f /t 0 at the
end of that script.
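A minimal sketch of the second option; the script path and its existing contents are made up here, only the appended shutdown line comes from the advice above:

```shell
# Hypothetical client-side post-dump script (path is illustrative).
script=$(mktemp)
printf '#!/bin/sh\n# ... existing post-dump work ...\n' > "$script"
# Append the shutdown as the final step, as suggested above.
echo 'shutdown /s /f /t 0' >> "$script"
tail -n 1 "$script"
```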

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] removing an old backup

2009-11-19 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 There is a bash script written by Matthias Meyer (and available by
 googling on the web).
 I think called BackupPC_deleteBackup (or maybe that's just what I
 called it ;)
 

Hi Jeffrey,

I can't answer your mail:

   - Transcript of session follows -
... while talking to smtp.secureserver.net.:
 DATA
 550 5.7.1 SPF unauthorized mail is prohibited.
554 5.0.0 Service unavailable

br
Matthias
-- 
Don't Panic




[BackupPC-users] Suggestion for improvement of BackupPC_dump

2009-11-18 Thread Matthias Meyer
Problem:

It isn't unusual for internet connections to abort once a day; a lot of
providers do that. Therefore it isn't unusual either that a full backup
aborts and is saved as a partial backup. If the next full backup aborts
too, it will only be saved if it contains more files than the previous
partial.
If a big file is added on the client after a partial backup is stored,
then the transfer of this new file during the next full run may need a
lot of time. If this full backup aborts too, it may contain fewer files
than the previous backup. It will then be discarded, and this big file
has to be retransmitted during the next full run.

On average I can transmit 6GB per day, depending on the internet upload
bandwidth of the client. So it isn't unlikely that I will never back up
this client because of the above problem.
Please don't suggest $Conf{PartialAgeMax}. That would turn a reliable
backup into a game of luck ;-) (I know that's not really true, but I don't
believe it would be a reliable solution.)

Solution:

BackupPC_dump compares the file counts and decides whether the old partial
backup must be removed or not. If BackupPC_dump removes the old partial, it
renames the $TopDir/pc/$client/new directory to $TopDir/pc/$client/nnn
afterwards. Otherwise it only removes
$TopDir/pc/$client/new.

Instead, BackupPC_dump should always move $TopDir/pc/$client/new
over $TopDir/pc/$client/nnn, overwriting existing files and
creating new files in $TopDir/pc/$client/nnn. The NewFileList contains
all new files anyway, so BackupPC_link should do its job perfectly
too.
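The suggested merge can be sketched with scratch directories (the names nnn/, new/ and the file contents are illustrative; GNU cp is assumed): files re-transferred into new/ replace their older copies in the partial, and freshly added files are carried over.

```shell
base=$(mktemp -d)
mkdir "$base/nnn" "$base/new"
echo old     > "$base/nnn/a"; echo keep  > "$base/nnn/b"   # previous partial
echo updated > "$base/new/a"; echo added > "$base/new/c"   # new transfer
# GNU cp: hardlink-merge new/ into nnn/, overwriting existing entries
cp -rl --remove-destination "$base/new/." "$base/nnn/"
rm -rf "$base/new"
ls "$base/nnn"
```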

Advantage:

+ All transmitted files will be saved within the current backup,
  regardless of whether the full backup aborts after a short or a long
  time.
+ The continuation (strictly speaking, the 2nd or later continuation) of a
  partial backup does not have to retransmit files which were already
  transmitted.

Disadvantage:

None known.
Such a full backup can run for more than a few days, so it is not
certain which version of a changed file ends up in the backup. But the
duration of a [full] backup isn't really controllable anyway. So even
without the above improvement we don't know which version of a file is
within the backup.
e.g.: If we use snapshots and a file is deleted after the start of a
backup, it will still be stored within the backup. If we don't use
snapshots, that isn't certain; it depends on the time at which it was
deleted. If we use different shares, the above case isn't certain either.

In this respect there is no difference between the current BackupPC_dump
implementation and the above suggestion, just fewer retransmits.

What is your opinion about this?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Did backuppc/rsync end up finishing successfully?

2009-11-18 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 One of my recent backup logs had a single error as follows:
 ...
 create d 700  1005/513   0 spoolerlogs
 Remote[1]: rsync error: timeout in data send/receive (code 30) at
 /home/lapo/packaging/rsync-3.0.6-1/src/rsync-3.0.6/io.c(200)
 [sender=3.0.6] Read EOF:
 
 Then the log proceeds:
   Tried again: got 0 bytes
   delete   700  1005/513    1024 Documents and
   Settings\networking\ntuser.dat.LOG
   delete   700  1005/513  262144 Documents and
   Settings\networking\ntuser.dat
   Child is aborting
   Done: 969 files, 587612347 bytes
 
 
 So my questions are as follows:
 1. What does the Read EOF line mean and is it likely to be the cause
or effect (or unrelated) to the previous rsync timeout?

It is related to the previous rsync timeout.

 
 2. When the log says Tried again: got 0 bytes and proceeds to list
two more files before aborting, does that mean that the backup
resumed and completed successfully? If so, were any files likely to
be lost?

No, the backup was not completed. Maybe you have a partial backup at least.
Nevertheless, the two files listed are deleted (within your backup) because
their transmission was not finished.
 
 3. When it says Child is aborting, is that a normal completion or is
there an issue?
 

Your Windows client cancelled the connection. The reason can be the provider
or any other network problem, theoretically also a firewall or antivirus.
But until now I haven't seen that.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] removing an old backup

2009-11-18 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 There is a bash script written by Matthias Meyer (and available by
 googling on the web).
 I think called BackupPC_deleteBackup (or maybe that's just what I
 called it ;)
 
No, that name came from me :-)
I have an updated version of this script.
It includes your patch and can also remove an entire host.

Unfortunately I can not put it into the wiki:
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups
I don't know how :-(

Can I mail the script to someone who can put it into the wiki?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Concrete proposal for feature extension for pooling

2009-11-15 Thread Matthias Meyer
If I understand right (sorry for my English ;-)), this proposal would
support a backup of BackupPC itself to another disk/machine.
I believe this is an important gap today that should be closed.

Unfortunately I am not a Perl programmer, but I can help with testing.
I have a server as well as clients which can be used by me and others.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] TextFileWrite: Failed to verify /etc/BackupPC/config.pl.new

2009-11-10 Thread Matthias Meyer
Michael Mansour wrote:

 Hi,
 
 Since updating to the latest BackupPC version, I've never really had to
 make a mod to the config via the web interface.
 
 Anyway, I just tried to and got the error in the subject:
 
 TextFileWrite: Failed to verify /etc/BackupPC/config.pl.new
 
 Checking the /etc/BackupPC directory I have:
 
 -rw-r- 1 backuppc apache 78507 Jul 19 06:04 config.pl
 -rw-r- 1 backuppc apache 0 Nov 10 12:19 config.pl.new
 -rw-r- 1 backuppc apache 76236 Apr 29  2009 config.pl.old
 -rw-r- 1 backuppc apache 75640 Jul  4  2008 config.pl.pre-3.1.0
 -rw-r- 1 backuppc apache 76236 Jul 19 06:03 config.pl.pre-3.2.0beta0
 
 so it seems to create the file but can't verify it?
 
 Any ideas how I can fix this?
 
 Thanks.
 
 Michael.
 
I run V3.1.0 and have no config.pl.new. In any case, your config.pl.new
file is empty. Try removing it.
br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Compression Issue

2009-11-10 Thread Matthias Meyer
Heath Yob wrote:

 It appears that I'm not getting any compression on my backups at least
 with my Windows clients.
 I think my mac clients are being compressed since it's actually
 stating a compression level in the host summary.
 
 I have the compression level set to 9.
 
 I have the Compress::Zlib perl library installed.
 
 ppo-backup:/home/heathy# perl -MCompress::Zlib -e 'print "Module installed.\n";'
 Module installed.
 
 Is there a secret to SMB compression?
 
 Heath
 
I don't believe compression depends on the transport method.
Do you have files in /var/lib/backuppc/pool?
Check your configuration:
grep CompressLevel /etc/backuppc/*.pl

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] UPDATED: Fully automated script for creating shadow copies a

2009-11-10 Thread Matthias Meyer
Michael Stowe wrote:

 Michael Stowe wrote:
 Do your scripts run as a normal user which is in the admin group?
 My problem, which is inexplicable to me, is that the at command works
 from a Vista command box but not from an ssh session, even though both
 use the same account.

 Did you have to rebase cygwin within Windows 7?
 I tried cygwin within the release candidate but it wasn't stable. The
 cygwin newsgroup told me I have to rebase everything within Windows 7.

 br
 Matthias
 --
 Don't Panic
 
 Yes, a normal Admin user -- I don't use ssh at all, nor a full version of
 cygwin.
 
 It's documented here:  http://www.goodjobsucking.com/?p=62
 
 (There's a download link in there somewhere, if you're interested, but
 you'll need your own version of vshadow.exe.)
 
Thanks
Matthias
-- 
Don't Panic




Re: [BackupPC-users] How to reenable/reactivate a host for backup

2009-11-09 Thread Matthias Meyer
ckandreou wrote:

 
 In the host summary I get the following:
 Host User #Full Full Age (days) Full Size (GB) Speed (MB/s) #Incr Incr Age
 (days) Last Backup (days) State Last attempt emsportal backuppc 1 529.9
 14.14 3.38 1 30.9 30.9 auto disabled
 
 
 How do I reenable host to be backed up?
 

In the GUI: select the host, then Edit Config -> Schedule,
or directly within your host's config file (/etc/backuppc/host.pl):

$Conf{BackupsDisable} = 0;

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] UPDATED: Fully automated script for creating shadow copies a

2009-11-09 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 Matthias Meyer wrote at about 14:56:02 +0100 on Sunday, November 8, 2009:
   Hello Jeffrey,
   
   Did your script work in Vista too?
 I have not tried it under Vista, but I would like to get it to work
 there. One potential issue could be that shadow mounts (and
 vshadow.exe) might have a different interface. Feel free to look into
 that and if you get back to me on differences, I will try to fix the
 code. Also, make sure you have the right version of vshadow for Vista.
I have the vista version of vshadow.
 
 
   Within a ssh-session I try at 14:39 attrib in Vista but I get an
   access denied.
 
 I'm not sure what you are doing here. Are you trying to test whether
 'at' works from an ssh session? I see no reason why it shouldn't. But
 running attrib wouldn't do much unless you piped the output
 elsewhere. Check to make sure your path and permissions are right.
This is only a test. at 14:39 attrib > test.txt would be better.
 
   If I try it from a vista command box it will work.
   In both, ssh-session as well as vista command box, I use the same user.
   
 
 If you are interested, I would be happy to send you updated versions of my
 scripts...
Thanks, but I don't have a full cygwin installation. Some programs you need
(e.g. dirname) are not installed on my windoze.
I'm just trying to steal your ideas ;-)

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] UPDATED: Fully automated script for creating shadow copies a

2009-11-09 Thread Matthias Meyer
Michael Stowe wrote:

 Matthias Meyer wrote at about 14:56:02 +0100 on Sunday, November 8, 2009:
   Hello Jeffrey,
  
   Did your script work in Vista too?
 I have not tried it under Vista, but I would like to get it to work
 there. One potential issue could be that shadow mounts (and
 vshadow.exe) might have a different interface. Feel free to look into
 that and if you get back to me on differences, I will try to fix the
 code. Also, make sure you have the right version of vshadow for Vista.
 
 My scripts are quite different, but they do work under Vista and Windows 7
 without changes; a different version of vshadow is required, but that's
 the extent of the changes required.  I have to imagine the same would be
 the case for your scripts.
 
Do your scripts run as a normal user which is in the admin group?
My problem, which is inexplicable to me, is that the at command works
from a Vista command box but not from an ssh session, even though both use
the same account.

Did you have to rebase cygwin within Windows 7?
I tried cygwin within the release candidate but it wasn't stable. The cygwin
newsgroup told me I have to rebase everything within Windows 7.

br
Matthias
-- 
Don't Panic



