[BackupPC-users] Problem with backuppc

2009-11-18 Thread KOUAO aketchi
Hello,

We have a server running Debian Sarge with BackupPC. Everything is working,
but our storage utilization now exceeds 95%. We would also like to back up the
PC workstations to an external USB disk with BackupPC. The problem is how to
configure the per-PC config file so that all the backup data for those machines
goes onto this external disk. The disks in the server would then be used only
for backing up the servers, in order to reduce this percentage.
Thanks a lot for your help




  --
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day 
trial. Simplify your report design, integration and deployment - and focus on 
what you do best, core application coding. Discover what's new with
Crystal Reports now.  http://p.sf.net/sfu/bobj-july___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with backuppc

2009-11-18 Thread Kameleon
I am sure there are others that will chime in on this but as I see it you
have a few options.

1. Set up LVM and use the external disk as a permanent addition to the system.
2. Mount the external disk as the directory that will house your desktop
backups.

Honestly, I would be wary about using an external USB disk. A lot of them
have power-saving features that power the disk down after a short period,
which could cause issues with your backups. I would invest in another internal
drive, or even mount a drive in a separate machine via NFS or iSCSI.
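For option 2, a minimal sketch of what the mount might look like (the device
name /dev/sdb1 and the Debian data directory /var/lib/backuppc are assumptions;
check your own setup):

```shell
# Stop BackupPC before changing its data directory.
/etc/init.d/backuppc stop

# One-off mount of the external disk over the BackupPC data directory:
mount /dev/sdb1 /var/lib/backuppc

# To make it permanent, add a line like this to /etc/fstab
# (using the UUID avoids problems when USB device names change):
# UUID=1234-abcd  /var/lib/backuppc  ext3  defaults,noatime  0  2

/etc/init.d/backuppc start
```

Note that BackupPC pools all hosts' data with hardlinks on one filesystem, so
mounting the disk over only a per-host subdirectory would break pooling.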







Re: [BackupPC-users] Problem with backuppc

2009-11-18 Thread Tino Schwarze
On Wed, Nov 18, 2009 at 07:00:09AM -0600, Kameleon wrote:

 I am sure there are others that will chime in on this but as I see it you
 have a few options.
 
 1. Setup LVM and use the external disk as a permanent addition to the system
 2. Mount the external disk as the directory that will house your desktops
 backups
 
 Honestly, I would be wary about using an external USB disk. Alot of them
 have power saving features that will power it down after a short period and
 could cause issues with your backups. I would invest in another internal
 drive or even mount via NFS or iSCSI another drive in a separate machine.

You might want to consider at least some RAID as well. Otherwise, if one
of your disks breaks, the LVM is broken and _all_ your backups are gone.

To answer the original question: it is not currently possible to have
BackupPC use different storage locations. All the data has to reside on one
file system. How to create a file system spanning multiple disks/RAIDs,
and how to manage it, is an operating-system issue.
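As a hedged sketch of the kind of OS-level setup meant here (device names,
the mount point, and the ext3 choice are assumptions, not from the thread):

```shell
# Mirror two disks with mdadm so a single disk failure doesn't lose the pool,
# then put an LVM volume on top and mount it as the BackupPC data directory.
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1
pvcreate /dev/md0
vgcreate backupvg /dev/md0
lvcreate -l 100%FREE -n pool backupvg
mkfs.ext3 /dev/backupvg/pool
mount /dev/backupvg/pool /var/lib/backuppc
```

LVM also lets you grow later: add another RAID set with vgextend, then
lvextend and resize2fs, without ever moving the pool off one filesystem.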

HTH,

Tino.

-- 
What we nourish flourishes. - Was wir nähren erblüht.

www.lichtkreis-chemnitz.de
www.tisc.de



[BackupPC-users] Multi Profiles

2009-11-18 Thread Steven J. Reilly
Hi,

Just started using backuppc and it is a great tool.

I was wondering if it is possible to have multiple profiles for the one
host.

For example, you might want to back up /home daily but only back up the
system files (/etc, /boot, etc.) on a weekly basis.

This sounds like I would need two hostname.pl files, but how would that
be handled?

Regards,

Steve

-- 

=
Steven J. Reilly, EDA Engineer
Allegro Microsystems Europe Ltd
Stuart House, Eskmills Park,
Musselburgh, EH21 7PB, Scotland.
Tel:+44 (0)131 273 4306
Fax:+44 (0)131 273 4301
e-mail: srei...@allegromicro.com

Any views expressed by me in this mail are mine ... all mine!



Re: [BackupPC-users] Multi Profiles

2009-11-18 Thread Tony Molloy
On Wednesday 18 November 2009 17:50:40 Steven J. Reilly wrote:
 Hi,

 Just started using backuppc and it is a great tool.

 I was wondering if it is possible to have multiple profiles for the one
 host.


No problem. I have tens of profiles for a particular file server, one for
each user, so that I can allow individual users to do their own restores.

 For example if you want to backup /home daily but only backup the system
 files (/etc, /boot etc.) on a weekly basis.

 This sounds like I would need two hostname.pl files but how would that
 be handled.

Yep, two different hostname.pl files, each using the same
$Conf{ClientNameAlias} setting pointing to the actual hostname.
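A minimal sketch of what the two config files could look like (the host
names, share lists, and periods here are illustrative, not from the thread):

```perl
# /etc/backuppc/myhost-home.pl -- daily backups of /home
$Conf{ClientNameAlias} = 'myhost';        # real DNS name of the client
$Conf{RsyncShareName}  = ['/home'];
$Conf{FullPeriod}      = 0.97;            # a full roughly every day

# /etc/backuppc/myhost-system.pl -- weekly backups of system files
$Conf{ClientNameAlias} = 'myhost';
$Conf{RsyncShareName}  = ['/etc', '/boot'];
$Conf{FullPeriod}      = 6.97;            # a full roughly every week
```

Both entries are added to the hosts file as if they were separate machines;
ClientNameAlias makes them both resolve to the same client.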


Regards,

Tony

 Regards,

 Steve



-- 

Dept. of Comp. Sci.
University of Limerick.



[BackupPC-users] Did backuppc/rsync end up finishing successfully?

2009-11-18 Thread Jeffrey J. Kosowsky
One of my recent backup logs had a single error as follows:
...
create d 700  1005/513   0 spoolerlogs
Remote[1]: rsync error: timeout in data send/receive (code 30) at 
/home/lapo/packaging/rsync-3.0.6-1/src/rsync-3.0.6/io.c(200) [sender=3.0.6]
Read EOF: 

Then the log proceeds:
Tried again: got 0 bytes
delete   700  1005/513    1024 Documents and Settings\networking\ntuser.dat.LOG
delete   700  1005/513  262144 Documents and Settings\networking\ntuser.dat
Child is aborting
Done: 969 files, 587612347 bytes


So my questions are as follows:
1. What does the Read EOF line mean, and is it likely to be a cause of,
   an effect of, or unrelated to the previous rsync timeout?

2. When the log says Tried again: got 0 bytes and proceeds to list
   two more files before aborting, does that mean that the backup
   resumed and completed successfully? If so, were any files likely to
   be lost?

3. When it says Child is aborting, is that a normal completion, or is
   there an issue?

Thanks



Re: [BackupPC-users] Problem with backuppc

2009-11-18 Thread Tyler J. Wagner

On Wednesday 18 November 2009 18:03:01 Tino Schwarze wrote:
 To answer the original question: It is not currently possible to have
 BackupPC use different storages. The whole data has to reside on one
 file system. How to create a file system spanning multiple disks/RAIDs
 and how to manage that is an operating system issue.

If you need different storage for different hosts, the solution is to run
multiple BackupPC installs.

Regards,
Tyler

-- 
Once is happenstance. Twice is coincidence. The third time it's enemy
action.
   -- Auric Goldfinger



[BackupPC-users] Suggestion for improvement of BackupPC_dump

2009-11-18 Thread Matthias Meyer
Problem:

It isn't unusual for internet connections to drop once a day; a lot of
providers do that. So it also isn't unusual for a full backup to abort and
be saved as a partial backup. If the next full backup aborts too, it is only
kept if it contains more files than the previous partial.
If a big file is added on the client after a partial backup has been stored,
the transfer of this new file during the next full run can take a lot of
time. If that full backup aborts as well, it may contain fewer files than
the previous backup. It will then be discarded, and this big file has to be
retransmitted in the next full run.

On average I can transmit 6 GB per day, depending on the client's internet
upload bandwidth. So it isn't unlikely that I will never manage to back up
such a client, because of the above problem.
Please don't suggest $Conf{PartialAgeMax}. That would turn a reliable
backup into a game of luck ;-) (I know that's not really true, but I don't
believe it would be a reliable solution.)

Solution:

Today, BackupPC_dump compares the file counts and decides whether the old
partial backup must be removed. If BackupPC_dump removes the old partial, it
then renames the $TopDir/pc/$client/new directory to $TopDir/pc/$client/nnn.
Otherwise it only removes $TopDir/pc/$client/new.

Instead, BackupPC_dump should always move $TopDir/pc/$client/new over
$TopDir/pc/$client/nnn, overwriting existing files and creating new files
in $TopDir/pc/$client/nnn. The NewFileList contains all new files anyway,
so BackupPC_link should still do its job correctly.

Advantage:

+ All transmitted files are kept in the current backup, regardless of
  whether the full backup aborts after a short or a long time.
+ The continuation (strictly speaking, the 2nd or later continuation) of a
  partial backup does not have to retransmit files that were already
  transmitted.

Disadvantage:

None known.
Such a full backup can take more than a few days, and it is not certain
which version of a changed file ends up in the backup. But the duration of
a [full] backup isn't really controllable anyway, so even without the above
improvement we don't know which version of a file is in the backup.
E.g.: if we use snapshots and a file is deleted after the start of a backup,
it will still be stored in the backup. If we don't use snapshots, that isn't
certain; it depends on when the file was deleted. If we use multiple shares,
the same uncertainty applies.

There is no difference between the current BackupPC_dump implementation and
the above suggestion, except fewer retransmits.

What is your opinion on this?

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Did backuppc/rsync end up finishing successfully?

2009-11-18 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 One of my recent backup logs had a single error as follows:
 ...
 create d 700  1005/513   0 spoolerlogs
 Remote[1]: rsync error: timeout in data send/receive (code 30) at
 /home/lapo/packaging/rsync-3.0.6-1/src/rsync-3.0.6/io.c(200)
 [sender=3.0.6] Read EOF:
 
 Then the log proceeds:
   Tried again: got 0 bytes
   delete   700  1005/513    1024 Documents and Settings\networking\ntuser.dat.LOG
   delete   700  1005/513  262144 Documents and Settings\networking\ntuser.dat
   Child is aborting
   Done: 969 files, 587612347 bytes
 
 
 So my questions are as follows:
 1. What does the Read EOF line mean and is it likely to be the cause
or effect (or unrelated) to the previous rsync timeout?

Related to the previous rsync timeout.

 
 2. When the log says Tried again: got 0 bytes and proceeds to list
two more files before aborting, does that mean that the backup
resumed and completed successfully? If so, were any files likely to
be lost?

No. The backup was not completed; maybe you have at least a partial backup.
In any case, the two files listed were deleted (within your backup) because
their transmission was not finished.
 
 3. When it says Child is aborting, is that a normal completion or is
there an issue?
 

Your Windows client cancelled the connection. The reason can be the provider
or some other network problem; theoretically also a firewall or antivirus,
but I haven't seen that so far.

br
Matthias
-- 
Don't Panic




Re: [BackupPC-users] Was BackupPC_trashClean ever fixed not to hog CPU

2009-11-18 Thread Les Mikesell
Jeffrey J. Kosowsky wrote:
 In the past, people have mentioned that sometimes BackupPC_trashClean
 seems to get hung and hog almost 100% of CPU.
 
 This is quite insidious since what apparently happens is that when
 trashClean has trouble removing a file(s), it just continues to loop
 through trying.

Filesystem problem?

 However, no error message is posted to the log or the web
 interface. In my case, it seems that this had been going on for days
 without any warning.

Can you use strace -p <pid> to see which system call is happening?

 Also, Craig mentioned in the past that this could be due to permission
 error and suggested running sudo -u backuppc rm -rv trash/* to
 check. Well in my case, the 'rm' succeeded suggesting that the problem
 may lie elsewhere.
 
 But again, I think there are two issues that need to be fixed:
 1. trash_Clean should not be allowed to go into a tight endless loop
 (I think Craig had mentioned inserting a wait in the loop but not sure
 if it ever was done)
 2. Also, errors in trash removal should be logged and added to the
 regular email error messages that are sent out.

If backuppc owns the directory, an unlink can't fail.

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Was BackupPC_trashClean ever fixed not to hog CPU

2009-11-18 Thread Jeffrey J. Kosowsky
Jeffrey J. Kosowsky wrote at about 15:39:44 -0500 on Wednesday, November 18, 
2009:
  In the past, people have mentioned that sometimes BackupPC_trashClean
  seems to get hung and hog almost 100% of CPU.
  
  This is quite insidious since what apparently happens is that when
  trashClean has trouble removing a file(s), it just continues to loop
  through trying.
  
  However, no error message is posted to the log or the web
  interface. In my case, it seems that this had been going on for days
  without any warning.
  
  Also, Craig mentioned in the past that this could be due to permission
  error and suggested running sudo -u backuppc rm -rv trash/* to
  check. Well in my case, the 'rm' succeeded suggesting that the problem
  may lie elsewhere.
  
  But again, I think there are two issues that need to be fixed:
  1. trash_Clean should not be allowed to go into a tight endless loop
  (I think Craig had mentioned inserting a wait in the loop but not sure
  if it ever was done)
  2. Also, errors in trash removal should be logged and added to the
  regular email error messages that are sent out.
  

Let me take some of the above back.
Looking at the code for BackupPC_trashClean, I see that it runs every 5
minutes and that it does log on failure.

However, in my case it seems to have somehow gotten 'stuck'. The last log
entry for trashClean was 9 days ago and just gave the startup message:
2009-11-09 22:38:11 Running BackupPC_trashClean (pid=17217)
So it was running for the last 9 days but neither removed the trash nor
logged an error, and it hogged almost all my CPU.

Restarting BackupPC seems to have removed everything in 'trash' and fixed
the problem, at least for now, but I'm not sure what went wrong to cause
this.



Re: [BackupPC-users] Suggestion for improvement of BackupPC_dump

2009-11-18 Thread Christian Völker

Hi,

to be honest, I don't have a clue whether your suggested solution would fix
this or break something else.
But what I do know is that I had the same issue. I solved it by upgrading
the line to a leased line ;)
But on other clients I still back up over broadband lines with cut-offs
after 24 hrs. :-(

So if this is a solution, I'd second implementing it in a future release,
just from my point of view as a user :-)

Greets

Christian





Re: [BackupPC-users] removing an old backup

2009-11-18 Thread Matthias Meyer
Jeffrey J. Kosowsky wrote:

 There is a bash script written by Matthias Meyer (and available by
 googling on the web).
 I think called BackupPC_deleteBackup (or maybe that's just what I
 called it ;)
 
No, that name came from me :-)
I have an updated version of this script; it includes your patch and can
also remove an entire host.

Unfortunately I can't put it into the wiki:
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups
I don't know how :-(

Can I mail the script to someone who can put it into the wiki?

br
Matthias
-- 
Don't Panic




[BackupPC-users] Incremental Errors

2009-11-18 Thread Heath Yob

I'm getting a new error with my incremental backups.
When the incremental runs on its own, I sometimes get the following error:

2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
(backup #11) for share C$
2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
(backup #11) for share C$
2009-11-18 16:04:51 Got fatal error during xfer (No backup directory
/var/lib/backuppc/pc/spangler-dc7900/new)
2009-11-18 16:04:51 incr backup 12 complete, 78 files, 1244034627
bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)
2009-11-18 16:04:51 removing incr backup 5
2009-11-18 16:04:56 Backup aborted (No backup directory
/var/lib/backuppc/pc/spangler-dc7900/new)

If I run the incremental manually there are no errors and it runs just
fine. Has anyone else run into this?


Thanks,

heath


Re: [BackupPC-users] Incremental Errors

2009-11-18 Thread Michael Stowe
 I'm getting an new error with my incremental.
 It would seem when the incremental runs on it's own I sometimes get
 the following error:
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:04:51 Got fatal error during xfer (No backup directory /
 var/lib/backuppc/pc/spangler-dc7900/new)
 2009-11-18 16:04:51 incr backup 12 complete, 78 files, 1244034627
 bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)
 2009-11-18 16:04:51 removing incr backup 5
 2009-11-18 16:04:56 Backup aborted (No backup directory /var/lib/
 backuppc/pc/spangler-dc7900/new)
 If I run the incremental manually there are no errors and it runs just
 fine.  Anyone else run into this?

No, but I'd be pretty concerned about your /var/lib/backuppc/.../new
directory apparently going away in the middle of a backup.  You should
probably check for filesystem corruption.
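If you do want to check, a cautious sketch (the device name, mount point,
and init-script path are assumptions; adjust for your system):

```shell
# Stop BackupPC so nothing writes to the pool, then fsck the filesystem.
/etc/init.d/backuppc stop
umount /var/lib/backuppc
fsck -f /dev/sdb1          # force a full check even if marked clean
mount /var/lib/backuppc    # assumes an /etc/fstab entry for this path
/etc/init.d/backuppc start
```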



Re: [BackupPC-users] Incremental Errors

2009-11-18 Thread Adam Goryachev

Michael Stowe wrote:
 I'm getting an new error with my incremental.
 It would seem when the incremental runs on it's own I sometimes get
 the following error:
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:04:51 Got fatal error during xfer (No backup directory /
 var/lib/backuppc/pc/spangler-dc7900/new)
 2009-11-18 16:04:51 incr backup 12 complete, 78 files, 1244034627
 bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)
 2009-11-18 16:04:51 removing incr backup 5
 2009-11-18 16:04:56 Backup aborted (No backup directory /var/lib/
 backuppc/pc/spangler-dc7900/new)
 If I run the incremental manually there are no errors and it runs just
 fine.  Anyone else run into this?
 
 No, but I'd be pretty concerned about your /var/lib/backuppc/.../new
 directory apparently going away in the middle of a backup.  You should
 probably check for filesystem corruption.

Or work out why you have two backups starting for the same host at the
same time!

Regards,
Adam

- --
Adam Goryachev
Website Managers
www.websitemanagers.com.au



Re: [BackupPC-users] Incremental Errors

2009-11-18 Thread Heath Yob
Yeah, I saw that. I wasn't sure why that entry is in there twice.
I'll look at my schedule and see what's up.

Sent from my iPhone

On Nov 18, 2009, at 5:13 PM, Adam Goryachev 
mailingli...@websitemanagers.com.au 
  wrote:


 Michael Stowe wrote:
 I'm getting an new error with my incremental.
 It would seem when the incremental runs on it's own I sometimes get
 the following error:
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:00:00 incr backup started back to 2009-11-17 15:00:01
 (backup #11) for share C$
 2009-11-18 16:04:51 Got fatal error during xfer (No backup  
 directory /
 var/lib/backuppc/pc/spangler-dc7900/new)
 2009-11-18 16:04:51 incr backup 12 complete, 78 files, 1244034627
 bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)
 2009-11-18 16:04:51 removing incr backup 5
 2009-11-18 16:04:56 Backup aborted (No backup directory /var/lib/
 backuppc/pc/spangler-dc7900/new)
 If I run the incremental manually there are no errors and it runs  
 just
 fine.  Anyone else run into this?

 No, but I'd be pretty concerned about your /var/lib/backuppc/.../new
 directory apparently going away in the middle of a backup.  You  
 should
 probably check for filesystem corruption.

 Or work out why you have two backups starting for the same host at the
 same time!

 Regards,
 Adam

 - --
 Adam Goryachev
 Website Managers
 www.websitemanagers.com.au
 -BEGIN PGP SIGNATURE-
 Version: GnuPG v1.4.9 (GNU/Linux)
 Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

 iEYEARECAAYFAksEm7oACgkQGyoxogrTyiWXzACgzKy0XNzaNWz02AP6cmlAARZZ
 FfkAoJQwe4aXmssV56n+Gvo+1WRaINJN
 =dyEV
 -END PGP SIGNATURE-

 --- 
 --- 
 --- 
 -
 Let Crystal Reports handle the reporting - Free Crystal Reports 2008  
 30-Day
 trial. Simplify your report design, integration and deployment - and  
 focus on
 what you do best, core application coding. Discover what's new with
 Crystal Reports now.  http://p.sf.net/sfu/bobj-july
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/

