[BackupPC-users] BackupPC doesn't backup anymore

2008-07-28 Thread Diederik De Deckere

Hi,

I've been running BackupPC for a long time.  We shut down our backup server
for a week, and now BackupPC doesn't perform any scheduled backups.
There are no errors, and we can still start a backup manually.  All hosts are
marked green with status 'idle' and last attempt 'done'.  I've been
googling a little but can't find a solution.  What could be the cause?


thanks,
Diederik De Deckere





[BackupPC-users] BackupPC not cleaning out cpool

2008-07-28 Thread Joseph Holland
So I'm using BackupPC version 3.0, and in the last couple of weeks our
disk usage has gone through the roof.  We're currently at 96% full on our
750GB disks.  The BackupPC web interface says it's only using 158GB in
the pool, but a du -smh on the cpool directory comes back with over
500GB.  The BackupPC trash cleaning seems to be working (you can see that
it cleaned 4GB from the pool last night).

Has anyone ever seen anything like this before?


Joe.
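
A couple of commands that might help narrow down where the space is going (a
rough sketch, assuming the default /var/lib/backuppc layout; substitute your
own $Conf{TopDir}):

    df -h /var/lib/backuppc          # what the filesystem itself reports
    du -sh /var/lib/backuppc/cpool   # compressed pool as seen on disk right now
    du -sh /var/lib/backuppc/pc      # per-host trees; files that never got linked
                                     #   into the pool are counted here as extra copies

Keep in mind that the pool figures on the status page are only refreshed by
the nightly run, so they can lag behind what du sees.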



[BackupPC-users] sync with bittorrent

2008-07-28 Thread Daniel Denson
I recently read a tip on Lifehacker about checking and fixing downloaded
ISO media with BitTorrent.  BitTorrent is designed for downloading small
incremental parts and organizing that data, which could make it a nice fit
for remote filesystem syncing with any filesystem that can do readable
snapshots.

Consider making an LVM snapshot and then a torrent file for it.  Set up
your BackupPC server as a BitTorrent tracker, send the torrent to the
remote machine, and run it with rtorrent or some other CLI torrent client.

I'm going to time torrent creation on my filesystem and will report back
with the results.

thanks
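
For what it's worth, a minimal sketch of that idea, with made-up names
throughout (a volume group vg0 holding a logical volume data, mktorrent as the
torrent-creation tool, and the tracker assumed to live on the BackupPC server):

    lvcreate --snapshot --size 5G --name data-snap /dev/vg0/data
    mkdir -p /mnt/data-snap && mount -o ro /dev/vg0/data-snap /mnt/data-snap
    mktorrent -a http://backuppc-server:6969/announce \
              -o /tmp/data-snap.torrent /mnt/data-snap
    # publish the .torrent, fetch it on the remote machine with rtorrent,
    # then drop the snapshot once the transfer has finished:
    umount /mnt/data-snap && lvremove -f /dev/vg0/data-snap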



Re: [BackupPC-users] sync with bittorrent

2008-07-28 Thread Daniel Denson
1m5.6s to add 6159 files, or about 0.01 seconds per file.  The torrent is 200K.
I did a single directory; I'll try the whole server next.

Daniel Denson wrote:
 I recently read a tip on Lifehacker about checking and fixing downloaded
 ISO media with BitTorrent.  BitTorrent is designed for downloading small
 incremental parts and organizing that data, which could make it a nice
 fit for remote filesystem syncing with any filesystem that can do
 readable snapshots.

 Consider making an LVM snapshot and then a torrent file for it.  Set up
 your BackupPC server as a BitTorrent tracker, send the torrent to the
 remote machine, and run it with rtorrent or some other CLI torrent client.

 I'm going to time torrent creation on my filesystem and will report back
 with the results.

 thanks




Re: [BackupPC-users] sync with bittorrent

2008-07-28 Thread Daniel Denson
Too many files for BitTorrent to handle; unfortunately the fileset needs
to be wrapped up with tar.  Also, BitTorrent won't handle the hardlinks, so
again it needs to be wrapped up in tar.

Oh well, it was a thought.
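
Roughly what that wrapping step would look like, continuing the made-up names
from the earlier sketch (tar records hardlinks internally, so the torrent only
has to carry one large file):

    tar -cf /tmp/data-snap.tar -C /mnt/data-snap .
    mktorrent -a http://backuppc-server:6969/announce \
              -o /tmp/data-snap.torrent /tmp/data-snap.tar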

Daniel Denson wrote:
 1m5.6s to add 6159 files, or about 0.01 seconds per file.  The torrent is 200K.
 I did a single directory; I'll try the whole server next.

 Daniel Denson wrote:
 I recently read a tip on Lifehacker about checking and fixing downloaded
 ISO media with BitTorrent.  BitTorrent is designed for downloading small
 incremental parts and organizing that data, which could make it a nice
 fit for remote filesystem syncing with any filesystem that can do
 readable snapshots.

 Consider making an LVM snapshot and then a torrent file for it.  Set up
 your BackupPC server as a BitTorrent tracker, send the torrent to the
 remote machine, and run it with rtorrent or some other CLI torrent client.

 I'm going to time torrent creation on my filesystem and will report back
 with the results.

 thanks





Re: [BackupPC-users] sync with bittorrent

2008-07-28 Thread Martin Leben
Daniel Denson wrote:
 I recently read a tip on Lifehacker about checking and fixing downloaded
 ISO media with BitTorrent.  BitTorrent is designed for downloading small
 incremental parts and organizing that data, which could make it a nice
 fit for remote filesystem syncing with any filesystem that can do
 readable snapshots.

 Consider making an LVM snapshot and then a torrent file for it.  Set up
 your BackupPC server as a BitTorrent tracker, send the torrent to the
 remote machine, and run it with rtorrent or some other CLI torrent client.


Hmm... Have I understood you correctly that what you want to achieve is a sync
of a large file set without the huge memory overhead of rsync?

/Martin




Re: [BackupPC-users] sync with bittorrent

2008-07-28 Thread Daniel Denson

yes

Martin Leben wrote:

Daniel Denson wrote:
  
I recently read a tip on Lifehacker about checking and fixing downloaded
ISO media with BitTorrent.  BitTorrent is designed for downloading small
incremental parts and organizing that data, which could make it a nice
fit for remote filesystem syncing with any filesystem that can do
readable snapshots.

Consider making an LVM snapshot and then a torrent file for it.  Set up
your BackupPC server as a BitTorrent tracker, send the torrent to the
remote machine, and run it with rtorrent or some other CLI torrent client.


Hmm... Have I understood you correctly that what you want to achieve is a sync
of a large file set without the huge memory overhead of rsync?


/Martin




Re: [BackupPC-users] Howto backup BackupPC running on a RAID1 with mdadm for offline-storage

2008-07-28 Thread Les Mikesell
Daniel Denson wrote:
 I think you missed the point on sending zfs snapshots.  *If* the remote
 filesystem is in sync with the snapshot you have, then it should work,
 but what if that is not the case?

Don't do the next one until the previous has succeeded.  All you need is 
a status returned for the transfer and some simple scripting for that.

 If the remote filesystem is not
 synced with any specific snapshot, then there is no base-level target to
 snapshot from, so the new smaller snapshot cannot be created because a
 common sync point is not known.

Don't discard your previous snapshots at either end until the update 
succeeds.

 You would first have to make sure that
 the remote filesystem matched a snapshot on the local system and then
 proceed if it does.  If it does not, then what?  Run a 100+GB complete
 sync over a 1.5Mb link?

Don't let that happen.  If an incremental doesn't succeed, revert the 
target and start that update over again.
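
A minimal sketch of that scripted, status-checked approach, with hypothetical
dataset and host names (zfs send -i produces the increment, and the older
snapshot is only destroyed once the receive reports success):

    zfs snapshot tank/backuppc@today
    if zfs send -i tank/backuppc@yesterday tank/backuppc@today \
         | ssh backup-remote zfs receive -F tank/backuppc
    then
        zfs destroy tank/backuppc@yesterday   # @today is now the common base
    else
        echo "incremental send failed; keeping @yesterday and retrying later" >&2
    fi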

 There is the problem: what if the remote system
 falls out of sync because of a deleted snapshot or missed sync?  How
 will you check to see whether the remote filesystem and local snapshot
 are the same?

If the incremental update status says it succeeds, I'd expect it to have 
actually succeeded.  I thought that zfs had its own internal 
checksumming concepts to ensure that it is consistent.  In any case, if 
you felt you had to do a check you could probably do an 'rsync -n' on 
the pool and each pc directory separately.  You don't need to 
reconstruct the hardlinks in this case, just verify that the contents 
exist and match - so you don't take the hit of having to get all the 
filenames loaded in one run.
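
The per-directory dry-run check might look something like this (hypothetical
remote host name and the default pool path; -n means nothing is actually
transferred, and -i itemizes the differences):

    rsync -ani /var/lib/backuppc/cpool/ backup-remote:/var/lib/backuppc/cpool/
    for d in /var/lib/backuppc/pc/*/ ; do
        rsync -ani "$d" "backup-remote:$d"
    done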

 I think the intent of the zfs send is just to migrate a filesystem from 
 one system to another.

Then why would it have an incremental mode?  The real question is
whether it is efficient enough in handling hardlinks to make it worth
scripting the transfers so the updates are applied consistently.

 As far as rsync is concerned, I think you need a ton of RAM and a fast
 CPU to make large fileset transfers work, which I have.  I doubt a 1GB
 P3 1GHz is going to cut it.

Hmmm, you probably need a 64-bit OS too, so you can use that RAM in one
process.

-- 
   Les Mikesell
[EMAIL PROTECTED]




Re: [BackupPC-users] Howto backup BackupPC running on a RAID1 with mdadm for offline-storage

2008-07-28 Thread Daniel Denson
I am just using 32-bit Ubuntu with 4GB (3.4GB available) and it is working
nicely.


   As far as rsync is concerned, I think you need a ton of RAM and a fast
   CPU to make large fileset transfers work, which I have.  I doubt a
   1GB P3 1GHz is going to cut it.

   Hmmm, you probably need a 64-bit OS too, so you can use that RAM in
   one process.




Re: [BackupPC-users] BackupPC doesn't backup anymore

2008-07-28 Thread Les Mikesell
Diederik De Deckere wrote:
 Hi,
 
 I've been running BackupPC for a long time.  We shut down our backup server
 for a week, and now BackupPC doesn't perform any scheduled backups.  There are
 no errors, and we can still start a backup manually.  All hosts are marked green
 with status 'idle' and last attempt 'done'.  I've been googling a
 little but can't find a solution.  What could be the cause?

The most likely reason for this is that you've consumed more than 95% of
the space on the filesystem that holds the pool.
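
A quick way to check, assuming the pool lives under /var/lib/backuppc and the
config is at /etc/BackupPC/config.pl (both paths vary by install); scheduled
backups stop once the pool filesystem goes over $Conf{DfMaxUsagePct}, which
defaults to 95%:

    df -h /var/lib/backuppc
    grep DfMaxUsagePct /etc/BackupPC/config.pl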

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] how to compare the active file system to the last backup

2008-07-28 Thread Rob Owens
Rsync has a couple of useful options: --dry-run and --ignore-existing.
I'm not sure whether the Perl rsync module supports those options, but if
it did, you could use them in your restore command (combined with
--verbose) to output a list of the files that rsync would restore if you
weren't using the --dry-run option.

Note that using --ignore-existing will cause rsync to miss any files on
your system that were corrupted by the crash but not completely lost.
Hmm, maybe --ignore-times would be more appropriate than
--ignore-existing if you're concerned about corrupted files...

-Rob
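
As a rough illustration of how those two flags behave with plain rsync (using
a hypothetical staged copy under /tmp/restore rather than the BackupPC restore
command itself; -n keeps it a dry run, so nothing is written):

    rsync -avn --ignore-existing /tmp/restore/etc/ /etc/   # reports only files missing from the live system
    rsync -avn --ignore-times    /tmp/restore/etc/ /etc/   # considers every file, even those whose size
                                                           #   and mtime already match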

Emilie Ann Phillips wrote:
 My system just crashed rather nastily and I would like to verify that
 fsck recovered everything properly.
 
 Is there a way to diff the mounted file system vs the latest backup
 without restoring the backup to a temporary location and doing the
 diff by hand?
 
 Emilie
 





[BackupPC-users] Fatal error during xfer

2008-07-28 Thread Steve Blackwell
I just installed BackupPC and I started by attempting to back up just my
local machine, which is also the BackupPC server.

Since this is a local machine I don't need ssh, so I set my TarClientCmd
to '$tarPath -c -v -f - -C $shareName+ --totals'.  This worked OK, but
some directories were not backed up because the backuppc user doesn't
have root privileges.  Next I set up sudo to allow backuppc to run tar
as root.  I su'd to backuppc to test and it worked fine.  My new
TarClientCmd is 'sudo $tarPath -c -v -f - -C $shareName+ --totals'.

The problem is that when backuppc runs a backup at the scheduled time,
I get this error:
 
Exec failed for sudo /bin/tar -c -v -f - -C / --totals
--newer=2008-07-27\ 17:56:09 . 
tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp,
0 filesTotal, 0 sizeTotal 
Got fatal error during xfer (Exec failed for sudo /bin/tar -c -v -f
- -C / --totals --newer=2008-07-27\ 17:56:09 .) 
Backup aborted (Exec failed for id; sudo /bin/tar -c -v -f
- -C / --totals --newer=2008-07-27\ 17:56:09 .)

Any suggestions?
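
The sudo setup described above would presumably be something along these lines
(added with visudo; the tar path is a guess for this host):

    # /etc/sudoers entry:
    #   backuppc ALL=(root) NOPASSWD: /bin/tar
    # quick test while su'd to backuppc:
    sudo /bin/tar --version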



Re: [BackupPC-users] Fatal error during xfer

2008-07-28 Thread Steve Blackwell
 Adam Goryachev [EMAIL PROTECTED] wrote: 
 Set the full path to sudo, e.g. /bin/sudo.
 
 Maybe that will help.
 
 Regards,
 Adam  

Thanks for the suggestion, but I still get the same error.

Steve.

 Steve Blackwell wrote:  
  I just installed BackupPC and I started by attempting to back up just
  my local machine, which is also the BackupPC server.

  Since this is a local machine I don't need ssh, so I set my
  TarClientCmd to '$tarPath -c -v -f - -C $shareName+ --totals'.  This
  worked OK, but some directories were not backed up because the
  backuppc user doesn't have root privileges.  Next I set up sudo to
  allow backuppc to run tar as root.  I su'd to backuppc to test and it
  worked fine.  My new TarClientCmd is 'sudo $tarPath -c -v -f - -C
  $shareName+ --totals'.
 
  The problem is that when backuppc runs a backup at the scheduled
  time, I get this error:
   
  Exec failed for sudo /bin/tar -c -v -f - -C / --totals
  --newer=2008-07-27\ 17:56:09 . 
  tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0
  sizeExistComp, 0 filesTotal, 0 sizeTotal 
  Got fatal error during xfer (Exec failed for sudo /bin/tar -c -v -f
  - -C / --totals --newer=2008-07-27\ 17:56:09 .) 
  Backup aborted (Exec failed for id; sudo /bin/tar -c -v -f
  - -C / --totals --newer=2008-07-27\ 17:56:09 .)
 
  Any suggestions?



Re: [BackupPC-users] Fatal error during xfer

2008-07-28 Thread Craig Barratt
Steve writes:

 Since this is a local machine I don't need ssh, so I set my TarClientCmd
 to '$tarPath -c -v -f - -C $shareName+ --totals'.  This worked OK, but
 some directories were not backed up because the backuppc user doesn't
 have root privileges.  Next I set up sudo to allow backuppc to run tar
 as root.  I su'd to backuppc to test and it worked fine.  My new
 TarClientCmd is 'sudo $tarPath -c -v -f - -C $shareName+ --totals'.

 The problem is that when backuppc runs a backup at the scheduled time,
 I get this error:
 
 Exec failed for sudo /bin/tar -c -v -f - -C / --totals
 --newer=2008-07-27\ 17:56:09 .

You have to use a full path for sudo.  BackupPC doesn't use a shell
to execute commands (to reduce potential security holes), so you
need a full path to the executable.

Craig
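
In other words, check where sudo actually lives and put that absolute path at
the front of the command (the /usr/bin location below is just the usual guess):

    which sudo        # e.g. /usr/bin/sudo
    # then in config.pl:
    #   $Conf{TarClientCmd} = '/usr/bin/sudo $tarPath -c -v -f - -C $shareName+ --totals';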
