[BackupPC-users] backing up localhost

2007-05-13 Thread M. Sabath
I set up backuppc.

It works with the clients but not with localhost.

Only empty folders and a few files get backed up.


The command:
/usr/share/backuppc/bin/BackupPC_dump -v -f localhost 


comes up with entries like:

tarExtract: Can't link /var/lib/backuppc/pc/localhost/new/f%2fetc/fhdparm.conf
to /var/lib/backuppc/pool/1/d/5/1d5a89be4b5f331ad4eaedd493e5eeff


I tried:
http://backuppc.sourceforge.net/faq/localhost.html

but it doesn't help. The folders stay empty.
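One common cause of "Can't link ... to pool" errors like the one above is the pool and pc trees sitting on different filesystems: BackupPC pools identical files via hardlinks, and hardlinks cannot cross filesystems. A quick check, sketched here in Python with the Debian/Ubuntu default paths (an assumption; adjust for your installation):

```python
import os

def same_filesystem(path_a, path_b):
    """True if both paths are on the same device, i.e. hardlinks
    between them are possible."""
    return os.stat(path_a).st_dev == os.stat(path_b).st_dev

# Debian/Ubuntu default locations for the BackupPC pool and pc trees.
pool, pc = "/var/lib/backuppc/pool", "/var/lib/backuppc/pc"
if os.path.isdir(pool) and os.path.isdir(pc):
    if same_filesystem(pool, pc):
        print("pool and pc share a filesystem; hardlinks should work")
    else:
        print("pool and pc are on different filesystems; linking will fail")
```

Permissions on the pool directories (they must be writable by the backuppc user) are the other usual suspect.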


I use Ubuntu 7.04 with BackupPC 2.1.2
(I have activated the root user).


Thank you

Markus


-
This SF.net email is sponsored by DB2 Express
Download DB2 Express C - the FREE version of DB2 express and take
control of your XML. No limits. Just data. Click to get it now.
http://sourceforge.net/powerbar/db2/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


[BackupPC-users] BackupPC presentation online

2007-05-13 Thread Ski Kacoroski
Go to http://video.google.com and search for BackupPC.  This is a
presentation I gave at LinuxfestNW 2007.

Appreciate your comments and corrections (I am sure there are some :).

cheers,

ski

-- 
When we try to pick out anything by itself, we find it
 connected to the entire universe. -- John Muir

Chris "Ski" Kacoroski, [EMAIL PROTECTED], 206-501-9803



[BackupPC-users] API for full size?

2007-05-13 Thread Keith Edmunds
Is there a way for a script to retrieve the Full Size/GB as reported on
the Host Summary page? I want to write a script that checks that users
haven't exceeded their backup quota.

Thanks for any pointers -
Keith



[BackupPC-users] Status of rsync backup

2007-05-13 Thread Jason M. Kusar
Is there any way to tell the status of an on-going rsync backup?  
Ideally, I'd like to see how many files have been backed up, how many are 
left, how many were skipped because they're already there, etc.  I know 
this is possible with the command line rsync via the --progress option, 
but I'm not sure about File::RsyncP.  Also, I want to be able to view it 
while the backup is still happening.  I have a large backup that is 
taking several days to complete and I want to be able to see what is 
going on.  Also, is there any way to view the log file as it's being 
created?  I tried using zcat, but that didn't work; 'file' reported it 
as simply being data, so I'm not sure what type of compression it's even 
using.  If that is not possible, is there a way to have it log the file 
un-compressed and just compress it once the backup is complete since 
then it will be viewable in the web interface?
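On the log question: BackupPC compresses its logs in its own format, which is why plain zcat and 'file' draw a blank; the bundled BackupPC_zcat utility in BackupPC's bin directory can dump them from the command line. A tiny helper, sketched in Python; the bin path is the Debian/Ubuntu default and is an assumption:

```python
import subprocess

# Debian/Ubuntu default; adjust for your installation.
BACKUPPC_BIN = "/usr/share/backuppc/bin"

def zcat_command(logfile, bindir=BACKUPPC_BIN):
    """Build the argv list for dumping a BackupPC-compressed log file."""
    return [bindir + "/BackupPC_zcat", logfile]

def dump_log(logfile):
    """Run BackupPC_zcat and return the decompressed log text."""
    return subprocess.run(zcat_command(logfile),
                          capture_output=True, text=True).stdout
```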

Thanks,
--Jason



Re: [BackupPC-users] API for full size?

2007-05-13 Thread Holger Parplies
Hi,

Keith Edmunds wrote on 14.05.2007 at 00:00:33 [[BackupPC-users] API for full 
size?]:
 Is there a way for a script to retrieve the Full Size/GB as reported on
 the Host Summary page?

yes, as the Host Summary page does :-). See BackupPC::CGI::Summary.

Setting XEmacs to quick-hack-mode, I get something like this:


-*- quick-hack -*-
#!/usr/bin/perl

use strict;

use lib '/usr/share/backuppc/lib'; # adapt to match your installation
use BackupPC::Lib;

my $hostname = shift
  or die "Usage: $0 hostname\n";

my $bpc = new BackupPC::Lib
  or die "Can't create BackupPC object!\n";

my @backups = $bpc -> BackupInfoRead ($hostname)
  or die "Invalid hostname '$hostname' or other error!\n";

for (my $i = @backups - 1; $i >= 0; $i --) {
  if ($backups [$i] {type} eq 'full') {
    printf "%.2f\n", $backups [$i] {size} / 1024 / 1024 / 1000;
    last;
  }
}
-*- /quick-hack -*-


Note that
- you need to run that as the backuppc user for access to the information,
- you only get the number as output, formatted as in the host summary page,
- the number is in units of 1000 MB, where 1 MB is 2^20 Bytes - I don't
  know if I've seen *that* definition of GB somewhere before ;-),
- no error checking, no guarantees etc.

 I want to write a script that checks that users
 haven't exceeded their backup quota.

If you want to iterate over all hosts belonging to one user, look at
GetUserHosts in BackupPC::CGI::Lib ...

-*- quick-hack -*-
$hostinfo = $bpc -> HostInfoRead ();
@hosts = sort grep { $hostinfo -> {$_} {user} eq $user ||
 $hostinfo -> {$_} {moreUsers} =~ /(?:^|,)$user(?:,|$)/ }
  keys %$hostinfo;
-*- /quick-hack -*-

(you might consider "belonging to one user" to mean leaving out the
moreUsers check).

Regards,
Holger



[BackupPC-users] Partial backups

2007-05-13 Thread Jason M. Kusar
Ok, now that I know how to view log files from the command line (guess I 
should check the bin dir first from now on :-), it looks like when a 
partial backup is saved, it only contains individual shares that were 
actually backed up completely.  I have a server with four separate 
partitions that I back up separately using the rsync --one-file-system 
parameter.  The first one completed and the second one failed about
halfway through.  The saved partial seems only to include the first 
partition and not the partial backup of the second.  This seems to be 
confirmed by the fact that the log file for the currently running full 
includes lines saying that files are being created that I know were 
backed up in the first run.  So my question is this:

What happens to those files that rsync transferred before the backup 
aborted?  Is there any way to have the next run of rsync start from that 
base so that they don't need to be transferred over again?

If I am looking at this wrong, let me know.  I may be missing something 
in the log files.

Thanks,
--Jason



Re: [BackupPC-users] SSH Tunnel HOWTO for BackupPC

2007-05-13 Thread Francis Lessard
Hi Johan,

Here is the output of ./BackupPC_dump -v -f myhost:

Results

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
cmdSystemOrEval: finished: got output PING localhost (127.0.0.1) 56(84)
bytes of data.
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.048 ms

--- localhost ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.048/0.048/0.048/0.000 ms

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
cmdSystemOrEval: finished: got output PING localhost (127.0.0.1) 56(84)
bytes of data.
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.046 ms

--- localhost ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.046/0.046/0.046/0.000 ms

CheckHostAlive: returning 0.046
Executing DumpPreUserCmd: /etc/BackupPC/ssh-wrapper -p 3022 -f -L
7001:internalremoteip:873 [EMAIL PROTECTED] sleep 20
cmdSystemOrEval: about to system /etc/BackupPC/ssh-wrapper -p 3022 -f -L
7001:internalremoteip:873 [EMAIL PROTECTED] sleep 20
cmdSystemOrEval: finished: got output
full backup started for directory fortune
started full dump, share=hidden
Error connecting to rsync daemon at localhost:7001: inet connect: Connection
refused
Got fatal error during xfer (inet connect: Connection refused)
cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
cmdSystemOrEval: finished: got output PING localhost (127.0.0.1) 56(84)
bytes of data.
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.049 ms

--- localhost ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.049/0.049/0.049/0.000 ms

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
cmdSystemOrEval: finished: got output PING localhost (127.0.0.1) 56(84)
bytes of data.
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.047 ms

--- localhost ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.047/0.047/0.047/0.000 ms

CheckHostAlive: returning 0.047
Backup aborted (inet connect: Connection refused)
dump failed: inet connect: Connection refused


I have not managed to see what 'exact' command is sent after the wrapper...
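The "Connection refused" on localhost:7001 means nothing was listening there at the moment BackupPC tried the rsync connection, e.g. the forward from DumpPreUserCmd never came up or the backgrounded ssh had already exited. As a quick sanity check (a sketch, not from the original thread), the forwarded port can be probed like this in Python:

```python
import socket

def port_open(host, port, timeout=3.0):
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the tunnel's local end right after DumpPreUserCmd should have set it up.
print(port_open("localhost", 7001))
```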

Thanks for your support !

Francis





-Original Message-
From: Johan Ehnberg [mailto:[EMAIL PROTECTED] 
Sent: 10 May 2007 09:59
To: Francis Lessard
Cc: 'Craig Barratt'; 'BackupPC Users'
Subject: Re: [BackupPC-users] SSH Tunnel HOWTO for BackupPC

Hi,

Can you try running the Dump command manually and post me the output? 
The documentation tells you how.

Changing the SSH port will not mess with anything, as long as you have 
the same commands in BackupPC and on the command line.

I want to see your tunnel working, so hang in there :).

/johan

Francis Lessard wrote:
 Hi Johan,
 
 Your document on how to use SSH tunneling with BackupPC is brilliant. I
tried it and it works in testing, but not in BackupPC.
 In my shell, logged in as backuppc (I replaced internalip, username,
gateway):
 
[EMAIL PROTECTED]:/home$ /etc/BackupPC/ssh-wrapper -p 3022 -f -L
 7001:internalip:873 [EMAIL PROTECTED] sleep 20
 
 works well: SSH started successfully
 
 Afterwards, ONLY this command has worked:
 
[EMAIL PROTECTED]:/home$ rsync --port=7001
 [EMAIL PROTECTED]::myrsyncservice
 
 I tried to use the --port=7001 argument in the BackupPC CGI plus several
 combos, without success. I have not found in the logs the complete rsync
 command that BackupPC sends; maybe that could help me debug... My only hint
 is the port 3022 I use instead of the standard port 22 on the SSH gateway.
 Could that mix things up?
 
 Thank you for your opinion on that.
 
 
 Regards,
 
 Francis
 
 
 
 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Johan
 Ehnberg
 Sent: 7 May 2007 10:13
 To: Johan Ehnberg; Craig Barratt
 Cc: BackupPC Users
 Subject: Re: [BackupPC-users] SSH Tunnel HOWTO for BackupPC
 
 Johan Ehnberg wrote:
 Dear BackupPC users and developers,

 I have created a short HOWTO for BackupPC on using SSH tunnels 
 automatically with each job. This is the first version, but it should be 
 complete.

 The HTML version of the document can be found on:
 http://www.ehnberg.net/johan/files/docs/backuppc-ssh-tunnel-howto.html

 Please feel free to try it out and send comments. It can also be added 
 to the SSH FAQ later on, if it is useful.

 Best regards,
 Johan Ehnberg
 
 I have now been running this model for two weeks and it works like a 
 charm. I consider it production quality and it could be added to the 
 FAQ. It's mature software and a proven model - not much to prove there.
 
 If my HOWTO is too long, only section 4 could be merged, of course. Or 
 it could be a page of its own. In any case, it's free for use and 
 modification as long as my name is in it somewhere ;).
 
 BTW. I am going to change my whole backup system from another program to 
 BackupPC; I was simply