Re: [BackupPC-users] free backup software

2008-08-04 Thread Alexandre - ArchivTech
Hi there,
maybe you should try this one (free for any purpose, but not
open source): http://www.syncexp.com/wp/

It seems to work great.

Hope this helps.


-  
Alexandre BLANC
Responsable SI - Ingénierie systèmes et réseaux
IT Manager - Information Systems Network Manager
Archiv'Tech
[EMAIL PROTECTED]
Mob : 06 72 133 811


On Sunday, August 3, 2008 at 20:55 -0400, drew64 wrote:
 I have been looking for some free backup software to back up my music and 
 photos to an external hard drive and maybe DVDs. I have tried Cobian but 
 don't know if I like it since it has no restore function. I have read some things 
 about WinBackup and was also told to try Comodo Backup. Does anyone have any 
 experience with these? Also, when I tried Cobian I set it to incremental 
 backup. It worked, but I see it makes a new folder every time a backup is made. 
 Is there a way, or a program, that will just back up and replace the file that 
 has changed into the same folder?
 
 
 
 
 




[BackupPC-users] free backup software

2008-08-04 Thread drew64

I have been looking for some free backup software to back up my music and
photos to an external hard drive and maybe DVDs. I have tried Cobian but don't
know if I like it since it has no restore function. I have read some things about
WinBackup and was also told to try Comodo Backup. Does anyone have any experience
with these? Also, when I tried Cobian I set it to incremental backup. It worked,
but I see it makes a new folder every time a backup is made. Is there a way, or a
program, that will just back up and replace the file that has changed into the
same folder?






[BackupPC-users] One PC Very Slow to Backup

2008-08-04 Thread naigy

Still doing some testing. Not sure of the cause, but one thing I do know is that
I did an incremental and a full backup from the command line and both of these
were done in the normal time of a bit over 2 hours. Will see how it goes.

When monitoring the bandwidth of data transferred from my PC, should I
see approximately the whole volume of the share in data traffic being sent from
the PC being backed up? It seems odd that this PC is doing this. For both
incremental and full backups it is transferring approx 7.5GB of data over the network.
I thought it was meant to be relatively conservative with data traffic
after the initial backup. Not that this bothers me too much, but I am just trying to
look at all possibilities as to the problem. My other PCs, including a Linux
file server, don't exhibit this same behaviour (at least for the incrementals).

Thanks all for your assistance to date. I will be in touch in the next few days
to let you know how things have gone.

And no, this is not related to antivirus (I don't believe in it on desktop PCs in
my environment, but that's another story), and there are no firewalls to worry about either,
as they are fully disabled for internal data traffic.






Re: [BackupPC-users] free backup software

2008-08-04 Thread Nils Breunese (Lemonbit)
drew64 wrote:

 I have been looking for some free backup software to back up my
 music and photos to an external hard drive and maybe DVDs. I have
 tried Cobian but don't know if I like it since it has no restore
 function. I have read some things about WinBackup and was also told to
 try Comodo Backup. Does anyone have any experience with these? Also, when
 I tried Cobian I set it to incremental backup. It worked, but I see
 it makes a new folder every time a backup is made. Is there a way, or
 a program, that will just back up and replace the file that has
 changed into the same folder?

I guess rsync would do it. It's like a copy tool that only transfers
the changes, and it can work both locally and over network connections.
It is a command line tool though. I don't know what OS you're running,
but I believe there are also some rsync GUI frontends out there.
Also take a look at rdiff-backup if you want to keep multiple versions
of files.
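For the use case you describe, an rsync invocation along these lines would
mirror a folder to the external drive, re-copying only files that have
changed (a minimal sketch; the source and destination paths are just
placeholders):

  rsync -av --delete /home/you/Music/ /media/external/Music/

Here -a preserves timestamps and permissions, -v lists what gets copied, and
--delete removes files from the destination that no longer exist in the
source, so the copy mirrors the current state instead of accumulating old
files.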

BackupPC also supports rsync as a backend; setting up a BackupPC server
for just one machine might be overkill, but it will work, and its web
interface is very handy for restoring files.

Nils Breunese.



[BackupPC-users] DumpPreUserCmd vs DumpPreShareCmd

2008-08-04 Thread Rob Owens
I'm not sure what the difference is between these two options, and I 
can't find it in the docs.  Could someone enlighten me?

-Rob








Re: [BackupPC-users] DumpPreUserCmd vs DumpPreShareCmd

2008-08-04 Thread Holger Parplies
Hi,

Rob Owens wrote on 2008-08-04 10:45:07 -0400 [[BackupPC-users] DumpPreUserCmd 
vs DumpPreShareCmd]:
 I'm not sure what the difference is between these two options, and I 
 can't find it in the docs.  Could someone enlighten me?

DumpPreUserCmd is run once before the backup.
DumpPreShareCmd is run before each share.

In case it's not obvious, DumpPreUserCmd is run before the DumpPreShareCmd for
the first share.
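As a (purely hypothetical) configuration sketch - the script paths are made
up, while $sshPath, $host and $share are standard BackupPC substitution
variables:

  # Run once per backup, before anything on this host is dumped:
  $Conf{DumpPreUserCmd}  = '$sshPath -q -x -l root $host /usr/local/bin/pre-backup.sh';
  # Run once per share, just before that share is dumped:
  $Conf{DumpPreShareCmd} = '$sshPath -q -x -l root $host /usr/local/bin/pre-share.sh $share';

With two shares configured, the order is: DumpPreUserCmd, DumpPreShareCmd for
the first share, dump of the first share, DumpPreShareCmd for the second
share, dump of the second share.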

Regards,
Holger



Re: [BackupPC-users] free backup software

2008-08-04 Thread Chris Baker

I use Cobian on a lot of machines here. Most often, I have it back up to a
WinZip archive. In this case, my restore function is the WinZip program.

It doesn't seem to handle compression of large backups that well. For those,
I simply back up the files without compressing them. In this case, my
restore function is Windows Explorer.

The new folders are most likely WinZip archives. You can set the number of
backups you want kept in the software.

Cobian also has excellent support, considering it's free. You go to the
Cobian message board, and Cobian himself comes on there every couple of days. I
personally don't know how he finds the time for all this. The key fact is
that he does.

Give Cobian Backup a chance.

Chris Baker -- [EMAIL PROTECTED]
systems administrator
Intera Inc. -- 512-425-2006

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of drew64
Sent: Sunday, August 03, 2008 7:55 PM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] free backup software


I have been looking for some free backup software to back up my music and
photos to an external hard drive and maybe DVDs. I have tried Cobian but
don't know if I like it since it has no restore function. I have read
some things about WinBackup and was also told to try Comodo Backup. Does anyone
have any experience with these? Also, when I tried Cobian I set it to
incremental backup. It worked, but I see it makes a new folder every time a
backup is made. Is there a way, or a program, that will just back up and
replace the file that has changed into the same folder?






Re: [BackupPC-users] migrated backuppc server from linux-solaris Size/MB=0

2008-08-04 Thread Rob Owens
I had that problem once and I think it came down to a permissions issue.
I can't remember the exact fix, but you could try temporarily
changing everything to 777 permissions in order to test out my theory
(just make sure that you make a note of the existing permissions, so you
can set them back after testing).
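One way to record the existing permissions so they can be restored afterwards
(a sketch, assuming a Linux system with the ACL tools installed and the pool
in the usual /var/lib/backuppc; on OpenSolaris the equivalent tools differ):

  getfacl -R -p /var/lib/backuppc > /root/backuppc-perms.acl  # save owners/modes
  chmod -R 777 /var/lib/backuppc                              # temporary, for the test only
  # ... run a backup and check whether Size/MB is populated ...
  setfacl --restore=/root/backuppc-perms.acl                  # put everything back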

-Rob

Rob Terhaar wrote:
 Hi All,
 I recently migrated backuppc from linux/redhat to opensolaris.
 Everything seems to be working correctly, except the size/mb is blank
 in the compression summary (see the attached screenshot, after
 backup). The backups are running, I am able to restore files.
 
 Does anyone have an idea as to why this file size count is coming up empty?
 
 
 
 
 
 
 








Re: [BackupPC-users] DumpPreUserCmd vs DumpPreShareCmd

2008-08-04 Thread Rob Owens
Holger Parplies wrote:
 Hi,
 
 Rob Owens wrote on 2008-08-04 10:45:07 -0400 [[BackupPC-users] DumpPreUserCmd 
 vs DumpPreShareCmd]:
 I'm not sure what the difference is between these two options, and I 
 can't find it in the docs.  Could someone enlighten me?
 
 DumpPreUserCmd is run once before the backup.
 DumpPreShareCmd is run before each share.
 
 In case it's not obvious, DumpPreUserCmd is run before the DumpPreShareCmd for
 the first share.
 
Thanks very much, Holger.

-Rob








Re: [BackupPC-users] free backup software

2008-08-04 Thread Holger Parplies
Hi,

drew64 wrote on 2008-08-03 20:55:13 -0400 [[BackupPC-users]  free backup 
software]:
 I have been looking for some free backup software [...]
 I have tried Cobian [...] winbackup [...] comodo backup. Any one have any
 experience with these.

is that a question?

This is a technical mailing list on the usage of the BackupPC software.
Questions about other software are off-topic here. While questions like
"Is BackupPC suitable for the job?" or maybe even "Is BackupPC better
suited than Cobian/winbackup/comodo backup?" would be reasonable,

 Is there a way [for Cobian backup] or a program that will just back up and
 replace the file that has changed into the same folder?

is not.

As Nils already stated, BackupPC would be suitable for what you basically want
to do, but may be considered overkill. In particular, it does *not* give you
a simple copy of your file tree(s). You get a backup history (of configurable
extent) without the penalty of consuming space for identical copies of files.

Regards,
Holger



Re: [BackupPC-users] Fatal error (bad version): OpenSSH_5.0p1

2008-08-04 Thread brunal
Thanks so much for your help!

 Can you

   [EMAIL PROTECTED] rsync [EMAIL PROTECTED]:truc.txt tmp/

 without any problem, manual interaction or extraneous output?

Yes, it works perfectly. No particular output, no interaction needed.

[...]

 You aren't running rsyncd, so the server is rsync --server in  
 both cases.
 BackupPC uses File::RsyncP and starts the server over an ssh  
 connection (see
 your log excerpt above). rsync [EMAIL PROTECTED]:truc.txt tmp/  
 automatically runs
 rsync --server ... over ssh [EMAIL PROTECTED]. If you want to test the  
 command
 BackupPC runs, try, *as the backuppc user on the BackupPC server*,

   % rsync --numeric-ids --perms --owner --group -D --links --hard-links
 --times --block-size=2048 --recursive --ignore-times  
 [EMAIL PROTECTED]:/home/temp/ /tmp/anywhere-you-like

 Does that work as expected? Does it ask any questions? Does it  
 output anything
 strange? Anything at all?

thanks for this useful debugging command line!
May I suggest that this command be added somewhere in the
BackupPC wiki, in a "check what's wrong" section?

I've tried this command and I got an error about write permissions
[see below], so I changed the user from backuppc to root, did the ssh
key exchange for root, and finally it WORKS! I've changed the user in
the /etc/BackupPC/hosts file, replacing backuppc with root, started a
backup through the web interface, and a backup has started. Cool!

So here comes another question:
I get a write permission error coming from the BackupServer because
the user backuppc on the BackupServer has no write permission on a
backed-up folder, e.g.:
[EMAIL PROTECTED]  ls -la /home/
d---r-xr-x  5 bruno  bruno  4096 2007-10-09 14:07 wikiinterne

So when this folder was copied to the BackupServer, the backuppc user
had no write permission to create the subfolders contained in the
wikiinterne folder, which is why the backup failed. Of course, now that
the user on the BackupServer is root, it has all the rights it needs to
create everything.

I believe I will keep running BackupPC as the root user on both
sides, but is there a workaround? I don't want backuppc to
have write access to all the files on my MainServer...


Anyway, thank you so much for your help!
I will keep experimenting after my vacation, and I will fill in the wiki
with a how-to on installing BackupPC on a DNS-323 with rsync.

Best regards,
Bruno.



 brunal wrote on 2008-08-01 11:20:47 +0200 [Re: [BackupPC-users]  
 Fatal error (bad version): OpenSSH_5.0p1]:
 Can somebody explain to me why source and destination are inverted in
 the backuppc command, compared to the usual rsync usage?

 See rsync basics above. The command you are referring to starts  
 the server.
 The rsync server command line syntax is of little interest to us.  
 BackupPC
 knows which order the parameters have to be in, so we don't.

 I did the following on 192.168.1.2 :
 - stop all firewall and protection
 - allow root to connect by ssh

 Please verify that backuppc can 'ssh -l root' without password prompt.
 The rsync command I gave you above does that, but once more: you  
 need to run
 it as the backuppc user.

 I've tried this command in bash :
 /ffp/bin/ssh -q -x -l root 192.168.1.2 /usr/bin/rsync --server --
 sender --numeric-ids --perms --owner --group -D --links --hard-links
 --times --block-size=2048 --recursive --ignore-times . /home/racine/
 wikiinterne/

 Nothing happened and the command freezes. Why?

 See above.

 I tried this command in a bash (same as above, without --server --
 sender):
 /ffp/bin/ssh -q -x -l root 192.168.1.2 /usr/bin/rsync --verbose --
 numeric-ids --perms --owner --group -D --links --hard-links -- 
 times --
 block-size=2048 --recursive --ignore-times . /home/racine/wikiinterne

 It copies all the content of the /root/ folder to the /home/racine/
 wikiinterne folder; all of this happened on the client, nothing is
 transferred to the backup server.

 That's what it is supposed to do. Neither source (.) nor destination
 (/home/racine/wikiinterne) are remote, so it's a local copy you  
 requested.

 Hope that helps.

 Regards,
 Holger






Re: [BackupPC-users] free backup software

2008-08-04 Thread Kurt Tunkko
Hello,

I haven't tried Cobian (but thank you for mentioning it).

To make backups on a local client without setting up a server, and with
the possibility to browse the backup archives using a GUI, I have used Areca
Backup:

http://areca.sourceforge.net/

It's open source, has a nice built-in browser to look at your archives,
does compression and encryption, and offers other features like
email summaries, merging of archives, etc.

While not open source, you might also be interested in SyncBackSE:

http://www.2brightsparks.com/syncback/sbse.html

But as mentioned by someone else, this is somewhat off-topic.

If you want to enjoy this helpful community, I guess you 'have to' 
install BackupPC (maybe in a virtual machine?) :-)

I've tried several backup programs that had to be installed on
desktop PCs to make backups to an external hard drive, and it was always a
pain. Not only do you have lots of configuration, but it's much more
complicated to view the logs and get a picture of whether all clients have been
backed up, etc.

So _if_ you have more than one PC, you might want to check out BackupPC
- my first BackupPC server was a Pentium III with 800 MHz and 512 MB RAM.

Good Luck.

- Kurt


Chris Baker wrote:
 I use Cobian on a lot of machines here. Most often, I have it back up to a
 Winzip archive. In this case, my restore function is the WinZip program.
 
 It doesn't seem to handle compression of large backups that well. For those,
 I simply back up the files without compressing them. In this case, my
 restore function is Windows Explorer.
 
 The new folders are most likely Winzip archives. You can set the number of
 backups you want kept in the software.
 
 Cobian also has excellent support, considering it's free. You go to the
 Cobian message board, and Cobian himself comes on there every couple days. I
 personally don't know how he finds the time for all this. The key fact is
 that he does.
 
 Give Cobian Backup a chance.
 
 Chris Baker -- [EMAIL PROTECTED]
 systems administrator
 Intera Inc. -- 512-425-2006
 



Re: [BackupPC-users] One PC Very Slow to Backup

2008-08-04 Thread Kurt Tunkko
naigy wrote:

  Still doing some testing. Not sure of the cause, but one thing I do
  know is that I did an incremental and a full backup from the command
  line and both of these were done in the normal time of a bit over 2
  hours. Will see how it goes.

that's great, so you have a backup in case you mess everything up - but
it sounds strange that an incremental backup takes the same time as
the full backup.
Have you tried making some more incremental backups?

 For both incremental and full it is transferring approx 7.5GB of data
  over the network. I thought that it was meant to be a
  relatively conservative with data traffic after the
 initial backup.

As far as I understand, traffic should be lower for incremental backups,
since only changes from the last full backup should be transferred?!

  My other PC's including a linux file server dont exhibit this
  same behaviour (at least for the incrementals).

The PC you are talking about is Windows-based, right? Do you have other
Windows clients within your group of other PCs (where backups are
working as expected)? Then you could compare the configuration files.
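A quick way to do that, assuming per-host config files in their usual location
(the directory and host names here are just placeholders):

  diff /etc/backuppc/slow-pc.pl /etc/backuppc/good-pc.pl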

- Kurt



Re: [BackupPC-users] One PC Very Slow to Backup

2008-08-04 Thread Steve

 Kurt Tunkko [EMAIL PROTECTED] wrote: 
 naigy wrote:
 
   Still doing some testing. Not sure of the cause, but one thing I do
   know is that I did an incremental and a full backup from the command
   line and both of these were done in the normal time of a bit over 2
   hours. Will see how it goes.
  
  that's great, so you have a backup in case you mess everything up - but
  it sounds strange that an incremental backup takes the same time as
  the full backup.
  Have you tried making some more incremental backups?

I'm a BackupPC noob and I made the mistake of leaving the '+' after the
newer-than variable while I was doing a backup of a local machine. This confused
BackupPC (not surprisingly) and it interpreted the date as some time in 1901, so
my incremental backup was the same size and took the same time as the full
backup.
In the log file, check the 'newer-than' date on your incremental backup.
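A quick way to check it is to read the transfer log with BackupPC's own zcat
(a rough sketch; the install and data paths are typical packaged defaults, and
the host name and backup number are placeholders):

  /usr/share/backuppc/bin/BackupPC_zcat \
      /var/lib/backuppc/pc/myhost/XferLOG.12.z | grep -i newer

For a tar-based backup this shows the --newer=... date that was actually
passed to the client.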

From what I remember of this thread I don't think that this is your problem,
but you never know.

Steve.



[BackupPC-users] Just to make sure: How to move/copy /var/lib/backuppc to another place (RAID1)

2008-08-04 Thread Kurt Tunkko
Hello,

after getting so many interesting ideas on how to back up the
backuppc-server when I asked this question two weeks ago, I'm now close
to taking action.
The topic was:
"Prepare for the worst, Howto backup the backuppc-server"

To summarize:

I want to put /var/lib/backuppc on a RAID1 and add one (or two) external
disks to the RAID. Since such a disk can be removed from the RAID after the
content has been synced, I have a full copy of the backuppc data for
secure offline storage. Another idea I want to try afterwards is to add
a disk using AoE (ATA over Ethernet). Because the AoE disk is attached
to a virtual machine running on a client, I wouldn't even need physical
access to the server room when swapping the drive for offline storage.
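For the add/sync/detach cycle, the usual mdadm sequence would look roughly
like this (a sketch only; /dev/md0 and /dev/sdc1 are placeholders for the
array and the external disk):

  mdadm /dev/md0 --add /dev/sdc1              # attach the external disk (joins as a spare)
  mdadm --grow /dev/md0 --raid-devices=3      # make it an active mirror so it gets synced
  # watch /proc/mdstat until the resync has finished, then detach the copy:
  mdadm /dev/md0 --fail /dev/sdc1 --remove /dev/sdc1
  mdadm --grow /dev/md0 --raid-devices=2      # back to a two-disk mirror

The detached disk then carries a complete, mountable copy of the pool file
system for offline storage.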

Since I also have a dd disk image of the server stored on the RAID, I
should be able to rebuild my server (and then all clients) very easily
(in theory).

My RAID1 (/dev/md0) is now mounted at /mnt/md0, and as far as I
understand I need to move /var/lib/backuppc to the RAID.
Then, when finished, I would remount the RAID at /var/lib/backuppc.

Question

While looking for information about how to move the
/var/lib/backuppc directory, I found:

'change archive directory' on the backuppc wiki
http://backuppc.wiki.sourceforge.net/change+archive+directory

Option 1 suggests using:

cp -pR /var/lib/backuppc /mnt/md0

while Option 3 suggests moving the directory to another place.

In order to be safe in case something bad happens while transferring the
data to the RAID, I don't want to use 'move'.

Just to make sure that I don't do something stupid before copying tons
of GB of backup data, I would like some quick feedback regarding the
command in option 1. Will it do the job? Can I just remount my RAID at
/var/lib/backuppc afterwards and be sure that everything is working?

Thanks and kind regards

- Kurt




[BackupPC-users] Problems excluding files.

2008-08-04 Thread Steve Blackwell
The documentation for $Conf{TarClientCmd} says

...
Also, you will probably want to add ``/proc'' to
$Conf{BackupFilesExclude}.

The following variables are substituted at run-time:
...
$fileList       specific files to backup or exclude
...

I took this to mean that $fileList takes care of excluding the files in
$Conf{BackupFilesExclude}. 

I have the following in my host.pl file:

$Conf{BackupFilesExclude} = {
  'system' => [
    '/proc',
    '/sys',
    '/tmp'
  ],
  'backup' => [
    '/media'
  ]
};

but the log shows that it is still trying to back up /sys and /proc.

Contents of file /media/disk/pc/steve/XferLOG.6.z, modified 2008-08-04
01:08:44 (Extracting only Errors)

Running: /usr/bin/sudo /bin/tar -c -v -f - -C / --totals --newer=2008-07-29 17:06:39 .
incr backup started back to 2008-07-29 17:06:39 (backup #2) for directory /
Xfer PIDs are now 25863,25862
[ skipped 36677 lines ]
/bin/tar: ./sys/devices/platform/uevent: File shrank by 4096 bytes;
padding with zeros

Is there something wrong with my $Conf{BackupFilesExclude}, or do I need
to add an --exclude option to TarIncrArgs?

Thanks,
Steve



Re: [BackupPC-users] Problems excluding files.

2008-08-04 Thread Holger Parplies
Hi,

Steve Blackwell wrote on 2008-08-04 18:31:38 -0400 [[BackupPC-users] Problems 
excluding files.]:
 [...]
 I have the following in my host.pl file:
 
 $Conf{BackupFilesExclude} = {
   'system' => [
 '/proc',
 '/sys',
 '/tmp'
   ],
   'backup' => [
 '/media'
   ]
 };
 
 but the log show that it is still trying to backup /sys and /proc.
 
 Contents of file /media/disk/pc/steve/XferLOG.6.z, modified 2008-08-04
 01:08:44 (Extracting only Errors)
 
 Running: /usr/bin/sudo /bin/tar -c -v -f - -C / --totals
 --newer=2008-07-29 17:06:39 . incr backup started back to 2008-07-29
 17:06:39 (backup #2) for directory / Xfer PIDs are now 25863,25862
 [...]
 
 Id there something wrong with my $Conf{BackupFilesExclude}

yes.

 or do I need to add --exclude something to TarIncrArgs?

No.

You need to use the share name as the hash key. Since you're using XferMethod
tar, chances are your share is '/', so you'd have

  $Conf{BackupFilesExclude} = {
    '/' => [
      '/proc',
      '/sys',
      '/tmp'
    ],
    '/whatever/share/you/mean/by/backup' => [
      '/media'
    ]
  };

You *can* use '*' as the hash key if you want the excludes to apply to every share
not explicitly given an exclude list, but it seems better practice (and less
error prone) to explicitly list excludes by share like you did.

Regards,
Holger



[BackupPC-users] clustered file system and multiple servers

2008-08-04 Thread sabujp

Can a dev let me know if the files in the pool are FLOCK'd before writing, i.e. 
is there a chance that two servers backing up into the same top level data 
directory could mangle a file in the pool in this manner?






Re: [BackupPC-users] Problems excluding files.

2008-08-04 Thread Steve Blackwell
 Hi,

 Steve Blackwell wrote on 2008-08-04 18:31:38 -0400 [[BackupPC-users]
 Problems excluding files.]:
 [...]

 Is there something wrong with my $Conf{BackupFilesExclude}

 yes.

 or do I need to add --exclude something to TarIncrArgs?  

 No.

 You need to use the share name as hash key. Since you're using
 XferMethod tar, chances are your share is /, so you'd have

   $Conf{BackupFilesExclude} = {
 '/' => [
   '/proc',
   '/sys',
   '/tmp'
 ],
 '/whatever/share/you/mean/by/backup' => [
   '/media'
 ]
   };

 You *can* use '*' as hash key if you want the excludes to apply to
 every share not explicitly given an exclude list, but it seems better
 practise (and less error prone) to explicitly list excludes by share
 like you did.
 
 Regards,
 Holger

Aahh!!! Thanks Holger. 
I had not understood the use of the hash key. I thought it was just a
kind of group name for some excludes.
Now everything is working as expected.

Steve



Re: [BackupPC-users] Just to make sure: How to move/copy /var/lib/backuppc to another place (RAID1)

2008-08-04 Thread Holger Parplies
Hi,

Kurt Tunkko wrote on 2008-08-04 23:00:35 +0200 [[BackupPC-users] Just to make 
sure: How to move/copy /var/lib/backuppc to another place (RAID1)]:
 [...]
 I found: 'change archive directory' on the backuppc wiki
 http://backuppc.wiki.sourceforge.net/change+archive+directory
 
 Option 1 suggest to use:
 
   cp -pR /var/lib/backuppc /mnt/md0
 
 while Option 3 suggest to move the directory to another place.
 
 In order to be safe when something bad happens while transferring the 
 data to the RAID, I dont want to use 'move'.

['mv'? Really? Just suppose *that* gets interrupted part way through ...]

 Just to make sure that I don't do something stupid before copying tons 
 of GB of backup-data I would like have a short feedback regarding the 
 command in option 1. Will this do the job?

It is *guaranteed* not to. Whoever put that in the wiki either does not have
the slightest clue what he is writing about, or he is talking about an empty
pool (read: BackupPC was freshly installed and *no backups done*) and didn't
make that unambiguously clear.

Even the potentially correct 'cp -dpR ...' will not work in the general case.
The command from the wiki does *not* preserve hard links. Your pool will
explode to at least twice the size, and that's assuming every pooled file is
only used once (which would practically mean you've only got one backup). If
you've got the space, you *could* get away with it, because future backups
would be pooled, but for current backups, the benefits of pooling would be
forever lost.
The next run of BackupPC_nightly would empty the pool (so you might as well
not copy it in the first place), and the files would need to be re-compressed
during future backups.
So: while it is conceivable that someone might use this as a last resort, you
don't want to migrate your pool like this.

The version that *does* preserve hard links (cp's -d option) will work for
structures up to a certain limited size.

There seem to be people on the list who repeatedly insist that it worked for
them, so it will work for you (despite the thread already containing an
explanation to the contrary). Apparently it has even made it into the wiki.


On the other hand, there have also been countless reports of problems with
*any* file-based copying of the BackupPC pool using general-purpose tools -
cp, rsync, and tar spring to mind. They either run out of memory or take a very long time
(read: days to weeks, meaning they are usually aborted at some point; I'm not
sure if they would eventually finish). This is, basically, due to the fact
that you cannot create a hard link to an inode based on the inode number. You
need a path name, i.e. a name of the file to link to. For a few hundred files
with a link count of more than one, it's no problem to store the information
in memory (and that is what general-purpose tools are probably expecting). For
100 million files with more than one link, that obviously won't work any more.
Add to that the delays of chaotically seeking from one end of the pool disk to
the other (the kernel needs to look up the paths you're linking to, and
there's not much chance of finding anything in the cache ...), and you'll get
an idea of where the problem is. Lots of memory will help, preferably enough
to fit the pool into cache altogether ;-).


Your best options remain to either do a block level copy of the file system
(dd) or to start over. You can, of course, *try* cp/rsync/tar and hope your
pool is small enough (hint: count your pool files). I'm not saying it never
works, only to be aware of what you're facing. Remember, for cp you need
-d, for rsync -H and for dd a destination partition at least as big
as the source file system. I haven't heard reports of problems with file
system resizers and BackupPC pools, but I'd be cautious just the same.
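To make those options concrete, a minimal sketch of the two approaches (device
names and mount points are placeholders; do the block-level copy with the pool
file system unmounted or mounted read-only):

  # block-level copy of the whole pool file system onto the RAID device
  dd if=/dev/sdb1 of=/dev/md0 bs=1M conv=noerror,sync

  # file-based copy that at least preserves hard links; as described above,
  # this may need enormous amounts of memory and time on a large pool
  rsync -aH --numeric-ids /var/lib/backuppc/ /mnt/md0/

After a dd copy the destination can simply be mounted in place of the original
file system (and grown afterwards if the new device is larger).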

 Can I just remount my RAID to 
 /var/lib/backuppc afterwards and be sure that everything is working?

If anyone has good ideas how to *test* the result of copying a pool, I'd also
be interested (and please don't suggest 'diff -r'). I can imagine a lot of
things going wrong that BackupPC would *not* notice.

Regards,
Holger



[BackupPC-users] move a specific backup (share) from one pool to another pool

2008-08-04 Thread sabujp

Is it possible to move all of the backups and incrementals of a particular 
share (/home/user) from one data directory (pool) into another?

Let's say I've got 100 home directory shares that use up 16TB of data in one 
top level directory (pool). I cannot expand this volume beyond 16TB but I can 
get another 16TB volume. 

Is it possible to extract the backup data for a particular share from one top 
level data directory (pool) and move it to another top level data directory 
(pool) on another file system? If each of the home directories were equivalent 
in the amount of space used, then I could move 50 of the backed up shares to 
the other volume and then spread the original 16TB across two 16TB volumes 
(leaving 8TB on both volumes).






Re: [BackupPC-users] clustered file system and multiple servers

2008-08-04 Thread Holger Parplies
Hi,

sabujp wrote on 2008-08-04 23:18:41 -0400 [[BackupPC-users]  clustered file 
system and multiple servers]:
 
 Can a dev let me know if the files in the pool are FLOCK'd before writing,

use the force, read the source. grep -r flock backuppc-3.1.0 suggests that
flock is used, but not on pool files. Not surprising, considering BackupPC_link
and BackupPC_nightly (or two instances of BackupPC_link) may not run
concurrently (BackupPC_link is responsible for entering new files into the
pool).

Come to think of it, the reason for this restriction is of a different nature:
BackupPC_nightly sometimes needs to rename pool files (with a common BackupPC
hash, when one or more files out of the chain are deleted), while BackupPC_link
may insert a new file with the same BackupPC hash. You can't prevent the
resulting race condition with flock() - at least you don't effectively change
anything in the single-threaded case (you'd need a rather global lock).

 i.e. is there a chance that two servers backing up into the same top level
 data directory could mangle a file in the pool in this manner?

I don't think you'd have mixed file contents (or, effectively, a corrupt
compressed file), but there seems to be a chance of linking a file in a backup
to the wrong pool file (wrong contents altogether).

You could probably use flock() to prevent two instances of
BackupPC_link/BackupPC_nightly running simultaneously on different servers,
but there are more things you would want to think about (running
BackupPC_nightly on more than one server does not make much sense, even if
they don't run concurrently; limiting simultaneous backups over all servers;
ensuring BackupPC_dump is not run twice simultaneously for the same host ...).

In short: sharing a pool between servers is currently not supported.

Regards,
Holger
