[BackupPC-users] Choice of b/u drive

2009-01-06 Thread colinc

As a Mac user, I have no PCMCIA option.

The choice is FW400/800, USB2 or a NAS drive. Ethernet would be easiest, but 
it does not work well with Apple's otherwise brilliant 'rewindable' Time Machine 
software, and as I said, Apple seems to be losing interest in FW, so that 
rather leaves USB ...

Cheers, Colin

+--
|This was sent by co...@nehoc.co.uk via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--



--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Large Amounts of Data

2009-01-06 Thread Les Mikesell
Adam Goryachev wrote:

 We're a mid-sized university academic department and are fairly happy 
 with our current BackupPC setup.  We currently backup an 800 GB 
 fileserver (which has an iSCSI-attached drive), a few other sub-300 GB 
 fileservers, a bunch of XP desktops.  The 800 GB fileserver takes a long 
 time to back up...almost a full day, and I think this is normal for 
 BackupPC.  We'd like to use BackupPC to backup some of our heftier linux 
 servers -- moving into the multiple terabyte range (5-10 TB).  We're 
 considering a ZFS filesystem over gigabit for our backup target, but 
 obviously are concerned that backing up 5 TB of data would take a week.

 Is this where we should consider multiple BackupPC servers to break up 
 the backup time?  Should we move to a solution with less overhead (if 
 there is one)?  Thanks for any input or experiences.
 
 You haven't specified your backup method, but I'll make a couple of
 assumptions:
 
 * you are using rsync over SSH
 
 If your 800G file server is taking 24 hours per backup, there are
 probably some optimisations you can make. Firstly, check whether your
 backup server or file server is consuming all available memory and
 swapping during the backup. Adding some extra RAM can drastically
 reduce the backup time. The other issue is the size of the files on your
 fileserver: lots of small files will take longer than a few large files.

Also, incremental runs should be substantially faster than fulls, even 
when using rsync, where the amount of data transferred is similar.  If all 
the fulls are happening at the same time, force some to run on a different 
day to skew them.  You might push the larger fulls to Friday night if it 
is OK for them to run into the weekend.
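One way to skew them (a sketch only; the file name and numbers here are hypothetical, not from this thread) is to give the larger hosts a per-host config with a slightly different full period, plus a blackout window that keeps their runs out of weekday working hours:

```perl
# Hypothetical per-host overrides, e.g. in pc/bigserver.pl
$Conf{FullPeriod} = 6.97;   # roughly weekly fulls for this host
$Conf{IncrPeriod} = 0.97;   # daily incrementals

# Keep backups of this host out of weekday working hours, so its
# long full run lands on Friday night or the weekend:
$Conf{BlackoutPeriods} = [
    {
        hourBegin => 7.0,
        hourEnd   => 19.5,
        weekDays  => [1, 2, 3, 4, 5],
    },
];
```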

 Other than that, you can look at IO etc on both fileserver and backup
 server. Probably your bigger fileservers are faster, but your backup
 server is only so fast... Examine where the bottlenecks are, and then
 you can either remove/improve those bottlenecks, or else add additional
 backuppc servers (if the backup server is the bottleneck and removing it
 is too costly).

Some files are just not backup-friendly, like unix mailbox format or big 
databases.  You may need to handle them some other way and exclude them 
from the backuppc runs.
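In BackupPC such paths can be excluded per share with $Conf{BackupFilesExclude}; a minimal sketch, with hypothetical paths you would adjust to your own shares:

```perl
# Hypothetical excludes for backup-unfriendly data
$Conf{BackupFilesExclude} = {
    '/' => [
        '/var/spool/mail',   # unix mailbox files change on every delivery
        '/var/lib/mysql',    # dump the database separately instead
    ],
};
```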

Raid5 is generally a bad idea performance-wise unless you have 12 or 
more disks in a set. It is usually much more expensive to go above 
'commodity-size' servers than to add more of them, so you may be better 
off splitting the backups across different servers, especially if you can 
group similar targets for better pooling and so you know where to look 
when you need to restore.

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Choice of b/u drive

2009-01-06 Thread Les Mikesell
colinc wrote:
 As a Mac user, I have no PCMCIA option.
 
 The choice is FW400/800, USB2 or a NAS drive. Ethernet would be easiest, but 
 it does not work well with Apple's otherwise brilliant 'rewindable' Time Machine 
 software, and as I said, Apple seems to be losing interest in FW, so that 
 rather leaves USB ...
 

Some NAS drives do work with Time Machine.  How about one of these: 
http://gizmodo.com/5119452/hp-mediasmart-ex487-server-has-remote-mp3-streaming-mac-time-machine-compatibility
or Apple's combo drive and wireless router?

-- 
   Les Mikesell
lesmikes...@gmail.com




[BackupPC-users] gui exclude

2009-01-06 Thread cedric briner
hello and Happy new year,

I'm looking for a GUI where people can click through the hierarchical file 
system to mark which folders do not need to be saved. The application 
should also report the amount of data that is going to be saved. It would 
be even nicer if it could export this information in a format that rsync 
understands, and nicer still if it were a Java Web Start application, 
which would fit nicely with BackupPC.

So do you have any clues, hints, or ideas I could follow up on?

Thanks for your help

cEd


-- 

Cedric BRINER
Geneva - Switzerland



[BackupPC-users] Backuppc timing out

2009-01-06 Thread cantthinkofanickname

As I could not get smb working (probably my fault) I've installed rsyncd on my 
XP PC and have got it running. I have configured BackupPC to do a backup, 
following an online tutorial. When I run a backup I get a timeout error in the 
log file:

2009-01-06 14:28:12 full backup started for directory docs
2009-01-06 14:31:21 Got fatal error during xfer (inet connect: Connection timed 
out)
2009-01-06 14:31:26 Backup aborted (inet connect: Connection timed out)

Nothing is copied to the server.

Can anyone give me a clue as to how to resolve this?
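(A first thing to check with "inet connect: Connection timed out" on the rsyncd method is whether the rsyncd port is reachable from the server at all; XP's built-in firewall commonly blocks it. A small sketch, assuming the default rsyncd port 873 and a placeholder client address:)

```python
import socket

def check_port(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# rsyncd listens on TCP 873 by default; the address below is a placeholder.
# A False result usually means a firewall (e.g. XP's built-in one) or a
# non-running rsyncd on the client.
# check_port("192.168.1.50", 873)
```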

+--
|This was sent by forumsmail...@btinternet.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--





[BackupPC-users] net connect: Connection timed out

2009-01-06 Thread cantthinkofanickname

Did you resolve this? I have the same problem.

+--
|This was sent by forumsmail...@btinternet.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--





Re: [BackupPC-users] Large Amounts of Data

2009-01-06 Thread Pedro M. S. Oliveira
In one of the BackupPC installations I manage there are 6 real servers and 
about 15 virtual servers, with a combined storage size of about 10 TB, and 
it works great with BackupPC.
On the servers running VMware I do a full backup every 30 days with a 
weekly incremental backup. On the VMware hosts we retain 2 full backups, 
with daily incremental backups for 30 days, and the same applies to the 
real servers.
With pooling and compression we don't use more than 8 TB; the filesystem 
is reiserfs.
Usually backups run at night on weekdays and freely on weekends. 

What takes longest is the storage server, which has about 5 TB and takes 
more than a day to do a full backup. I also allow multiple backups to run 
at the same time (10); this makes full use of the gigabit cards, CPU and 
memory (it's a quad core with 4 GB of RAM). If you run just one backup at 
a time you will waste your CPU, as many of the files you back up are small 
and your hosts don't send enough data to saturate the BackupPC server 
hardware. With lots of backups running at the same time we see steady 
mdstat rates of 50-60 MB/s written to disk.
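The concurrency described above is controlled from config.pl; a sketch (the first value mirrors the "10 at a time" figure mentioned, the second is purely illustrative):

```perl
$Conf{MaxBackups}     = 10;  # simultaneous backups, as described above
$Conf{MaxUserBackups} = 4;   # extra slots for user-requested backups (illustrative)
```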

The BackupPC server has SATA drives with hardware RAID 5.
So I can say I'm really happy with BackupPC. Right now I have 4 major 
BackupPC servers running with multiple configurations, including backups 
over the internet (if you tweak the ssh options a bit you can compress 
data in transit). I also have some minor installations at my house, for 
my own data for instance. 
cheers
Pedro 

On Monday 05 January 2009 23:39:43 Christopher Derr wrote:
 We're a mid-sized university academic department and are fairly happy 
 with our current BackupPC setup.  We currently backup an 800 GB 
 fileserver (which has an iSCSI-attached drive), a few other sub-300 GB 
 fileservers, a bunch of XP desktops.  The 800 GB fileserver takes a long 
 time to back up...almost a full day, and I think this is normal for 
 BackupPC.  We'd like to use BackupPC to backup some of our heftier linux 
 servers -- moving into the multiple terabyte range (5-10 TB).  We're 
 considering a ZFS filesystem over gigabit for our backup target, but 
 obviously are concerned that backing up 5 TB of data would take a week.
 
 Is this where we should consider multiple BackupPC servers to break up 
 the backup time?  Should we move to a solution with less overhead (if 
 there is one)?  Thanks for any input or experiences.
 
 Chris
 

-- 
--
Pedro M. S. Oliveira
IT Consultant 
Email: pmsolive...@gmail.com  
URL:   http://pedro.linux-geex.com
Cellular: +351 96 5867227
--


Re: [BackupPC-users] gui exclude

2009-01-06 Thread Ryan Knapper
Maybe you're looking for Restore (http://restore-backup.com/), although
their site is down as I write this.  Their software looked interesting, but
I was unable to get it to work correctly.

On Tue, Jan 6, 2009 at 04:21, cedric briner w...@infomaniak.ch wrote:

 hello and Happy new year,

 I'm looking for a GUI where people can click through the hierarchical file
 system to mark which folders do not need to be saved. The application
 should also report the amount of data that is going to be saved. It would
 be even nicer if it could export this information in a format that rsync
 understands, and nicer still if it were a Java Web Start application,
 which would fit nicely with BackupPC.

 So do you have any clues, hints, or ideas I could follow up on?

 Thanks for your help

 cEd


 --

 Cedric BRINER
 Geneva - Switzerland






-- 
[EOM]
--
Check out the new SourceForge.net Marketplace.
It is the best place to buy or sell services for
just about anything Open Source.
http://p.sf.net/sfu/Xq1LFB


[BackupPC-users] Archive full backups?

2009-01-06 Thread Bernhard Schneck
Hi,

I've been using BackupPC (3.0.0) on Ubuntu (8.x) for a while
and am quite happy with it ... thanks a lot for the effort
to all BackupPC developers and contributors!

I've started to look at the Archive functions.

What I want to achieve is to do regular full and incremental
backups, while archiving the most recent full backups to
different media regularly (say once per week).

As I understand it, Archive will transfer the most recent backup
to the archive storage location ... even if this is an
incremental and the corresponding full backup has not been
archived (which makes the incremental more or less unusable
for disaster recovery purposes).

Is there a setting somewhere to make Archive pick the most recent
full backup instead?
Is a ``filled incremental'' the best/recommended/only way?
What are other people using?

The only idea I have currently is to script the archive
process to check for full backups on a daily basis, copy
them to a staging area, and archive from there once a
week ... this is possible, but lacks elegance :-)
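(A sketch of the first step of such a script: finding the most recent full backup for a host by reading BackupPC's per-host "backups" index file, which is tab-separated with the backup number in the first field and the type in the second. The example path is hypothetical; check the format against your own TopDir before relying on it.)

```python
from pathlib import Path

def latest_full(backups_file):
    """Return the number of the most recent full backup listed in a
    BackupPC per-host 'backups' index file, or None if there is none.
    Assumes tab-separated lines: number, type, start time, ..."""
    latest = None
    for line in Path(backups_file).read_text().splitlines():
        fields = line.split("\t")
        if len(fields) >= 2 and fields[1] == "full":
            latest = int(fields[0])
    return latest

# e.g. latest_full("/var/lib/backuppc/pc/myhost/backups")  # path hypothetical
```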

I've searched the archives (and also did some source code
reading), but haven't found a discussion of this situation.
(I may have used the wrong search terms, though, so please
point me in the right direction if I missed the obvious!)

Thanks,

\Bernhard.


--
Check out the new SourceForge.net Marketplace.
It is the best place to buy or sell services for
just about anything Open Source.
http://p.sf.net/sfu/Xq1LFB
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] automated backup of specific dirs to local hdd

2009-01-06 Thread jed
Hi All,

Is this app purely for backups across networks to servers, or is it also 
perfectly fine for local backups of specified dirs?
For starters I'm just wanting to regularly back up my Tbird/FF profiles 
to a separate hdd on the same Mac.
Can it deal with folders that contain data that's live and may be 
updating at the time of a backup? (hope that makes sense)

I was going to start fiddling with all this, but why reinvent the wheel?!?
http://www.google.com.au/search?hl=en&client=firefox-a&rls=org.mozilla%3Aen-GB%3Aofficial&q=bash+scripting+how-to&btnG=Search&meta=
http://www.google.com.au/search?hl=en&client=firefox-a&rls=org.mozilla%3Aen-GB%3Aofficial&q=crontab+os+x&btnG=Search&meta=
or this may have worked better in practice than the above?
http://rajeev.name/blog/2008/09/01/automated-osx-backups-with-launchd-and-rsync/
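(For the simple local-profile case, the rsync approach in those links boils down to copying the profile directory into a dated folder on the second disk. A stand-in sketch using Python's shutil instead of rsync; the paths in the comment are hypothetical:)

```python
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(src, dest_root):
    """Copy the directory src into a timestamped folder under dest_root.
    A simple stand-in for `rsync -a src dest` on a local disk."""
    dest = Path(dest_root) / datetime.now().strftime("%Y-%m-%d_%H%M%S")
    shutil.copytree(src, dest / Path(src).name)
    return dest

# e.g. snapshot("/Users/jed/Library/Thunderbird", "/Volumes/BackupHD")  # paths hypothetical
```

Note that copying a live profile while the application is running can capture inconsistent files, which is the caveat raised above; quitting Thunderbird/Firefox first is the safe route.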

Any advice greatly appreciated  :-)

Seasons well wishes,
Jed



[BackupPC-users] no files dumped for share c$

2009-01-06 Thread Gilles Guiot
Hello everybody! And happy new year :)

I have the following problem:

A backup server on Debian 2.6.26-1-686 (output of a uname -a command).
After the Debian ssh vulnerability, all the backups failed, on both the 
Debian clients and the Windows clients.
After renewing the ssh keys and exchanging them, the backups are working 
again for the Linux clients, but not for the Windows machines.
The first one is a Windows machine with SP4.
The message I get is always the same:
no files dumped for share c$
session request to .. failed (called name not present)
tree connect failed: NT_STATUS_ACCESS_DENIED

But the config for these hosts had not changed. When I tried to do a 
manual dump, here is what it gave me:

backu...@backup:/usr/share/backuppc/bin$ ./BackupPC_dump -v -f 192.168.0.100
cmdSystemOrEval: about to system /bin/echo -c 1 192.168.0.100
cmdSystemOrEval: finished: got output -c 1 192.168.0.100

cmdSystemOrEval: about to system /bin/echo -c 1 192.168.0.100
cmdSystemOrEval: finished: got output -c 1 192.168.0.100

CheckHostAlive: can't extract round-trip time (not fatal)
CheckHostAlive: returning 0
Running: /usr/bin/smbclient 192.168.0.100\\C\$ -U BLACKUP\\Administrateur\\ -E -N -d 1 -c tarmode\ full -Tc -
full backup started for share C$
started full dump, share=C$
Xfer PIDs are now 24737,24736
xferPids 24737,24736
cmdExecOrEval: about to exec /usr/bin/smbclient 192.168.0.100\\C\$ -U BLACKUP\\Administrateur\\ -E -N -d 1 -c tarmode\ full -Tc -
session request to 192.168.0.100 failed (Called name not present)
session request to 192 failed (Called name not present)
Anonymous login successful
Domain=[] OS=[Windows 5.0] Server=[Windows 2000 LAN Manager]
tree connect failed: NT_STATUS_ACCESS_DENIED
session request to 192.168.0.100 failed (Called name not present)
session request to 192 failed (Called name not present)
Anonymous login successful
Domain=[] OS=[Windows 5.0] Server=[Windows 2000 LAN Manager]
tree connect failed: NT_STATUS_ACCESS_DENIED
tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 sizeTotal
Got fatal error during xfer (No files dumped for share C$)
cmdSystemOrEval: about to system /bin/echo -c 1 192.168.0.100
cmdSystemOrEval: finished: got output -c 1 192.168.0.100

cmdSystemOrEval: about to system /bin/echo -c 1 192.168.0.100
cmdSystemOrEval: finished: got output -c 1 192.168.0.100

CheckHostAlive: can't extract round-trip time (not fatal)
CheckHostAlive: returning 0
Backup aborted (No files dumped for share C$)
Not saving this as a partial backup since it has fewer files than the prior one (got 0 and 0 files versus 0)
dump failed: No files dumped for share C$

I checked in config.pl, and the $Conf{SmbShareName} settings do correspond 
to the ID and password used by the admin account on this Windows machine.
I checked in topdir/pc/hostname too, but couldn't find any specific 
config file for this host...
So I'm at a loss. Is there somebody who could give me some tips/solutions?

Thanks a lot in advance.




-- 
Gilles Guiot
IT Operations Manager
Tel.: 01 53 23 02 20
gilles.gu...@saros.fr


--
Check out the new SourceForge.net Marketplace.
It is the best place to buy or sell services for
just about anything Open Source.
http://p.sf.net/sfu/Xq1LFB
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] I received the error No files dumped for share

2009-01-06 Thread Sean Wong
Hi,

I have tried to do a full backup of a Windows XP PC. The backup is
successful, although I get the error "No files dumped for share". What
is wrong?

Regards,
Sean




[BackupPC-users] configure.pl fails

2009-01-06 Thread Kiran Agrahara
I am trying to install BackupPC on Ubuntu Server Edition. I am running 
the configure command as
 sudo perl configure.pl

it fails with the error message

Making init.d scripts
can't chown 1000, 1000 init.d/gentoo-backuppc.conf at configure.pl line 1011.

Not sure where the permissions have to be changed. Can somebody help me fix
this?

Thanks,
Kiran


Re: [BackupPC-users] no files dumped for share c$

2009-01-06 Thread Craig Barratt
Gilles writes:

 tree connect failed: NT_STATUS_ACCESS_DENIED

David Kahn reports that this happens with recent versions
of smbclient.  Removing the -N option fixes it.

Can you confirm this fix works for you?

Craig

-- Forwarded message --
To:   backuppc-users@lists.sourceforge.net
From: davidekahn backuppc-fo...@backupcentral.com
Date: Thu, 13 Nov 2008 15:37:00 -0800
Subj: [BackupPC-users] tree connect failed: NT_STATUS_ACCESS_DENIED


It appears that they have changed the way smbclient works in version 3.2.3, 
and that is causing this problem.  A problem identical to yours was reported 
as a bug in Ubuntu 8.10 (Intrepid): 
https://bugs.launchpad.net/ubuntu/+source/backuppc/+bug/283652.  However, the 
actual source of the problem is smbclient, which is called by backuppc.  
Therefore, I reported a second bug, https://bugs.launchpad.net/bugs/297025, 
which will hopefully get the problem fixed.

The solution to your backup problem is to edit /etc/backuppc/config.pl on 
server #2, which is using backuppc version 3.1.0 and smbclient 3.2.3.  Do not 
make this modification to server #1, as it will break it.

There are three strings that you need to modify in config.pl:

$Conf{SmbClientFullCmd}
$Conf{SmbClientIncrCmd}
$Conf{SmbClientRestoreCmd}

which control the Samba backup and restore commands.  In all three strings, 
remove the -N flag.
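(As a sketch, the stock BackupPC 3.x full-backup command with only the -N flag removed; keep the rest of the string exactly as it appears in your own config.pl, since the default shown here is from memory:)

```perl
$Conf{SmbClientFullCmd} = '$smbClientPath \\\\$host\\$shareName'
    . ' $I_option -U $userName -E -d 1'
    . ' -c tarmode\\ full -Tc$X_option - $fileList';
# Make the same one-flag change to $Conf{SmbClientIncrCmd} and
# $Conf{SmbClientRestoreCmd}.
```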

My understanding is that the flag is no longer needed, because the login 
prompt is automatically suppressed by smbclient when backuppc passes the 
password through the PASSWD environment variable.  But for some unfathomable 
reason, when the -N flag is used, the password does not get passed to 
Windows' LAN Manager.

Good luck.

+--
|This was sent by david.k...@certiby.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--



Re: [BackupPC-users] I received the error No files dumped for share

2009-01-06 Thread Craig Barratt
Sean writes:

 I have tried to do a full backup of a Windows XP PC. The backup is
 successful. Although I get the error "No files dumped for share". What
 is wrong?

The backup isn't successful (since no files were dumped for one (or more)
shares).

Please look at the XferLOG.bad file (which should be quite short) and
if the answer isn't apparent, email the contents of the file (or at
least the first few lines) to this thread.  You should also explain
which XferMethod you are using and the corresponding Share and
Include/Exclude settings.

Craig



Re: [BackupPC-users] configure.pl fails

2009-01-06 Thread Craig Barratt
Kiran writes:

 I am trying to install BackupPC on Ubuntu Server Edition. I am running the 
 configure command as
 sudo perl configure.pl
 
 it fails with the error message
 
 Making init.d scripts
 can't chown 1000, 1000 init.d/gentoo-backuppc.conf at configure.pl line 1011.
 
 Not sure where the permissions have to be changed. Can somebody help me fix 
 this?

It is trying to change the ownership from root (who is running configure.pl)
to the BackupPC user.  The strange thing is that many chown() calls prior
to this were successful.

Does init.d/gentoo-backuppc.conf exist?  (The path is relative to the
unpacked release - ie: the place you ran configure.pl from.)  What
happens when you manually try to chown it, eg:

sudo chown 1000:1000 init.d/gentoo-backuppc.conf

Craig
