Re: [BackupPC-users] Anyone get rsyncd to work on Windows Server 2008

2010-03-31 Thread Koen Linders
On 31 March 2010 09:51, Sorin Srbu sorin.s...@orgfarm.uu.se wrote:
 Ok, so it’s just the rsyncd-part on the Win-server that’s interesting,
 correct? And you set up BPC to use rsyncd with this particular Win-client
 and point it to eg /cygdrive/d/home which is one of those pre-setup virtual
 directories they mention in the DeltaCopy-manual. Did I get that right?



 --

 /Sorin

Yes.

You create a DeltaCopy service, which is simply rsyncd running on
port 873 (the standard port). You connect from BackupPC using the rsyncd
transfer method. After you have created the DeltaCopy service, you create a
virtual directory, e.g. cDRIVE, and fill in the path and the credentials.
This cDRIVE is what you use in BackupPC as the RsyncShareName.

The easy part is that you can copy the DeltaCopy program folder to another
Windows client, register the service, and you're good. I did a few tests with
Cygwin and it seems there are a lot more steps to get it running (but you can
get encryption via SSH that way).
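For concreteness, a minimal sketch of the matching per-host BackupPC settings, assuming the module is called cDRIVE and the DeltaCopy user is backupuser (names and password are placeholders; the option names are the standard ones from config.pl):

  # per-host config (e.g. pc/<host>.pl)
  $Conf{XferMethod}     = 'rsyncd';
  $Conf{RsyncShareName} = 'cDRIVE';      # the DeltaCopy virtual directory
  $Conf{RsyncdUserName} = 'backupuser';
  $Conf{RsyncdPasswd}   = 'secret';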

Koen Linders



Re: [BackupPC-users] Uncompressed pool & compressed pool size with compresslevel 3

2010-02-05 Thread Koen Linders
 1) Shouldn't the uncompressed pool be empty?
 I guess it's safe to actually delete all data in the pool dir? But why
 does
 Nightly cleanup removes things on Etch server at night?
 
 2) And since the upgrade to 3.1. the 2 graphs for pools aren't showing
 (https://192.168.1.5/backuppc/index.cgi?image=4)
 (https://192.168.1.5/backuppc/index.cgi?image=52)

1) If compression has been enabled for longer than the time backups are kept,
is it safe to delete the contents of the uncompressed pool folder?
(47 GB on server1, 123 GB on server2)

2) Is anyone else having problems displaying the graphs?




[BackupPC-users] Uncompressed pool & compressed pool size with compresslevel 3

2010-02-04 Thread Koen Linders
Two BackupPC 3.1.0 servers (1 Debian Etch, 1 Debian Lenny).

The Lenny server was rsynced from the Etch one and now backs up the same hosts
from another building. Both have the same configs, except ServerHost and some
other server-specific settings. Compression has been on for at least a year on
the Debian Etch server, maybe since the start. The date on the pool dir says it
was created on 2007-10-12, and it is owned by backuppc on both servers.

FullKeepCnt: 2, 1, 1, 1
FullKeepCntMin: 2
CompressLevel: 3
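For reference, a sketch of the equivalent config.pl lines, using the standard option names:

  $Conf{FullKeepCnt}    = [2, 1, 1, 1];
  $Conf{FullKeepCntMin} = 2;
  $Conf{CompressLevel}  = 3;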

1) Shouldn't the uncompressed pool be empty?
I guess it's safe to actually delete all data in the pool dir? But why does
the nightly cleanup still remove things from it on the Etch server at night?

2) And since the upgrade to 3.1, the two pool graphs aren't showing:
(https://192.168.1.5/backuppc/index.cgi?image=4)
(https://192.168.1.5/backuppc/index.cgi?image=52)

Server 1 (Etch)
Uncompressed pool:
Pool is 47.51GB comprising 37569 files and 4369 directories (as of 4/2 04:01),
Pool hashing gives 4 repeated files with longest chain 1,
Nightly cleanup removed 814 files of size 3.52GB (around 4/2 04:01),
Compressed pool:
Pool is 480.81GB comprising 1104143 files and 4369 directories (as of 4/2 04:13),
Pool hashing gives 219 repeated files with longest chain 28,
Nightly cleanup removed 5072 files of size 7.91GB (around 4/2 04:13),

/data/backuppc/pool# ls -al
total 72
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 20:02 .
drwx------  7 backuppc backuppc 4096 2010-02-03 07:01 ..
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:12 0
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 1
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 2
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 3
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:12 4
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 5
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 6
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 7
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 8
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:09 9
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 a
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 b
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 c
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 d
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 e
drwxr-x--- 18 backuppc backuppc 4096 2007-10-12 21:13 f


Server 2 (Lenny)
Uncompressed pool:
Pool is 123.97GB comprising 130788 files and 4369 directories (as of 4/2 04:01),
Pool hashing gives 5 repeated files with longest chain 1,
Nightly cleanup removed 0 files of size 0.00GB (around 4/2 04:01),
Compressed pool:
Pool is 493.42GB comprising 1124974 files and 4369 directories (as of 4/2 04:03),
Pool hashing gives 218 repeated files with longest chain 28,
Nightly cleanup removed 2081 files of size 5.34GB (around 4/2 04:03),






Re: [BackupPC-users] Backing up a laptop via DHCP

2009-12-29 Thread Koen Linders
 My home network is basically a Windows workgroup and I was planning on
 accessing my laptops via smb. From my backuppc server I can see both laptops
 on the workgroup.



 Any help, guidance would be much appreciated.


The easiest approach, I guess, would be:

First: install Samba and make it the WINS server for your network (wins
support = yes); check section 7.3.3: http://oreilly.com/catalog/samba/chapter/book/ch07_03.html

(It's only a really short step from here to have Samba also be the
file server for your home network.)

Second: install dhcp3-server.
There are a lot of examples, but the essential part is handing out the WINS
server address with the DHCP lease. That way your Windows desktops will always
register their NetBIOS name with the Samba WINS server, and BackupPC can query
it with the nmblookup command as mentioned. As a side effect, network browsing
in Windows will also speed up.
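A minimal sketch of those two pieces, assuming Samba's smb.conf and ISC dhcpd.conf in their usual Debian locations and 192.168.1.5 as the Samba/WINS host (all of these are placeholders):

  # /etc/samba/smb.conf (global section): act as the WINS server
  [global]
      wins support = yes

  # /etc/dhcp3/dhcpd.conf: hand the WINS server out with every lease
  option netbios-name-servers 192.168.1.5;
  option netbios-node-type 8;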

If you don't want to install and configure this, you can always set the
WINS server manually in the advanced network settings in Windows.

Maybe it is possible to have BackupPC check with DNS instead of NetBIOS,
but that's still harder to configure in my opinion.

Greetings,
Koen Linders

--
This SF.Net email is sponsored by the Verizon Developer Community
Take advantage of Verizon's best-in-class app development support
A streamlined, 14 day to market process makes app distribution fast and easy
Join now and get one step closer to millions of Verizon customers
http://p.sf.net/sfu/verizon-dev2dev 
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] How to keep problem hosts reappearing in the daily mail?

2009-12-17 Thread Koen Linders
Checking the host summary, I noticed two laptops with no backups for at least
66 days.
The daily mail summary from 14/11/09 until now doesn't mention them.
Mail settings:
EMailNotifyMinDays 2.5
EMailNotifyOldBackupDays 7

Is there any way to have hosts with problems keep appearing in the mail?
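For reference, a sketch of those two settings as they would appear in config.pl (the option names are the standard BackupPC 3.x ones; the values are the ones above):

  $Conf{EMailNotifyMinDays}       = 2.5;  # minimum days between mails to the same user
  $Conf{EMailNotifyOldBackupDays} = 7;    # warn when the newest backup is older than this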




Re: [BackupPC-users] Excludes not working

2009-08-27 Thread Koen Linders
-----Original Message-----
From: Bernhard Ott [mailto:bernhard@gmx.net]
Sent: Thursday, 27 August 2009 11:31
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Excludes not working

Koen Linders wrote:
 Carl Wilhelm Soderstrom wrote:
 On 08/26 12:57 , Osburn, Michael wrote:
 I am trying to backup my backuppc server while excluding the backups
 directory. No matter what I put under excludes in the config, I still
 end up with the cpool and pc directories in my backups.
 You misunderstand the exclude syntax. Here's an example for a (SMB) share
 named 'c$':

 $Conf{BackupFilesExclude} = {
   'c$' => [
 '/RECYCLER', 
 '/winnt/tmp', 
 '/temp', 
 '/WUTemp', 
 '/WINDOWS', 
 '/Documents and Settings/*/Local Settings/Temporary Internet
 Files/', 
 '/Documents and Settings/*/Local Settings/history/', 
 '/Documents and Settings/*/Cookies/', 
 '/Documents and Settings/*/Favorites/', 
 '/Documents and Settings/*/IETldCache/', 
 '/Documents and Settings/*/IECompatCache/', 
 '/Documents and Settings/*/NetHood/', 
 '/Documents and Settings/*/PrivacIE/', 
 '/Documents and Settings/*/PrintHood/', 
 '/pagefile.sys', 
 '/hiberfil.sys',
 ]
};
  
 hmm ... I'm having some problems with smb-tar excludes, too (my first 
 smb client because of never ending VISTA 64-bit related issues with 
 DeltaCopy and/or cygwin-rsyncd) and I read on:
 http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Smb_exclude
 
 
1.  Backslashes (\) seem to be the only effective way to get smbclient
 to correctly exclude files.
2. Subfolders need to be followed by a \* to be correctly excluded.
3. Files off of the root of the share need to be prepended by an extra
 backslash to be correctly excluded.
4. Folders off of the root of the share need to be prepended by an
 extra backslash to be correctly excluded. 
 
 
 Maybe I should try it your way then ... ;-)
 
 Bernhard
 


 --
 
 I back up Windows Vista without a problem with DeltaCopy. I mainly followed
 one of the wiki pages to exclude the 'junction points'. There is more stuff
 than necessary in the one I list below, but I can't find the wiki pages
 (what happened there?)
 
 Ah, here is the link to the specific page for Vista
 www.cs.umd.edu/~cdunne/projs/backuppc_guide.html
 
 Also mind the {ClientCharset}.
 
 Greetings,
 Koen Linders
 
Thanks for the tips, Koen,
I followed all these instructions (at least I think I did ;-)), but my 
problems were not related to excludes, I simply got no connection to the 
shares:
@ERROR: chdir failed, so maybe a permission denied error due to ACL or 
similar.
Unfortunately I had no time to investigate any further (the workstation 
is down right now so I can't provide any logs).
Should I use CYGWIN=ntsec tty with DeltaCopy, too?

=> Nope. You only need to make sure, after registering the DeltaCopy service,
that it actually runs correctly.
I register the service as administrator (it gives an error, but it shows up in
Services). Then I go to services.msc and change it from This account
(administrator) to Local System account. If it then starts without an error,
it's OK.

Whatever virtual directory I defined didn't show up when I connected 
(rsync -av u...@host::), only the standard share Backup was displayed.
Are you running the 64bit version?

Bernhard

=> Next I added a directory to back up in DeltaCopy, with the name cDRIVE and
the path c:\
=> Use authentication: e.g. backupuser and a password
=> In the BackupPC config, under Xfer:
RsyncShareName: cDRIVE
RsyncdUserName: backupuser
RsyncdPasswd: password

That's it. I can copy the DeltaCopy folder after this configuration to any PC;
I only need to register the service again. You could always add multiple
backup directories in DeltaCopy if you need to.
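Under the hood the DeltaCopy service is just rsyncd reading an rsyncd.conf-style module definition, so the cDRIVE entry above corresponds roughly to this sketch (the file location and secrets path are assumptions; DeltaCopy manages all of this through its GUI):

  [cDRIVE]
      path = /cygdrive/c
      auth users = backupuser
      secrets file = /path/to/rsyncd.secrets   # a line of the form backupuser:password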

Koen






Re: [BackupPC-users] Excludes not working

2009-08-26 Thread Koen Linders
Carl Wilhelm Soderstrom wrote:
 On 08/26 12:57 , Osburn, Michael wrote:
 I am trying to backup my backuppc server while excluding the backups
 directory. No matter what I put under excludes in the config, I still
 end up with the cpool and pc directories in my backups.
 
 You misunderstand the exclude syntax. Here's an example for a (SMB) share
 named 'c$':
 
 $Conf{BackupFilesExclude} = {
   'c$' => [
 '/RECYCLER', 
 '/winnt/tmp', 
 '/temp', 
 '/WUTemp', 
 '/WINDOWS', 
 '/Documents and Settings/*/Local Settings/Temporary Internet
Files/', 
 '/Documents and Settings/*/Local Settings/history/', 
 '/Documents and Settings/*/Cookies/', 
 '/Documents and Settings/*/Favorites/', 
 '/Documents and Settings/*/IETldCache/', 
 '/Documents and Settings/*/IECompatCache/', 
 '/Documents and Settings/*/NetHood/', 
 '/Documents and Settings/*/PrivacIE/', 
 '/Documents and Settings/*/PrintHood/', 
 '/pagefile.sys', 
 '/hiberfil.sys',
 ]
};
  
hmm ... I'm having some problems with smb-tar excludes, too (my first 
smb client because of never ending VISTA 64-bit related issues with 
DeltaCopy and/or cygwin-rsyncd) and I read on:
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Smb_exclude


1.  Backslashes (\) seem to be the only effective way to get smbclient
to correctly exclude files.
2. Subfolders need to be followed by a \* to be correctly excluded.
3. Files off of the root of the share need to be prepended by an extra
backslash to be correctly excluded.
4. Folders off of the root of the share need to be prepended by an
extra backslash to be correctly excluded. 


Maybe I should try it your way then ... ;-)

Bernhard


--

I back up Windows Vista without a problem with DeltaCopy. I mainly followed
one of the wiki pages to exclude the 'junction points'. There is more stuff
than necessary in the one I list below, but I can't find the wiki pages
(what happened there?)

Ah, here is the link to the specific page for Vista
www.cs.umd.edu/~cdunne/projs/backuppc_guide.html

Also mind the {ClientCharset}.

Greetings,
Koen Linders


Excludes for rsyncd.

$Conf{XferMethod} = 'rsyncd';
$Conf{ClientCharset} = 'cp1252';
$Conf{BackupFilesExclude} = {
  '*' => [
'/Documents and Settings',
'/ProgramData/Application Data',
'/ProgramData/Desktop',
'/ProgramData/Documents',
'/ProgramData/Favorites',
'/ProgramData/Start Menu',
'/ProgramData/Templates',
'/Users/All Users',
'/Users/Users/Default User',
'/Users/Users/All Users/Application Data',
'/Users/Users/All Users/Desktop',
'/Users/All Users/Documents',
'/Users/All Users/Favorites',
'/Users/All Users/Start Menu',
'/Users/All Users/Templates',
'/Users/*/Application Data',
'/Users/*/Cookies',
'/Users/*/Local Settings',
'/Users/*/My Documents',
'/Users/*/NetHood',
'/Users/*/PrintHood',
'/Users/*/Recent',
'/Users/*/SendTo',
'/Users/*/Start Menu',
'/Users/*/Templates',
'/Users/*/AppData/Local/Application Data',
'/Users/*/AppData/Local/History',
'/Users/*/AppData/Local/Temporary Internet Files',
'/Users/*/Documents/Mijn Muziek',
'/Users/*/Documents/My Music',
'/Users/*/Documents/My Pictures',
'/Users/*/Documents/My Videos',
'/Users/*/AppData/Local/Microsoft/Windows/Temporary Internet Files',
'/Users/*/AppData/Local/Temp',
'/Users/*/NTUSER.DAT*',
'/Users/*/ntuser.dat*',
'/Users/*/AppData/Local/Microsoft/Windows/UsrClass.dat*',
'/Users/*/AppData/Local/Microsoft/Windows Defender/FileTracker',
'/Users/*/AppData/Local/Microsoft/Windows/Explorer/thumbcache_*.db',
'/Users/*/AppData/Local/Microsoft/Windows/WER',
'/Users/*/AppData/Local/Mozilla/Firefox/Profiles/*/Cache',
'/Users/*/AppData/Local/Mozilla/Firefox/Profiles/*/OfflineCache',
'/Users/*/AppData/Roaming/Microsoft/Windows/Cookies',
'/Users/*/AppData/Roaming/Microsoft/Windows/Recent',
'/ProgramData/Microsoft/Search',
'/ProgramData/Microsoft/Windows Defender',
'*.lock',
'Thumbs.db',
'IconCache.db',
'Cache*',
'cache*',
'/Program Files',
'/Windows',
'/$Recycle.Bin',
'/MSOCache',
'/System Volume Information',
'/Boot',
'/autoexec.bat',
'/bootmgr',
'/BOOTSECT.BAK',
'/config.sys',
'/config.sys',
'/hiberfil.sys',
'/pagefile.sys',
'/Program Files (x86)',
'/Users/*/Music',
'/DriveKey',
'/IDE',
'/PerfLogs',
'/ProgramData',
'/Program Files',
'/Program Files (x86)',
  ]
};








Re: [BackupPC-users] rsyncd on Vista 64-bit cygwin vs SUA

2009-08-17 Thread Koen Linders
I don't know what you mean by the SUA environment, but I use DeltaCopy on
Vista 64-bit via rsyncd.

http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp

Works without a problem at the moment. Easy to use, and you can copy the files
to other computers and easily register the service.

Greetings,
Koen Linders

-----Original Message-----
From: Bernhard Ott [mailto:bernhard@gmx.net]
Sent: Tuesday, 18 August 2009 0:21
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] rsyncd on Vista 64-bit cygwin vs SUA

Hi,
anyone successfully using the SUA environment for backing up a windows 
vista 64bit client via ssh-rsync or rsyncd?
I failed running cygwin on Vista Business 6.0 64-bit and considered 
giving MS a chance ...

Any comments very much appreciated,

thanks in advance,
Bernhard




Re: [BackupPC-users] vmware/virtuabox/etc.?

2009-04-09 Thread Koen Linders
I did a little test with my laptop.

VMware server 2.0 free download
Memory available for OS 384 MB
Debian etch
Freecom Hard Drive Pro 250 GB USB2 disk formatted ext3 bs 4096

dd if=/dev/zero of=/dev/sdb1 bs=4k count=300k
307200+0 records in
307200+0 records out
1258291200 bytes (1.3 GB) copied, 63.8275 seconds, 19.7 MB/s

dd of=/dev/null if=/dev/sdb1 bs=4k count=300k
307200+0 records in
307200+0 records out
1258291200 bytes (1.3 GB) copied, 70.719 seconds, 17.8 MB/s

dd if=/dev/zero of=/dev/sdb1 bs=64k count=100k
102400+0 records in
102400+0 records out
6710886400 bytes (6.7 GB) copied, 324.055 seconds, 20.5 MB/s

If you want another test, I can easily install something else on this Debian
VM, because it's only a test system.

Greetings,
Koen Linders


-----Original Message-----
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Wednesday, 8 April 2009 23:30
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] vmware/virtuabox/etc.?

Rob Owens wrote:
 Both 
 Virtualbox and the current vmware server/player claim to work with USB 
 2.0 but I haven't done any speed tests yet - in fact I haven't been able 
 to get virtualbox to see usb drives at all.

 I think USB only works in the closed source version of VirtualBox.

Yes, I think that's what I'm using.  It does see the USB, installs some 
sort of new driver on the windows side and lets me tell the guest to 
capture it, but then the status always shows as busy and the guest never 
sees it.

 If you decide to go the live CD route, you might want to try Ubuntu.
 Version 8.10 comes with a nice and easy USB stick installer.  Sidux has
 one, too.

I think Clonezilla has an Ubuntu-based version.  I'll try that again.  But
I'm still interested in a VM that has decent USB speed so I can plug
ext3 drives into my laptop and access them while still running Windows.

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Backup remote ftp server (internet site)

2009-04-09 Thread Koen Linders
You can't directly back up an FTP host.

You could:
1) use a pre-backup script to mount the FTP site using your favorite method,
and a post-backup script to unmount it (Dan answered this to me some time
ago), or

2) via cron, do a wget and back up the local folder with BackupPC.

(Dan: doing it by cron is a hack and will likely cause you to have failed
backups if a minor change is made. You should use the pre- and post-backup
commands to handle this. You could mirror the whole FTP site with wget or
something to a local directory, then back up that directory, but still you
should do it with a pre-backup script, not with cron.)
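A minimal sketch of option 1, assuming curlftpfs is installed and an empty mount point exists (host name, credentials and paths are placeholders):

  # per-host config.pl: mount the FTP site before the dump, unmount afterwards
  $Conf{DumpPreUserCmd}  = '/usr/bin/curlftpfs ftp://example.com /mnt/ftpmirror -o user=ftpuser:ftppass';
  $Conf{DumpPostUserCmd} = '/bin/fusermount -u /mnt/ftpmirror';
  # then back up /mnt/ftpmirror as a local path with the tar or rsync method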

Greetings,
Koen Linders

-----Original Message-----
From: Mirco Piccin [mailto:pic...@gmail.com]
Sent: Thursday, 9 April 2009 12:48
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Backup remote ftp server (internet site)

Hi all,
I've installed BackupPC 3.2.0.
It works very well!
I'm trying to do a remote FTP server backup.
The FTP server is an internet site; I have FTP access to manage all the site
files. My aim is to back up this site.

So I create a new host and set:
$Conf{ClientNameAlias} -> the full FTP address
$Conf{XferMethod} -> FTP
$Conf{FtpShareName} -> tried both nothing and /htdocs (it's a web site, and
this is its root)
$Conf{FtpUserName} -> FTP host username
$Conf{FtpPasswd} -> FTP host password

but if I try to run a full backup, it fails with this error:
Last error is xfer start failed: Can't open connection to : Invalid
argument.

Any suggestions/tips?
Thanks!

Regards
M




Re: [BackupPC-users] Backup remote ftp server (internet site)

2009-04-09 Thread Koen Linders
Didn't know it had changed.
Latest version I have installed is 3.1 here. 

Sorry about the wrong info.

Koen Linders


Koen Linders wrote on 2009-04-09 13:24:29 +0200 [Re: [BackupPC-users] Backup
remote ftp server (internet site)]:
 You can't directly backup an ftp host.

not true. Read the changelog for 3.2.0.

 Van: Mirco Piccin [mailto:pic...@gmail.com] 
 Verzonden: donderdag 9 april 2009 12:48
 Aan: backuppc-users@lists.sourceforge.net
 Onderwerp: [BackupPC-users] Backup remote ftp server (internet site)
 
 [...]
 i've installed BackupPC 3.2.0.
 [...]
 So I create a new host and set:
 $Conf{ClientNameAlias} -> the full FTP address
 $Conf{XferMethod} -> FTP
 $Conf{FtpShareName} -> tried both nothing and /htdocs (it's a web site, and
 this is its root)
 $Conf{FtpUserName} -> FTP host username
 $Conf{FtpPasswd} -> FTP host password

 but if I try to run a full backup, it fails with this error:
 Last error is xfer start failed: Can't open connection to : Invalid
 argument.

I doubt anyone has much experience with this feature as it is quite new (read:
as yet unreleased). I haven't had time to download, much less look at, 3.2.0,
but the error message looks like something failed to parse correctly, as if
the name of the host to connect to was empty. Maybe if you could at least post
the comments from the default config.pl file regarding XferMethod FTP (or send
them to me off-list), then I could try to guess what might be wrong (do you
set the target address via ClientNameAlias? What format is it supposed to be
in? What is the FtpShareName supposed to look like?). Does the XferLOG give
any further hints? Can you increase the XferLogLevel? Since not much is
happening, you might get something interesting from there ...

Regards,
Holger




Re: [BackupPC-users] Backuppc CGI Interface not working for me

2009-04-07 Thread Koen Linders
Your problem is not really clear, but this works for me.
I'm not sure what you need to do for Sid, but for Debian Etch with Apache2 I
did the following. I'm not an Apache expert.

Activate SSL modules for apache 2:
a2enmod ssl

Changed ports.conf: Listen 443

Install openssl 

Generate key in apache dir (Virtual hosts are configured to look here)

openssl req -new -x509 -days 3650 -nodes -out apache.pem -keyout apache.pem
chmod 600 apache.pem

I then changed the sites-available/default like this:

NameVirtualHost *:443

<VirtualHost *:443>
ServerAdmin webmas...@localhost
# DocumentRoot /var/www/
ServerName *
DirectoryIndex index.php
ErrorLog  /var/log/apache2/error.log
CustomLog /var/log/apache2/access.log combined
SSLEngine On
SSLCertificateFile /etc/apache2/apache.pem
SSLCertificateKeyFile /etc/apache2/apache.pem
<Location />
SSLRequireSSL On
SSLVerifyClient optional
SSLVerifyDepth 1
SSLOptions +StdEnvVars +StrictRequire
</Location>
</VirtualHost>

I hope this works. I log in via remote workstation
https://192.168.*.*/backuppc with backuppc user and password.
Check http://backuppc.wiki.sourceforge.net/ for additional info.

Koen Linders

-----Original Message-----
From: Laurin d'Volts [mailto:email.por...@gmail.com]
Sent: Tuesday, 7 April 2009 9:53
To: Holger Parplies
Cc: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Backuppc CGI Interface not working for me

On Sun, 2009-04-05 at 16:04 +0200, Holger Parplies wrote:
 - That file doesn't exist? Is there one with a different name in
   /etc/apache2/sites-enable that seems relevant? One in
   /etc/apache2/sites-available? Do the directories /etc/apache2,
   /etc/apache2/sites-enabled and /etc/apache2/sites-available exist?
 - Is apache really running? Hint: 'ps e -C apache2'.

I understand something could be broken with Sid on the computer I'm
using. However, the other computer I'm using also uses Sid. This makes
me think a couple of things:

1. BackupPC is not configured correctly
2. Some system configuration is not correct
3. Apache is not set up correctly
4. The system is broken
5. Perhaps it's an undiscussed bug.

I don't know much about backuppc.

I didn't know that understanding apache was a requirement for it.
I was hoping for the tutorials and documentation on backuppc to cover
more of these things in detail. Because I couldn't find a lot of
information, I came here. If things don't start working soon, I'll
figure something out myself. However, I think I'll post here for the
next while.

I turned off the firewall to test backuppc CGI. As a note, I keep seeing
my system talk about 127.0.1.1 as being a pingable IP. I find that
unusual. So, maybe that has some importance. I don't know what file
relates to that issue. I'm able to ping 127.0.0.1 and 127.0.1.1.
I tried using 127.0.1.1 in the browser: http://127.0.1.1/backuppc
and that didn't work.



seakit...@widerule:~$ cat /etc/apache2/sites-enabled/000-default
<VirtualHost *:80>
ServerAdmin webmas...@localhost

DocumentRoot /var/www
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
<Directory /var/www/>
Options Indexes FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>

ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory /usr/lib/cgi-bin>
AllowOverride None
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>

ErrorLog /var/log/apache2/error.log

# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel warn

CustomLog /var/log/apache2/access.log combined

Alias /doc/ /usr/share/doc/
<Directory /usr/share/doc/>
Options Indexes MultiViews FollowSymLinks
AllowOverride None
Order deny,allow
Deny from all
Allow from 127.0.0.0/255.0.0.0 ::1/128
</Directory>

</VirtualHost>
seakit...@widerule:~$ ps e -C apache2
  PID TTY  STAT   TIME COMMAND





Re: [BackupPC-users] cloning the pool

2009-03-19 Thread Koen Linders
If you want an idea of what isn't possible:

A year ago I tried copying a pool much smaller than my current one (see below)
to a USB disk, using a Xeon 2.8 GHz/1 MB with 2 GB DDR, and it ran out of
memory copying via rsync -H.

Somewhere in the mailing list archives there is more information.

Someone said an rsync of a 2-million-file pool worked perfectly for him with
2 GB of memory. Not for me.

Now I stop BackupPC at night and do: dd if=/dev/sda5 of=/dev/sdb1 bs=4K
It works perfectly. I managed to copy this pool back to another server with a
much bigger RAID1 array formatted ext3 with the same block size, and it works
without a problem afterwards, AFAIK.
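Spelled out, the nightly clone amounts to something like this sketch (the device names and the Debian init script path are assumptions for my setup):

  # stop BackupPC so the pool is quiescent, clone the partition, restart
  /etc/init.d/backuppc stop
  dd if=/dev/sda5 of=/dev/sdb1 bs=4k
  /etc/init.d/backuppc start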

Pool is 235.52GB comprising 718235 files and 4369 directories (as of 19/3
04:12), 
Pool hashing gives 121 repeated files with longest chain 11, 
Nightly cleanup removed 6736 files of size 5.27GB (around 19/3 04:12), 
Pool file system was recently at 61% (19/3 10:02), today's max is 61% (19/3
04:00) and yesterday's max was 61%.

Greetings,
Koen Linders

-----Original Message-----
From: stoffell [mailto:stoff...@gmail.com]
Sent: Wednesday, 18 March 2009 21:57
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] cloning the pool

 I want to clone the pool to a local disk attached via USB.
 I can't made it with a dd because the pool is on a raid volume
 that don't contain the pool only.

We're about to do exactly the same thing, to get ourselves a weekly off-site
copy. We will use 500 GB external disks to rsync -aH the complete BackupPC
directory to the disk. We will use LVM and an encrypted filesystem for
enhanced security.

We'll have to test it out because the wiki is not very clear about it:
rsync has different limitations than cp - don't ask me whether it's
better or worse. It's simply something different to try.

It might be nice to have some case studies / usage scenarios on
the backuppc wiki ?

I'll report our experiences after we tested it all out..

cheers
stoffell




Re: [BackupPC-users] vista laptops info

2009-02-01 Thread Koen Linders
I use DeltaCopy as the rsync client. It works without a problem for Vista.
Check the BackupPC wiki for the exclude list needed because of junction points.

You could always install the MS Outlook backup add-in, which prompts the user
to save their Outlook PST on closing.
It is named pfbackup:
(http://www.microsoft.com/downloads/details.aspx?FamilyId=8B081F3A-B7D0-4B16
-B8AF-5A6322F4FD01)

Koen

-----Original Message-----
From: Rob Owens [mailto:row...@ptd.net]
Sent: Sunday, 1 February 2009 16:14
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] vista laptops info

Could you back up the mail server instead of the clients?  I've used exmerge
(a free-of-charge MS utility) to break the exchange store up into individual
pst files -- then use BackupPC to back them up.  They will not get locked.

-Rob

On Sun, Feb 01, 2009 at 11:02:27AM +, Terry wrote:
 Hi everyone. I have a small workgroup of Vista laptops which come into
 the office and connect using wireless and DHCP.
 I am looking to back up their My Documents and their PST files, which tend
 to be around 2 to 3 GB.
 The one big snag is that their PST files are mostly going to be open, and even
 when they close Outlook the Outlook process may still be running due to
 add-ons they have, such as ACT etc.
 Any pointers or experiences welcome. Or am I going to be banging my head
 against a brick wall?

 The plus point is I have a FreeBSD server to use. I am in the middle of
 setting up BackupPC at home now for some testing.

 Thanks
 Terry
 




[BackupPC-users] Remote backups of a win2003 server keeps failing after a certain amount of time.

2009-01-07 Thread Koen Linders
Remote backups of a win2003 server keep failing after a certain amount of
time (almost every time about 20 h in, with 11 GB of data already done).

Backup method: rsyncd
Server client: rsync via DeltaCopy (a great piece of easy-to-configure
software, thanks to someone on this list; works perfectly for Win2K, WinXP,
Vista, Win2003)

Slow upload, +- 512 KB/s
25 GB of data on a separate partition (not the Windows one)

Another win2003 server with similar data (but less, 9 GB) hasn't got a problem.
The first full needed 24 h, +- 100 KB/s average.

=> Could it be that a specific file is too big? Or has too long a name?
=> Could anyone point me to what to search for?

Greetings,
Koen Linders

Contents of file /var/lib/backuppc/pc/80.201.242.118/LOG.012009, modified
2009-01-07 09:44:29 
2009-01-01 20:00:00 full backup started for directory dDRIVE
2009-01-02 16:23:54 Aborting backup up after signal ALRM
2009-01-02 16:23:55 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-02 16:23:55 Saved partial dump 0
2009-01-02 20:00:00 full backup started for directory dDRIVE
2009-01-03 16:24:01 Aborting backup up after signal ALRM
2009-01-03 16:24:02 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-03 16:24:04 Saved partial dump 0
2009-01-03 17:00:00 full backup started for directory dDRIVE
2009-01-04 13:23:27 Aborting backup up after signal ALRM
2009-01-04 13:23:28 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-04 13:23:28 Saved partial dump 0
2009-01-04 14:00:01 full backup started for directory dDRIVE
2009-01-05 10:26:18 Aborting backup up after signal ALRM
2009-01-05 10:26:20 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-05 10:26:23 Saved partial dump 0
2009-01-05 15:19:04 full backup started for directory dDRIVE
2009-01-06 11:51:22 Aborting backup up after signal ALRM
2009-01-06 11:51:26 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-06 11:51:29 Saved partial dump 0
2009-01-06 11:58:05 full backup started for directory dDRIVE
2009-01-07 09:44:26 Aborting backup up after signal ALRM
2009-01-07 09:44:28 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-07 09:44:29 Saved partial dump 0


Contents of file /var/lib/backuppc/pc/80.201.242.118/XferLOG.0.z, modified
2009-01-07 09:44:28 (Extracting only Errors) 
full backup started for directory dDRIVE
Connected to 80.201.242.118:873, remote version 30
Negotiated protocol version 28
Connected to module dDRIVE
Sending args: --server --sender --numeric-ids --perms --owner --group -D
--links --hard-links --times --block-size=2048 --recursive --ignore-times .
.
Sent exclude: Thumbs.db
Sent exclude: IconCache.db
Sent exclude: Cache
Sent exclude: cache
Sent exclude: /Documents and Settings/*/Local Settings/Temporary Internet
Files
Sent exclude: /Documents and Settings/*/Local Settings/Temp
Sent exclude: /Documents and Settings/*/NTUSER.DAT
Sent exclude: /Documents and Settings/*/ntuser.dat.LOG
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat.LOG
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/Cache
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/OfflineCache
Sent exclude: /Documents and Settings/*/Recent
Sent exclude: *.lock
Sent exclude: /WINDOWS
Sent exclude: /RECYCLER
Sent exclude: /MSOCache
Sent exclude: /System Volume Information
Sent exclude: /AUTOEXEC.BAT
Sent exclude: /BOOTSECT.BAK
Sent exclude: /CONFIG.SYS
Sent exclude: /hiberfil.sys
Sent exclude: /pagefile.sys
Sent exclude: /Program Files/F-Secure/common/policy.ipf
Sent exclude: NTUSER.DAT.LOG
Sent exclude: NTUSER.dat
Sent exclude: *.tmp
Sent exclude: /Profiles/VTB/*/NTUSER.DAT
Sent exclude: /Profiles/VTB/*/ntuser.dat.LOG
Sent exclude: /Profiles/VTB/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat
Sent exclude: /Profiles/VTB/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat.LOG
Sent exclude: /Profiles/VTB/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/Cache
Sent exclude: /Profiles/VTB/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/Cache
Sent exclude: /Profiles/VTB/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/OfflineCache
Sent exclude: /Profiles/VTB/*/Recent
Sent exclude: /Profiles/VTB/*/NetHood
Sent exclude: /Profiles/VTB/*/Onlangs geopend
Sent exclude: /Profiles/VTB/*/UserData
Sent exclude: /Public/Backup
Xfer PIDs are now 30058
[ skipped 25915 lines ]
Remote[2]: file has vanished: Shares/Audiologie/administratie HOTO en
FM/Facturatie/faktuuraanvraag FM en H.A/FM + hulpmiddelen/~$orbeeld
Veranneman facturatie FM en hulpmiddelen.doc (in dDRIVE)
[ skipped 83 lines ]
finish: removing in-process file Shares/Audiologie/backup/PC
MPI047/Backup.bkf
Child is aborting
Done: 22734 files, 11921915308 bytes
Got fatal error during xfer (aborted by signal=ALRM)

Re: [BackupPC-users] Remote backups of a win2003 server keeps failing after a certain amount of time.

2009-01-07 Thread Koen Linders
I forgot to add the Windows event log errors:

1) 
The description for Event ID ( 0 ) in Source ( rsyncd ) cannot be found. The
local computer may not have the necessary registry information or message
DLL files to display messages from a remote computer. You may be able to use
the /AUXSOURCE= flag to retrieve this description; see Help and Support for
details. The following information is part of the event: rsyncd: PID 644:
rsync: writefd_unbuffered failed to write 4092 bytes [sender]: Connection
reset by peer (104).

2)
The description for Event ID ( 0 ) in Source ( rsyncd ) cannot be found. The
local computer may not have the necessary registry information or message
DLL files to display messages from a remote computer. You may be able to use
the /AUXSOURCE= flag to retrieve this description; see Help and Support for
details. The following information is part of the event: rsyncd: PID 644:
rsync error: error in rsync protocol data stream (code 12) at
/home/lapo/packaging/rsync-3.0.4-1/src/rsync-3.0.4/io.c(1541)
[sender=3.0.4].

-----Original Message-----
From: Koen Linders [mailto:koen.lind...@koca.be]
Sent: 07 January 2009 10:17
To: BackupPC-users@lists.sourceforge.net
Subject: [BackupPC-users] Remote backups of a win2003 server keeps failing
after a certain amount of time.

Remote backups of a win2003 server keeps failing after a certain amount of
time (almost every time about 20h later / data 11 GB already done).

Backup method: Rsyncd
Server client: Rsync via Deltacopy (great piece of easy to configure
software thanks to someone on this list, works perfectly for win2K, winXP,
vista, win2003)

Slow upload +- 512 KB/s 
25 GB of data on a separate partition (not the windows one)

Another win2003 with similar data (but less 9g) hasn't got a problem. The
first full needed 24h, +- 100 KB/s average. 

=> Could it be that a specific file is too big? Or has too long a name?
=> Could anyone point me to what to search for?

Greetings,
Koen Linders

Contents of file /var/lib/backuppc/pc/80.201.242.118/LOG.012009, modified
2009-01-07 09:44:29 
2009-01-01 20:00:00 full backup started for directory dDRIVE
2009-01-02 16:23:54 Aborting backup up after signal ALRM
2009-01-02 16:23:55 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-02 16:23:55 Saved partial dump 0
2009-01-02 20:00:00 full backup started for directory dDRIVE
2009-01-03 16:24:01 Aborting backup up after signal ALRM
2009-01-03 16:24:02 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-03 16:24:04 Saved partial dump 0
2009-01-03 17:00:00 full backup started for directory dDRIVE
2009-01-04 13:23:27 Aborting backup up after signal ALRM
2009-01-04 13:23:28 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-04 13:23:28 Saved partial dump 0
2009-01-04 14:00:01 full backup started for directory dDRIVE
2009-01-05 10:26:18 Aborting backup up after signal ALRM
2009-01-05 10:26:20 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-05 10:26:23 Saved partial dump 0
2009-01-05 15:19:04 full backup started for directory dDRIVE
2009-01-06 11:51:22 Aborting backup up after signal ALRM
2009-01-06 11:51:26 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-06 11:51:29 Saved partial dump 0
2009-01-06 11:58:05 full backup started for directory dDRIVE
2009-01-07 09:44:26 Aborting backup up after signal ALRM
2009-01-07 09:44:28 Got fatal error during xfer (aborted by signal=ALRM)
2009-01-07 09:44:29 Saved partial dump 0


Contents of file /var/lib/backuppc/pc/80.201.242.118/XferLOG.0.z, modified
2009-01-07 09:44:28 (Extracting only Errors) 
full backup started for directory dDRIVE
Connected to 80.201.242.118:873, remote version 30
Negotiated protocol version 28
Connected to module dDRIVE
Sending args: --server --sender --numeric-ids --perms --owner --group -D
--links --hard-links --times --block-size=2048 --recursive --ignore-times .
.
Sent exclude: Thumbs.db
Sent exclude: IconCache.db
Sent exclude: Cache
Sent exclude: cache
Sent exclude: /Documents and Settings/*/Local Settings/Temporary Internet
Files
Sent exclude: /Documents and Settings/*/Local Settings/Temp
Sent exclude: /Documents and Settings/*/NTUSER.DAT
Sent exclude: /Documents and Settings/*/ntuser.dat.LOG
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Microsoft/Windows/UsrClass.dat.LOG
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/Cache
Sent exclude: /Documents and Settings/*/Local Settings/Application
Data/Mozilla/Firefox/Profiles/*/OfflineCache
Sent exclude: /Documents and Settings/*/Recent
Sent exclude: *.lock
Sent exclude: /WINDOWS
Sent exclude: /RECYCLER
Sent exclude: /MSOCache
Sent exclude: /System Volume Information
Sent exclude: /AUTOEXEC.BAT
Sent exclude: /BOOTSECT.BAK
Sent exclude: /CONFIG.SYS
Sent exclude: /hiberfil.sys
Sent exclude: /pagefile.sys
Sent exclude: /Program Files

Re: [BackupPC-users] Remote backups of a win2003 server keeps failing after a certain amount of time.

2009-01-07 Thread Koen Linders
Thanks for the reply.

I changed the value to 144000 for this client and will wait another day (or
two). 

Maybe it hangs on a specific file? I hope changing the value will do the trick.

Anyway, thx :)
Koen Linders

-----Original Message-----
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: 07 January 2009 12:35
To: Koen Linders
Cc: BackupPC-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Remote backups of a win2003 server keeps
failing after a certain amount of time.

Hi,

Koen Linders wrote on 2009-01-07 10:17:20 - [[BackupPC-users] Remote
backups of a win2003 server keeps failing after a certain amount of time.]:
 Remote backups of a win2003 server keeps failing after a certain amount of
 time (almost every time about 20h later / data 11 GB already done).
 
 Backup method: Rsyncd
 [...]
 Contents of file /var/lib/backuppc/pc/80.201.242.118/LOG.012009, modified
 2009-01-07 09:44:29 
 2009-01-01 20:00:00 full backup started for directory dDRIVE
 2009-01-02 16:23:54 Aborting backup up after signal ALRM
 2009-01-02 16:23:55 Got fatal error during xfer (aborted by signal=ALRM)
 2009-01-02 16:23:55 Saved partial dump 0

signal ALRM is always caused by BackupPC aborting the backup after
$Conf{ClientTimeout} has passed without BackupPC detecting any progress. In
the case of rsync(d), this needs to account for the complete backup due to the
way it is implemented (for tar type backups, I believe it is only the duration
of the longest file transfer). The default value is 72000 seconds == 20 hours.

Raise $Conf{ClientTimeout} to a value that will allow your backup to complete.
A too-high value has the drawback of not detecting stuck backups for that
amount of time. This is nothing to worry about for your first backup. Perhaps
just add a '0' and change the value back after the first backup has
successfully completed. Future rsync(d) backups should hopefully complete
significantly faster.
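For illustration, a sketch of the per-host override Koen applied (the file path follows a Debian-style layout and is an assumption; the option name and its 72000-second default are from config.pl):

  # e.g. /etc/backuppc/80.201.242.118.pl
  $Conf{ClientTimeout} = 144000;   # 40 hours, double the 72000-second default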

 2009-01-02 20:00:00 full backup started for directory dDRIVE
 2009-01-03 16:24:01 Aborting backup up after signal ALRM
 2009-01-03 16:24:02 Got fatal error during xfer (aborted by signal=ALRM)
 2009-01-03 16:24:04 Saved partial dump 0

I'm not sure why you repeatedly get a partial dump 0 though instead of the
transfer restarting from the point it was originally interrupted (using the
previous partial as reference). That would allow the backup to eventually
complete even without increasing ClientTimeout, but it does not seem to be
happening in your case.

Regards,
Holger




[BackupPC-users] How to backup remote ftp server?

2008-07-07 Thread Koen Linders
I don't know if it's possible.

Has anyone managed to use BackupPC to back up e.g. ftp://example.com, providing
a login and password?
I can't find anything in the documentation or wiki.

greetings,
Koen Linders




Re: [BackupPC-users] How to backup remote ftp server?

2008-07-07 Thread Koen Linders
On Mon, 07 Jul 2008 10:42:40 +0200, Nils Breunese (Lemonbit)  
[EMAIL PROTECTED] wrote:

 Koen Linders wrote:

 I don't know if its possible.

 Anyone managed to use backuppc to backup e.g. ftp://example.com
 providing
 a login and password?
 I can't find anything in the documentation or wiki.


  From docs: No client-side software is needed. On WinXX the standard
 smb protocol is used to extract backup data. On linux, unix or MacOSX
 clients, rsync or tar (over ssh/rsh/nfs) is used to extract backup
 data. Alternatively, rsync can also be used on WinXX (using cygwin),
 and Samba could be installed on the linux or unix client to provide
 smb shares). (http://backuppc.sourceforge.net/faq/BackupPC.html#overview
 )

 BackupPC does not work over FTP.

 Nils Breunese.



Thanks for the quick response. I can stop looking now.

Koen Linders




Re: [BackupPC-users] How to backup remote ftp server?

2008-07-07 Thread Koen Linders
I didn't think of mounting it or doing it via cron.
It's a perfect solution.

Thanks,
Koen Linders


On Mon, 07 Jul 2008 11:15:04 +0200, Joe Bordes [EMAIL PROTECTED] wrote:

 I like the idea of mounting the ftp to a local mount point. That is a
 good one:

 ftpfs

 Joe
 TSolucio

 El lun, 07-07-2008 a las 11:10 +0200, Joe Bordes escribió:
 Hi,

 Have you thought of adding a precopy process or cron to fetch the
 information to a local directory using mirdir or wget or some similar
 tool and then have BackupPC backup the local directory?

 Joe
 TSolucio

  On Mon, 2008-07-07 at 10:53 +0200, Koen Linders wrote:
  On Mon, 07 Jul 2008 10:42:40 +0200, Nils Breunese (Lemonbit)
  [EMAIL PROTECTED] wrote:
 
   Koen Linders wrote:
  
   I don't know if its possible.
  
   Anyone managed to use backuppc to backup e.g. ftp://example.com
   providing
   a login and password?
   I can't find anything in the documentation or wiki.
  
  
From docs: No client-side software is needed. On WinXX the  
 standard
   smb protocol is used to extract backup data. On linux, unix or  
 MacOSX
   clients, rsync or tar (over ssh/rsh/nfs) is used to extract backup
   data. Alternatively, rsync can also be used on WinXX (using cygwin),
   and Samba could be installed on the linux or unix client to provide
   smb shares).  
 (http://backuppc.sourceforge.net/faq/BackupPC.html#overview
   )
  
   BackupPC does not work over FTP.
  
   Nils Breunese.
  

  
 
  thanks for the quick response. i can stop looking now
 
  Koen Linders
 
 
   



-- 
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/


-
Sponsored by: SourceForge.net Community Choice Awards: VOTE NOW!
Studies have shown that voting for your favorite open source project,
along with a healthy diet, reduces your potential for chronic lameness
and boredom. Vote Now at http://www.sourceforge.net/community/cca08
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] SendMail Options...

2008-04-21 Thread Koen Linders
On Mon, 21 Apr 2008 08:52:45 +0200, Fabrice Blatch [EMAIL PROTECTED]  
wrote:

 Hello

 I must say that I'm very grateful to the creators of BackupPC... it
 is an awesome piece of software.

 All is going well with my installation, but I'm struggling with the
 email notifications...

 I'm using the GUI to configure BackupPC. The backups are going well,
 but I do not receive any emails from BackupPC.
 In the documentation, it says:

 To verify that it can run sendmail and deliver email correctly you
 should ask it to send a test email to you:

  su __BACKUPPCUSER__
  __INSTALLDIR__/bin/BackupPC_sendEmail -u [EMAIL PROTECTED]

 I did that and I did receive an email from BackupPC, but I still do
 not get any notifications...

 Any suggestions?

 thank you


BackupPC must be running at the first hour in the wakeup schedule, or else
the nightly cleanup will not start and no mail alerts will be sent.

So check the schedule, and check via the web interface when the last
cleanup ran.
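
If it helps, the setting involved is $Conf{WakeupSchedule}; a sketch of a
typical value (the hours are just an example):

  # BackupPC_nightly (and the mail run) is started at the *first* hour in
  # this list, so BackupPC must be running at that time
  $Conf{WakeupSchedule} = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
                           13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23];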

I hope this helps.

Koen Linders


-
This SF.net email is sponsored by the 2008 JavaOne(SM) Conference 
Don't miss this year's exciting event. There's still time to save $100. 
Use priority code J8TL2D2. 
http://ad.doubleclick.net/clk;198757673;13503038;p?http://java.sun.com/javaone
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC restore???

2008-04-17 Thread Koen Linders
On Thu, 17 Apr 2008 12:53:15 +0200, Joseph Holland  
[EMAIL PROTECTED] wrote:

 Ok, we have BackupPC version 3 installed on a few servers throughout our
 company.  Recently one of the servers went down.  We have been using
rsync to back up the BackupPC data directory onto a USB drive.  I'm
just wondering: is it possible to restore this data over a fresh install
of BackupPC on another server?  I know that I will probably need the
/etc/BackupPC and the /usr/local/share/BackupPC directories to do this
and have copied them over my fresh install of BackupPC, but it will not
see the old backups.  It can, however, see all the hosts that I was
backing up, but says that they have never been backed up by this new
install (which is true).

 Can anyone help me???

 Thanks,


 JoeH.




Did you rsync with the hardlinks option?

You can always try to mount the USB disk at the appropriate folder. That  
should work (at least it did for me, when doing some testing).

Greetings,
Koen Linders


-
This SF.net email is sponsored by the 2008 JavaOne(SM) Conference 
Don't miss this year's exciting event. There's still time to save $100. 
Use priority code J8TL2D2. 
http://ad.doubleclick.net/clk;198757673;13503038;p?http://java.sun.com/javaone
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC restore???

2008-04-17 Thread Koen Linders
On Thu, 17 Apr 2008 13:27:00 +0200, Koen Linders [EMAIL PROTECTED] wrote:

 On Thu, 17 Apr 2008 12:53:15 +0200, Joseph Holland
 [EMAIL PROTECTED] wrote:

 Ok, we have BackupPC version 3 installed on a few servers throughout our
 company.  Recently one of the servers went down.  We have been using
 rsync to back up the BackupPC data directory onto a USB drive.  I'm
 just wondering: is it possible to restore this data over a fresh install
 of BackupPC on another server?  I know that I will probably need the
 /etc/BackupPC and the /usr/local/share/BackupPC directories to do this
 and have copied them over my fresh install of BackupPC, but it will not
 see the old backups.  It can, however, see all the hosts that I was
 backing up, but says that they have never been backed up by this new
 install (which is true).

 Can anyone help me???

 Thanks,


 JoeH.




 Did you rsync with the hardlinks option?

 You can always try to mount the USB disk at the appropriate folder. That
 should work (at least it did for me, when doing some testing).

 Greetings,
 Koen Linders


Ok,

I replied a bit too quickly:
Did you rsync to the USB disk with the hard-links option?
Did you rsync/copy back to the local disk with hard links?

Does mounting the USB drive at the BackupPC data folder show your backups?
If not, it could be a config thing.
Otherwise there might be something wrong with the way you copied the data
from (or initially to) the USB disk.
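
For what it's worth, a minimal sketch of what I mean (device names and mount
points are only examples):

  # USB disk mounted at /mnt/usb; copy the data directory into the root of
  # that partition, preserving hard links (-H)
  rsync -aH /var/lib/backuppc/ /mnt/usb/

  # on the new server: either rsync -aH it back, or simply mount the USB
  # partition at the data directory and let BackupPC use it directly
  mount /dev/sdc1 /var/lib/backuppc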

Koen Linders


-
This SF.net email is sponsored by the 2008 JavaOne(SM) Conference 
Don't miss this year's exciting event. There's still time to save $100. 
Use priority code J8TL2D2. 
http://ad.doubleclick.net/clk;198757673;13503038;p?http://java.sun.com/javaone
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] BackupPC_nighlty fails to start

2008-03-19 Thread Koen Linders
Ok. Another shorter post about this problem (more info 7/03 -  
BackupPC_nightly not running)

BackupPC_nightly doesn't run anymore. BackupPC also doesn't send the mail  
alerts, since the mailing seems to start after a successful nightly cleanup.

The only thing I can think of that might disrupt this is a cron.daily job that  
starts at 9:30 pm:
it stops BackupPC, unmounts the partition and does a dd to a separate USB  
disk. Around 2:30 am it's done; the partition is mounted and BackupPC started again.

BackupPC_nightly has run in this configuration without problems for a few  
months (with dd stopping at the same hour). And running it manually:
su backuppc
./BackupPC_serverMesg BackupPC_nightly run
works perfectly (and sends a mail when finished).

I know I could make a cron.weekly job or something to start it manually (see  
the sketch below), but I'd like to find the problem that causes this...
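
(The workaround I mean would look roughly like this, assuming the Debian
package paths:)

  #!/bin/sh
  # /etc/cron.weekly/backuppc-nightly - hypothetical workaround: ask the
  # running BackupPC server to queue a nightly cleanup
  su backuppc -c \
    "/usr/share/backuppc/bin/BackupPC_serverMesg BackupPC_nightly run"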

Any ideas? Things I could check?

Thanks.
Koen Linders


-
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2008.
http://clk.atdmt.com/MRT/go/vse012070mrt/direct/01/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] BackupPC_nightly not running

2008-03-07 Thread Koen Linders
First: I'm no Linux pro. I started working with my first Linux box about 14  
months ago, and I'm still learning a lot (so many possible things I want to  
try, like backing up remote sites!).
Second: BackupPC is great.
---

Main problem: BackupPC_nightly not running

BackupPC is running on a Samba PDC with DHCP / WINS / DDNS (Debian Etch). I  
will be migrating BackupPC to another server as soon as possible, not  
completely sure how, but that's another story for later.

There has been a problem with BackupPC_nightly not running for some time now  
(and the server running out of disk space).
And no mails since 26/1 either. Something must have happened/changed on one  
of those days.
I can't recall what I was doing then. Maybe setting up the DDNS...

BackupPC is running as the backuppc user.

Every day at 22:00 I make a backup of the BackupPC partition to an external  
disk. Around 3 a.m. I receive an e-mail with a summary.

/etc/cron.daily/dd_disk:
Stopping backuppc: ok.
Stopping Samba daemons: nmbd smbd.
umount: /dev/sdb1: not mounted
110742061+1 records in
110742061+1 records out
453599483904 bytes (454 GB) copied, 17776.4 seconds, 25.5 MB/s

real296m16.430s
user0m37.190s
sys 41m33.888s
Starting backuppc: ok.
Starting Samba daemons: nmbd smbd.


It seems that 28/1 was the last time it ran.

This morning I started the process with:
su backuppc
./BackupPC_serverMesg BackupPC_nightly run
and it finished after 4 hours.

(While typing this, see below about emails: now I did get a mail.)  
BackupPC_nightly starts the mailing, it seems.

450 GB HD (RAID 5); I had 66 GB free and I'm now up to 180 GB :)

One of the reasons it's cleaning up that much, I guess, is that I have been  
changing the number of full backups to keep over the last 1.5 months, since  
disk space was slowly running out.

Right now I have 38 WinXP hosts, for which I do full backups keeping copies 1,  
2 and 4 weeks old, plus 7 incrementals (FullKeepCnt 2 1).
The pool is steady at 130 GB. Profiles are normally between 1-3 GB.

Anything I can change to have BackupPC_nightly running again  
automatically? I didn't change any of the basic settings there:

MaxBackups  2   
MaxUserBackups  4   
MaxPendingCmds  10
MaxBackupPCNightlyJobs   2  
BackupPCNightlyPeriod   1

Mails sent: 26/1 last mail.

I tested the mailing system:
su backuppc
/usr/share/backuppc/bin/BackupPC_sendEmail -u [EMAIL PROTECTED]
No problems there

BackupPC settings: EMailNotifyOldBackupDays 7

Though I have a few hosts with a full age older than that:

Host: mpi024   User: [EMAIL PROTECTED]
  #Full 3, Full Age 32.0 days, Full Size 5.92 GB, Speed 2.30 MB/s,
  #Incr 2, Incr Age 16.1 days, Last Backup 16.1 days ago,
  State: idle, Last attempt: no ping (host not found)

Host: mpil21   User: [EMAIL PROTECTED]
  #Full 4, Full Age 17.1 days, Full Size 5.33 GB, Speed 2.73 MB/s,
  #Incr 1, Incr Age 115.8 days, Last Backup 17.1 days ago,
  State: idle, Last attempt: no ping (host not found)

-
Daily backup to external disk via cron.daily:

# stop BackupPC and Samba so the data partition can be unmounted
invoke-rc.d backuppc stop
invoke-rc.d samba stop

umount /dev/sda5
umount /dev/sdb1

# raw block-level copy of the data partition to the external disk
time dd if=/dev/sda5 of=/dev/sdb1 bs=4K

# remount everything from /etc/fstab and restart the services
mount -a

invoke-rc.d backuppc start
invoke-rc.d samba start
---

The servers PID is 5774, on host newton, version 3.0.0, started at 7/3  
02:56.
This status was generated at 7/3 08:49.
The configuration was last loaded at 7/3 02:56.
PCs will be next queued at 7/3 09:00.
Other info:
0 pending backup requests from last scheduled wakeup,
0 pending user backup requests,
0 pending command requests,
Pool is 118.49GB comprising 946267 files and 4369 directories (as of 28/1  
01:16),
Pool hashing gives 470 repeated files with longest chain 28,
Nightly cleanup removed 673 files of size 0.37GB (around 28/1 01:16),
Pool file system was recently at 84% (7/3 08:46), today's max is 84% (6/3  
13:50) and yesterday's max was 52%.
---


Thanks for reading.

Greetings,
Koen Linders


-
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2008.
http://clk.atdmt.com/MRT/go/vse012070mrt/direct/01/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] trying to understand Conf{FullKeepCnt}

2008-01-30 Thread Koen Linders
There is a bit more info here:
http://backuppc.wiki.sourceforge.net/keep_yearly_backups 

The documentation also explains another bit.
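
To sketch my reading of that doc example (double-check it against the
FullKeepCnt section, this is from memory):

  # assuming $Conf{FullPeriod} of roughly 7 days (weekly fulls)
  $Conf{FullKeepCnt} = [4, 2, 3];
  #   4 fulls kept at 1 x FullPeriod spacing  (~ 4 weeks)
  # + 2 fulls kept at 2 x FullPeriod spacing  (~ 4 weeks)
  # + 3 fulls kept at 4 x FullPeriod spacing  (~12 weeks)
  # = 9 fulls covering roughly 20 weeks; nightly incrementals between
  #   fulls are still made and expired via IncrPeriod/IncrKeepCnt.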

Koen Linders

-Original message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On behalf of B. Cook
Sent: Wednesday, 23 January 2008 13:38
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] trying to understand Conf{FullKeepCnt}

Last night BackupPC was a hero.

We had a Dell SATA RAID card drop an array, just 'cause :P

Anyway, it was an 11pm night last night, restoring from BackupPC (which  
went very nicely) after ssh keys were restored and such ;)

Anyway..

I noticed that the backups only went back two weeks, and in reading the  
manual on how to increase the retention time I quickly became confused..

   $Conf{FullKeepCnt} = [4, 2, 3];

By the explanation, this looks like it would be 16 weeks of  
backups, or 4 months.. would doubling that give 8 months, and  
tripling it a year?

I am assuming it will still do nightly incrementals?

I am not sure if it is confusing or I am just tired.. but either way I  
would like to get an answer ;)

Thanks in advance,

-
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2008.
http://clk.atdmt.com/MRT/go/vse012070mrt/direct/01/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


-
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2008.
http://clk.atdmt.com/MRT/go/vse012070mrt/direct/01/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] BackupPC_tarPCCopy hard link error

2007-10-18 Thread Koen Linders
Hello,

system info:
BackupPC v3 (latest stable via deb http://www.backports.org/debian  
etch-backports main contrib non-free)
Debian Etch
Tar v. 1.60

I'm trying to back up my cpool and pc directories to an external disk attached  
to the same computer. Both are ext3.

1) Rsync -avHh for cpool works.

Btw, I'm pretty sure this is the correct way to back up the pool. Can  
someone confirm this? Or reply with how you back up the BackupPC data?

I want a backup that I could soft-link the BackupPC data folder to, so  
everything keeps working in case of an emergency. I can't test it yet  
since BackupPC_tarPCCopy gives errors.

2) I can't find much info on the internet about the BackupPC_tarPCCopy  
errors. At least I hope I didn't miss a very obvious post/FAQ somewhere...

As the backuppc user, using (e.g. for only one computer; 50+ computers are  
being fully backed up):

BackupPC_tarPCCopy /var/lib/backuppc/pc/mpi021 | (cd /backups/backuppc/pc/mpi021 && tar xPf -)

gives something like this:

tar: ./mpi021/7/fC$/fDocuments and Settings/fAdministrator/fMijn  
documenten/fHPLJ24x0/ftoolbox/attrib: Cannot hard link to  
`../cpool/d/b/1/db10c689101ef0b373742b62b7b6f532': No such file or  
directory

backuppc has write rights as owner for the target directory:
drwxr-x--- 11 backuppc backuppc  4096 2007-10-18 14:31 mpi021

/var/lib/backuppc is soft-linked to another directory, but if I try that path I  
get:

Argument /data/backuppc/pc/mpi021 must be an absolute path starting with  
/var/lib/backuppc

I'm not really sure what goes wrong. I found some info about the leading  
/ with cpool, but that part seems OK; hence the use of P for the tar  
command.
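
In case it helps someone searching the archives, the sequence the BackupPC
docs describe is roughly this (paths are examples and assume the Debian
package layout; the pool must already exist on the destination, copied with
hard links intact, or the ../cpool links cannot be created):

  # 1) copy the pool first, preserving hard links
  rsync -aH /var/lib/backuppc/cpool/ /backups/backuppc/cpool/

  # 2) then, from inside the *destination* pc directory, recreate the
  #    pc trees with BackupPC_tarPCCopy
  cd /backups/backuppc/pc
  /usr/share/backuppc/bin/BackupPC_tarPCCopy /var/lib/backuppc/pc | tar xvPf -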


Btw, this is a really great program.

Greetings,

Koen Linders


-
This SF.net email is sponsored by: Splunk Inc.
Still grepping through log files to find problems?  Stop.
Now Search log events and configuration files using AJAX and a browser.
Download your FREE copy of Splunk now  http://get.splunk.com/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC_tarPCCopy hard link error

2007-10-18 Thread Koen Linders

The pool is about 108 GB. I already saw the system go down to a very, very slow  
pace when trying to rsync everything.

The BackupPC folder is on the same partition as my Samba shares, so I'd  
rather not unmount it every night for backups. It depends on how long it  
takes to back up 250-350 GB with dd.


On Thu, 18 Oct 2007 16:08:45 +0200, Carl Wilhelm Soderstrom  
[EMAIL PROTECTED] wrote:

 On 10/18 03:47 , Koen Linders wrote:
 I'm trying to backup my cpool and pc's to an external disk attached to  
 the
 same computer. Both are: ext3

 1) Rsync -avHh for cpool works.

 How much data do you have?
 you'll find that rsync has awful memory requirements once you start  
 getting
 millions of files you're trying to sync. enough to crush your box.

 If you have the backuppc data pool mounted on a separate partition; your
 best bet is likely to be to shut down backuppc, unmount the partition,  
 and
 copy it with dd.

 failing that, shut down backuppc (so the disk and system are quiesced),  
 and
 copy the data pool with tar.

 I spent a lot of time trying to archive the data pool with tar, using an  
 LVM
 snapshot. I found that the performance was so bad that it took 4x as  
 long as
 just quiescing the disk and backing it up without the snapshot. (I  
 suspect
 mostly due to the backups going on while the archive was working; this
 destroyed the memory caches in addition to causing many more disk head
 seeks). It was far better to just quiesce the system and archive it with
 tar, then start it back up again.




-
This SF.net email is sponsored by: Splunk Inc.
Still grepping through log files to find problems?  Stop.
Now Search log events and configuration files using AJAX and a browser.
Download your FREE copy of Splunk now  http://get.splunk.com/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] backuppc problem about storage

2007-10-08 Thread Koen Linders

Hey Varuna,


This is how I did it (a concrete example follows below).

If it's on the same system:

- Stop backuppc
- Move /var/lib/backuppc to the location you want.
- Create a symlink:
= ln -s /location_you_want /var/lib/backuppc
- Start backuppc
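
A concrete version of the same steps, assuming the Debian init script and
/home/backuppc-data as the new location (adjust both to your system):

  invoke-rc.d backuppc stop
  mv /var/lib/backuppc /home/backuppc-data
  ln -s /home/backuppc-data /var/lib/backuppc
  invoke-rc.d backuppc start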

I hope this helps,

Koen Linders


Varuna Nohar schreef:
 Hi all,
 
 I know that the default path to store the backups is /var/lib/backuppc/pc,
 
 but I wanted to store them under my home directory. It is currently using
 the default. Now I want to know where I should change this
 
 so the backups are stored in the desired location.
 
 Regards
 
 Varuna Nohar
 
 
 
 
 -
 This SF.net email is sponsored by: Splunk Inc.
 Still grepping through log files to find problems?  Stop.
 Now Search log events and configuration files using AJAX and a browser.
 Download your FREE copy of Splunk now  http://get.splunk.com/
 
 
 
 
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 https://lists.sourceforge.net/lists/listinfo/backuppc-users
 http://backuppc.sourceforge.net/

-
This SF.net email is sponsored by: Splunk Inc.
Still grepping through log files to find problems?  Stop.
Now Search log events and configuration files using AJAX and a browser.
Download your FREE copy of Splunk now  http://get.splunk.com/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/