[BackupPC-users] Call Timed Out
I'm using xfer method SMB to back up a server. The whole job fails with this error:

    Call timed out: server did not respond after 2 milliseconds opening remote file

Is there any way to adjust this timeout, or to tell the job to skip files that are open, etc.? Thanks

--
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day trial. Simplify your report design, integration and deployment - and focus on what you do best, core application coding. Discover what's new with Crystal Reports now. http://p.sf.net/sfu/bobj-july

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
Re: [BackupPC-users] Call Timed Out
This problem doesn't necessarily mean open files. It could simply mean that it is having trouble reading a directory listing. I get this error quite a bit on two older laptops, and it's almost always on the same directory. Rsync did not solve the problem either; I ended up abandoning BackupPC for them. (I would love to get rid of the laptops as well.) Make sure your Windows machines have been defragmented. I believe that increasing the 2 ms requirement would involve recompiling the kernel.

Chris Baker -- cba...@intera.com
systems administrator
INTERA -- 512-425-2006

From: Tim Hall [mailto:th...@insightit.ca]
Sent: Wednesday, August 05, 2009 9:29 AM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Call Timed Out

> I'm using xfer method SMB to back up a server. The whole job fails with
> this error: Call timed out: server did not respond after 2 milliseconds
> opening remote file. Is there any way to adjust this or tell the job to
> skip files that are open etc...? Thanks
Re: [BackupPC-users] Call Timed Out
Tim Hall wrote:
> I'm using xfer method SMB to back up a server. The whole job fails with
> this error: Call timed out: server did not respond after 2 milliseconds
> opening remote file

This is probably a symptom of on-access virus scanning.

--
Les Mikesell
lesmikes...@gmail.com
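[Editor's note] Both of the original poster's questions map to BackupPC configuration rather than to smbclient alone. Below is a minimal sketch of a per-host config, assuming BackupPC 3.x; the share name and exclude paths are illustrative, and the timeout flag is an assumption - some smbclient builds have no per-operation timeout option at all, so check your smbclient man page before relying on it.

```perl
# Hypothetical per-host config (e.g. pc/somehost.pl) -- adjust to taste.

# Skip files that are typically open/locked on a Windows share.
# Keys are share names, values are lists of exclude paths.
$Conf{BackupFilesExclude} = {
    'C$' => ['/pagefile.sys', '/hiberfil.sys', '/System Volume Information'],
};

# If (and only if) your smbclient accepts a timeout flag -- verify with
# `smbclient --help`, this is an assumption -- it could be appended to
# the transfer command, e.g.:
#   $Conf{SmbClientFullCmd} .= ' -t 60';
```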
Re: [BackupPC-users] Restore issues
Hi Matthias,

Sorry it took me so long to get back to you - I've had a lot of tight deadlines.

I'm not backing up a Windows client... It's a Linux server offering a Samba share. To be fair, thinking about it, since I'm using rsync, it's just a Linux server, and the fact that I access it with Windows clients is irrelevant. No VSS - because Linux...

Good news on the backup front... Still, it's very strange about the different numbers of files reported on Windows when looking at the share, but I guess that's one of those strange glitchy things with MS products... Pretty much the only things I back up are /home, /etc, /usr and /var...

Thanks for your help on this...

Cheers,
Jx

On Thu, Jul 30, 2009 at 8:00 PM, Matthias Meyer matthias.me...@gmx.li wrote:
> Vetch wrote:
>> Hi Matthias,
>> All my xferlogs say they have 0 errors (apart from one, but that was
>> after the problem occurred anyway)... I've had a look at them, but they
>> are... quite long... Without knowing what to search for, I'm not sure
>> what I can do with them... If they report no errors, I guess I can
>> assume all files are backing up properly?
>>
>> Xfer Error Summary
>> Backup#  Type  View             #Xfer errs  #bad files  #bad share  #tar errs
>> 0        full  XferLOG, Errors  0           0           0           0
>> 28       full  XferLOG, Errors  0           0           0           0
>> 56       full  XferLOG, Errors  0           0           0           0
>> 84       full  XferLOG, Errors  0           0           0           0
>> 112      full  XferLOG, Errors  0           0           0           0
>> 126      full  XferLOG, Errors  0           0           0           0
>> 133      full  XferLOG, Errors  0           0           0           0
>> 140      full  XferLOG, Errors  0           0           0           0
>> 147      full  XferLOG, Errors  0           0           0           0
>> 150      incr  XferLOG, Errors  0           0           0           0
>> 151      incr  XferLOG, Errors  0           0           0           0
>> 152      incr  XferLOG, Errors  0           0           0           0
>> 153      full  XferLOG, Errors  1           0           0           0
>> 154      incr  XferLOG, Errors  0           0           0           0
>> 155      incr  XferLOG, Errors  0           0           0           0
>> 156      incr  XferLOG, Errors  0           0           0           0
>>
>> Thanks,
>> Jx
>
> That is unbelievable. You back up a Windows client, right? There should
> be a lot of files which cannot be backed up because they are in use. Do
> you use volume shadow copies in Windows? But nevertheless, you have
> backups of all files specified in your configuration. If you check your
> backup include/exclude configuration you should find which files are not
> backed up.
>
> br
> Matthias
> --
> Don't Panic
Re: [BackupPC-users] Restore issues
Hi Jeffrey,

Sounds doable, but as I say, I'm backing up a Linux server, not a Windows client, so I don't think I need to worry too much about the busy files... Thanks for the suggestion though...

Cheers,
Jx

On Thu, Jul 30, 2009 at 8:19 PM, Jeffrey J. Kosowsky backu...@kosowsky.org wrote:
> Matthias Meyer wrote at about 21:00:54 +0200 on Thursday, July 30, 2009:
>> That is unbelievable. You back up a Windows client, right? There should
>> be a lot of files which cannot be backed up because they are in use. Do
>> you use volume shadow copies in Windows?
>
> Alternatively, you could just exclude the files that tend to be busy.
> Before I wrote my volume shadow copy script, I had a short list of
> excludes that eliminated all busy files.
>
>> But nevertheless, you have backups of all files specified in your
>> configuration. If you check your backup include/exclude configuration
>> you should find which files are not backed up.
>>
>> br
>> Matthias
>> --
>> Don't Panic
[BackupPC-users] HowTo backup __TOPDIR__?
If I want to have disaster-resistant backups I need to have the backups in at least two locations. What is the best way to synchronize __TOPDIR__ to another location? As I found in many messages, rsync isn't possible because of its expensive memory usage for the hardlinks. In my opinion dd or cp -a isn't possible either, because they would copy all the data; that would consume too much time if I synchronize the locations on a daily basis.

Thanks
Matthias
--
Don't Panic
Re: [BackupPC-users] HowTo backup __TOPDIR__?
Matthias Meyer wrote:
> If I want to have disaster-resistant backups I need to have the backups in
> at least two locations. What is the best way to synchronize __TOPDIR__ to
> another location? As I found in many messages, rsync isn't possible
> because of expensive memory usage for the hardlinks. In my opinion dd or
> cp -a isn't possible either because they would copy all the data. That
> would consume too much time if I synchronize the locations on a daily
> basis.

The simple way is to run two independent copies, which also keeps you from having a single point of failure. If that isn't practical, rotating external disks that you image-copy locally might work.

--
Les Mikesell
lesmikes...@gmail.com
Re: [BackupPC-users] backups don't complete - don't know why
Kanwar writes:

> full backup started for directory /home; updating partial #147
> started full dump, share=/home
> Running: /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids
>     --perms --owner --group -D --links --hard-links --times
>     --block-size=2048 --recursive --checksum-seed=32761 --one-file-system
>     --ignore-times . /home/
> Xfer PIDs are now 11778
> xferPids 11778
> Got remote protocol 30
> Negotiated protocol version 28
> Checksum caching enabled (checksumSeed = 32761)
> Got checksumSeed 0x7ff9
> ^CfileListReceive() failed
> Done: 0 files, 0 bytes
> Got fatal error during xfer (fileListReceive failed)
> Backup aborted by user signal
> Not saving this as a partial backup since it has fewer files than the
>     prior one (got 20245 and 0 files versus 20245)
> dump failed: fileListReceive failed
> link ranbir
>
> I cancelled the backup at the point where BackupPC output the line
> "Got checksumSeed 0x7ff9", because BackupPC didn't do anything for over
> an hour.

It appears the remote rsync never completes traversing /home and sending the file list. Have you tried running fsck on /home? How large is /home?

Craig
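[Editor's note] Craig's fsck/size questions amount to checking whether the client can even enumerate /home in reasonable time. A small diagnostic sketch (the function name is mine, not a BackupPC tool); run it on the client. If it hangs, the problem is the filesystem, not BackupPC, and the directory it stalls on is where to aim fsck.

```shell
#!/bin/sh
# diag_tree DIR -- print the entry count and total size (KB) of DIR,
# staying on one filesystem like the rsync flags above (--one-file-system).
diag_tree() {
    dir=${1:-/home}
    find "$dir" -xdev | wc -l   # number of entries rsync must list
    du -sk "$dir" | cut -f1     # total size in kilobytes
}
```

Usage: `diag_tree /home` -- a multi-minute stall on the first line points at a filesystem problem rather than a transfer problem.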
Re: [BackupPC-users] HowTo backup __TOPDIR__?
> What is the best way to synchronize __TOPDIR__ to another location? As I
> found in many messages, rsync isn't possible because of expensive memory
> usage for the hardlinks.

Since version 3.0.0 (protocol 30 on both ends) rsync uses an incremental mode to generate and compare the file lists on both sides, so memory usage has decreased a lot, because only a small part of the list is in memory at any time. But the massive hardlink usage of BackupPC still makes copying the whole structure very slow, because link creation on any filesystem seems to be a very expensive task (locks?) ...

> In my opinion dd or cp -a isn't possible either because they would copy
> all the data. That would consume too much time if I synchronize the
> locations on a daily basis.

Any other tool has the same time consumption if it keeps hardlinks (cp, for example, does that with option -l). A somewhat lazy solution would be to just copy the pool files (hashes as file names) with rsync and create a tar archive of the pc directory. The time-consuming process of link creation is then deferred to the restore case (which may never be needed).

Thomas
--
OSTC Open Source Training and Consulting GmbH / HRB Nuernberg 20032
tel +49 911-3474544 / fax +49 911-1806277 / http://www.ostc.de
Delsenbachweg 32 / D-90425 Nuernberg / Geschaeftsfuehrung:
Thomas Birnthaler / +49 171-3047465 / t...@ostc.de / pgp 0xFEE7EB4C
Hermann Gottschalk / +49 173-3600680 / h...@ostc.de / pgp 0x0B2D8EEA
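[Editor's note] Thomas's lazy approach - a link-blind copy of the pool plus a tar of pc/ - can be sketched as a small script. The function name and paths are hypothetical; on real data you would use `rsync -a` for the pool step instead of `cp -a` so repeat runs stay incremental.

```shell
#!/bin/sh
# sync_topdir TOPDIR DEST -- "lazy" offsite sync of a BackupPC store.
sync_topdir() {
    topdir=$1; dest=$2
    mkdir -p "$dest"
    # 1. Pool files are plain files named by content hash, so a
    #    link-blind copy is cheap: no hardlink table to build in memory.
    cp -a "$topdir/cpool" "$dest/"
    # 2. The hardlink-heavy pc/ tree goes into one tar archive; tar
    #    records hardlinks as link entries, deferring the expensive
    #    link() calls to restore time (which may never come).
    tar -C "$topdir" -cf "$dest/pc.tar" pc
}
```

Usage: `sync_topdir /var/lib/backuppc /mnt/offsite` (both paths are examples, not BackupPC defaults everywhere).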
Re: [BackupPC-users] Log says Pool is 0.00GB, but pool is big and growing
Christian writes:

> Hi, I'm experiencing some strange difficulties with BackupPC
> (3.1.0-3ubuntu1 on Ubuntu 8.04 LTS). It appears that BackupPC is not
> recognizing that it already put files into the pool. The log shows a
> nightly message according to which the pool is 0 GB, consisting of 0
> directories, whereas the pool actually exists - it's 195,120 MB currently,
> and growing day by day, cluttering my hard disk.
>
> 2009-07-29 01:00:00 Running 2 BackupPC_nightly jobs from 0..15 (out of 0..15)
> 2009-07-29 01:00:00 Running BackupPC_nightly -m 0 127 (pid=20391)
> 2009-07-29 01:00:00 Running BackupPC_nightly 128 255 (pid=20392)
> 2009-07-29 01:00:00 Next wakeup is 2009-07-29 08:15:00
> 2009-07-29 01:00:03 Finished admin1 (BackupPC_nightly 128 255)
> 2009-07-29 01:00:03 BackupPC_nightly now running BackupPC_sendEmail
> 2009-07-29 01:00:15 Finished admin (BackupPC_nightly -m 0 127)
> 2009-07-29 01:00:15 Pool nightly clean removed 0 files of size 0.00GB
> 2009-07-29 01:00:15 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max links), 0 directories
> 2009-07-29 01:00:15 Cpool nightly clean removed 0 files of size 0.00GB
> 2009-07-29 01:00:15 Cpool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max links), 1 directories
>
> I've searched Google and the archives. I found some similar issues, which
> apparently were resolved by adding a patch (Tino's patch). I did apply the
> patch (a couple of weeks ago), but the issue persists. Any idea what could
> be the issue/what I could try to resolve this? If you need any information
> from my config, please let me know.

The most likely cause is that IO::Dirent fails on certain file systems. Yes, Tino's patch is meant to fix that. It's also fixed in 3.2.0beta0. Just to be sure, why don't you change this line:

    $IODirentOk = 1;

in lib/BackupPC/Lib.pm to:

    $IODirentOk = 0;

and then see if BackupPC_nightly reports non-zero numbers?

Craig
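[Editor's note] Craig's suggested test can be scripted. The Lib.pm path varies by distro (on Ubuntu it is likely under /usr/share/backuppc/lib, but that is an assumption - locate the file first), and the helper name below is mine.

```shell
#!/bin/sh
# disable_dirent LIBPM -- flip $IODirentOk from 1 to 0 in BackupPC's
# Lib.pm, keeping a .bak copy so the change is easy to revert.
disable_dirent() {
    libpm=$1
    sed -i.bak 's/\$IODirentOk *= *1;/$IODirentOk = 0;/' "$libpm"
}
```

Usage (path is a guess): `disable_dirent /usr/share/backuppc/lib/BackupPC/Lib.pm`, then restart BackupPC and rerun BackupPC_nightly.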
[BackupPC-users] Backups stopped working for Vista machine
I have BackupPC set up to back up my Fedora F10 box and my wife's Vista box to an external USB drive. I had everything working, and then some time ago the backups stopped working for the Vista box; I'm just getting around to looking into it. The error log shows:

    Running: /usr/bin/smbclient user-pc\\C\$ -U User -E -N -d 1 -c tarmode\ full -TcrX - list of excluded files
    full backup started for share C$
    Xfer PIDs are now 9222,9221
    tar_re_search set
    Anonymous login successful
    [ skipped 1 lines ]
    tree connect failed: NT_STATUS_ACCESS_DENIED

So there is the problem... I think. The -N on the command line says (from the smbclient man page):

    If specified, this parameter suppresses the normal password prompt from
    the client to the user. This is useful when accessing a service that
    does not require a password.

So it's logging in anonymously, but I need to use a password so that User has sufficient privileges on the Vista box to execute the command. The password is correctly set in the SmbSharePasswd variable. Can someone enlighten me?

Thanks,
Steve

--
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day trial. Simplify your report design, integration and deployment - and focus on what you do best, core application coding. Discover what's new with Crystal Reports now. http://p.sf.net/sfu/bobj-july

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
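[Editor's note] One way to narrow this down is to reproduce BackupPC's call by hand as the backuppc user. BackupPC supplies the share password through the PASSWD environment variable, which smbclient honors per its man page, so -N plus a set PASSWD is not an anonymous login; "Anonymous login successful" suggests the password is not reaching smbclient at all. A small helper to print the command to run (the helper name is mine; host and user names are taken from the log above):

```shell
#!/bin/sh
# build_smb_cmd HOST SHARE USER -- print an smbclient invocation that
# mirrors BackupPC's, minus the tar options, for an interactive test.
# Run the printed command with the password exported, e.g.:
#   PASSWD='yourpass' <printed command>
build_smb_cmd() {
    host=$1; share=$2; user=$3
    printf "smbclient '\\\\\\\\%s\\\\%s' -U %s -E -N -d 1 -c dir\n" \
        "$host" "$share" "$user"
}
```

Usage: `build_smb_cmd user-pc 'C$' User` -- if the manual run also reports an anonymous login, the problem is on the BackupPC side (e.g. the password not being passed); if it works by hand, look at the Vista side (UAC/admin-share permissions).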