[BackupPC-users] Concrete proposal for feature extension for pooling
I'm still confused about what you meant regarding the migration system and the different digests available. If you're going to allow the digest to be changed, I'd suggest some kind of converter, because changing digests could cause a lot of issues: some people might start with MD5, later get a hardware-based SHA processor, and want to switch for performance reasons. Jeffery has definitely been answering all the questions I've had about this in his posts. I've been thinking about this for months, and from what Jeffery has asked, you've answered just as descriptively.

The big thing, though, is that Jeffery brought up filesystems that can both compress and de-dupe. The two big ones right now are ZFS and Btrfs. In my opinion, ZFS is there and ready to be used. Its lzjb compression and block-level de-dupe would make a lot of what BackupPC does moot. The good thing is that BackupPC is portable across filesystems; the bad thing is that there is currently no way to migrate BackupPC to a filesystem like ZFS and expect great results, for the reason you noted: BackupPC's compression kills the filesystem's block-level de-dupe. To me, BackupPC's compression is much better for storage than lzjb, but I'd like to see some hard numbers before drawing conclusions. The fact that there are multiple pools based on compressed data really does this in. There has to be a way to move backups between compressed, uncompressed, and more/less compressed pools, or the benefit of filesystem migration or filesystem features would be non-existent.

Frankly, I'd love to keep my backups forever, but there's no great way to do that yet. There's really no way to synthesize old fulls with the incrementals up to the next full, so you end up losing data in the process. Sure, it's nice to see the pool size go down for a change, but it's also nice to be able to say: here's everything, and now I've fused it into a package.
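For anyone wanting to experiment, the filesystem-side features in question are per-dataset switches on ZFS. A sketch, untested by me; the pool/dataset name tank/backuppc is made up:

```shell
# Enable lzjb compression and block-level de-dupe on the dataset
# (note: dedup keeps a large table in RAM, so this assumes a machine
# with plenty of memory):
zfs set compression=lzjb tank/backuppc
zfs set dedup=on tank/backuppc

# Check what the filesystem is actually achieving:
zfs get compressratio tank/backuppc
zpool list -o name,dedupratio tank
```

The compressratio/dedupratio properties would give exactly the kind of hard numbers I said I'd like to see before concluding anything.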
Maybe change just the way backups are viewed, so there is a way to access them from full to full: the full and incrementals from 1-19-10 to 2-18-10, and then the full and incrementals from 2-19-10 to 3-18-10. I don't know how best to put it.

I've begun moving all of my Windows machines over to rsyncd, and because of the GNU emulation through Cygwin, I'd have to say the performance is far below that of Samba on many fronts. I have had more issues using rsyncd on Windows with BackupPC than I ever had with Samba. My questions: with 4.x, is FileSys::SmbClient support going to be added? If so, what will be gained by using it instead of the standard smbclient approach? Are there plans to create the BackupPC client again (I would love this idea)? What about adding support for Unison, or creating a way to make Unison worthwhile? Frankly, I don't think Unison is very feature-complete at the moment, but I'm sure that if people already had Unison servers, having BackupPC manage those with its snapshot methodology would be greatly appreciated, without having to set up rsyncd on the server. Is there any hope for customization of rsyncd, such as adding rsync -z compression for wireless or VPN'd clients, allowing passwords to be sent encrypted somehow, or connecting to the rsyncd client over SSH? If there's no benefit to doing that, then oops on my part. What are the chances, if any, of adding an array of IPs and hostnames with priorities on each, and even being able to specify that if a machine is on IP 12.34.56.78, it's not on the local network and will probably have a slower connection, so backups should be tolerant of random disconnections or fluctuations in connection speed?

As a side note, I'd like to say I don't condone using the terminology "WinXX", for a number of reasons:
1. Unnecessary: "Win" or "Windows machines" makes more sense than "WinXX".
2. Redundant: every Windows up to Windows Vista was "XX", so "WinXX" doesn't specify anything except the exclusion of 3.11 and earlier versions, which are probably out of the scope of the documentation anyway.
3. Exclusionary: Windows Vista and Windows 7 do not follow the "XX" pattern, making it seem like only Win95 through WinXP are supported by BackupPC.
4. Confusing: not everyone will understand what "WinXX" means; they might think it's a subset of WinXP, a random typo, or a deliberate exclusion of the Vista+ line of Windows.

+--
|This was sent by saturn2...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--

--
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs proactively, and fine-tune applications for parallel performance. See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
[BackupPC-users] Concrete proposal for feature extension for pooling
I went way off-topic in asking about the pool, and I would ask a moderator to move the last few posts, starting with mine, into another topic if possible. Thanks, Les, for answering those questions. I guess it makes more sense now, but it's not completely apparent what all is going on from what you said or from what Jeffery and Craig wrote.

Couldn't there be a way to do block-level and file-level de-dupe based on how often a file changes, or on whether it's specifically a database file? The Indexing service in Windows can create a .edb file upwards of 6GB; it's definitely best to do block-level de-dupe on that rather than file-level. I think BackupPC could be far more functional if that sort of backup were available in the system itself. Maybe there could be a way to change it on a per-client basis, or have a client and then a special area of the client config for that file.

I actually do have DDNS set up, but it's based on my DHCP server, and that doesn't tell it, if the machine connected on Wi-Fi, to prioritize the DNS name hostname.local to 192.168.0.12 instead of 192.168.0.13. The easiest way to do any of this, especially with a machine that doesn't allow pinging by Windows hostnames, is to do it by IP address. The DHCP method in BackupPC counts from the bottom up; if the Wi-Fi card is set up as a lower IP, it will be tried first, which also doesn't solve the issue. The main thing is, there should be some way of solving this problem in BackupPC, not creating other problems by having to set up local DDNS servers on upwards of 20 machines or more and then having to tell people to turn off their wireless cards when they plug in. It'd be so much easier to just have ClientAlias {} take an array, just like file/directory excludes do. Most of what you said isn't a solution; it's yet another problem. If you thought it'd be better to talk to the Bacula client, then why is SSH port-forwarding the best option?
If BackupPC already knows how to talk over SSH with pub/priv keys, then why doesn't it just allow that for rsyncd as well?

One of the biggest things on the configuration side is having a global config and then having to modify it on a per-machine basis, rather than having configuration profiles, or a global config with a machine config underneath it that allows even more modification. I'd rather see functionality for easier management of multiple similar machines (Linux vs. Windows) than "make the global config for Linux and change all the Windows machines by hand" - because if you later make a change to the global config and want it reflected in the Windows machines too, there's no way to do that other than by hand. See what I mean? One solves the problem; the other does nothing but create more headache and doesn't solve the problem.

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
[BackupPC-users] Concrete proposal for feature extension for pooling
Les Mikesell wrote:

>> Saturn2888 wrote:
>> I actually do have DDNS set up, but it's based on my DHCP server, and that doesn't tell it, if the machine connected on Wi-Fi, to prioritize the DNS name hostname.local to 192.168.0.12 instead of 192.168.0.13. The easiest way to do any of this, especially with a machine that doesn't allow pinging by Windows hostnames, is to do it by IP address. The DHCP method in BackupPC counts from the bottom up; if the Wi-Fi card is set up as a lower IP, it will be tried first, which also doesn't solve the issue. The main thing is, there should be some way of solving this problem in BackupPC, not creating other problems by having to set up local DDNS servers on upwards of 20 machines and then having to tell people to turn off their wireless cards when they plug in. It'd be so much easier to just have ClientAlias {} take an array, just like file/directory excludes do.
>
> I think it is an unusual setup that causes the problem - and it is a general networking problem, not at all related to BackupPC. And I'm not actually sure that connecting to a specific IP would help if the other connections are available and still have viable routes on the target host. IP routes are asymmetrical, and the host makes its own decision about the return path. I'm not sure if BackupPC will try multiple IP addresses if they are returned by DNS for a name and the first choice doesn't work. Browsers typically do that, and it works out well. That could fix the case of a machine alternating between wired/wireless if both are in DNS - but not multiple connections with different speeds/reliability at the same time.
>
>> Most of what you said isn't a solution, it's yet another problem. If you thought it'd be better to talk to the Bacula client,
>
> I mentioned the Bacula client as an alternative to your suggestion of reviving the BackupPC client - at the moment, neither is usable, but I thought you were talking about future development.
>> then why is SSH port-forwarding the best option?
>
> Because it works now and offers the compression option you said you wanted.
>
>> If BackupPC already knows how to talk over SSH with pub/priv keys, then why doesn't it just allow that for rsyncd as well?
>
> Normally if you want to use ssh, you'd use rsync instead of rsyncd - and I'd expect that to be almost universal for non-Windows targets. But there has been a bug in the Cygwin ssh/rsync combination in the past that would cause random hangs. It may or may not be fixed in the current version. So port-forwarding over ssh to get encryption/compression, or starting the connection from the other end to meet firewall requirements, are special cases that work if you have to deal with certain issues - or have Windows and a somewhat broken ssh/rsync.
>
>> One of the biggest things on the configuration side is having a global config and then having to modify it on a per-machine basis, rather than having configuration profiles, or a global config with a machine config underneath it that allows even more modification. I'd rather see functionality for easier management of multiple similar machines (Linux vs. Windows) than "make the global config for Linux and change all the Windows machines by hand" - because if you later make a change to the global config and want it reflected in the Windows machines too, there's no way to do that other than by hand. See what I mean? One solves the problem; the other does nothing but create more headache and doesn't solve the problem.
>
> I'd like to see another 'group' level of inheritance, but I don't see a big problem with the way things currently work, since you can create one per-host instance with everything the way you will want for a group, then copy it with the NEWHOST=COPYHOST syntax in the web 'edit hosts' screen. It only becomes an issue if you want to change an existing setting across an existing group of similar machines.
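For what it's worth, the per-host mechanism being described is just a second config file whose settings win over the global one. A sketch, with a hypothetical host name and Debian-style paths (your install may keep these elsewhere):

```
# /etc/backuppc/config.pl -- global defaults
$Conf{XferMethod}    = 'rsync';
$Conf{CompressLevel} = 3;

# /etc/backuppc/winbox1.pl -- per-host override for host "winbox1"
# (hypothetical name); only the settings listed here change,
# everything else is inherited from the global file:
$Conf{XferMethod}     = 'rsyncd';
$Conf{RsyncdUserName} = 'backuppc';
```

The missing piece is exactly the middle "group" layer: there is nowhere between config.pl and winbox1.pl to put settings shared by all Windows hosts.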
> --
> Les Mikesell
> lesmikesell at gmail.com

The Bacula client would be a great idea. That I was fine with, so long as there was something. Yes, I was talking about a new version.
[BackupPC-users] Adding compression to BackupPC
Is there a way to add compression to BackupPC backups using rsyncd without using a VPN or somehow enabling SSH compression? My wireless machines are taking anywhere from six to ten times as long to back up since I switched from Samba to rsyncd, which means they're backing up in hours instead of minutes now.

Another issue: machines get backed up, and if a machine goes offline, BackupPC won't know or care. I had a machine here today for only a few minutes before it left the premises, and BackupPC is still sitting there with a nice green "Backup in Progress" even though the machine is completely gone, as if it's too dumb to realize it. All of this is quite annoying and makes me think the benefit of rsync not having to transfer unchanged files is actually a lot less helpful than it seems. I can't figure out why it would all be so slow, or why it wouldn't notice a machine had gone offline over 12 hours ago. I'm using version 3.1.0 on Ubuntu Server 8.04.4.
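One workaround that has been suggested on the list is to tunnel the rsyncd port through ssh with compression turned on, then point BackupPC at the local end of the tunnel. A sketch, untested by me; the user and host names are made up:

```shell
# Forward local port 8730 to rsyncd (port 873) on the client,
# with ssh-level compression (-C) applied to everything on the wire;
# -N runs no remote command, -f backgrounds after authentication:
ssh -C -N -f -L 8730:localhost:873 backupuser@client.example.com

# Then, in the per-host BackupPC config, connect via the tunnel:
#   $Conf{ClientNameAlias}  = 'localhost';
#   $Conf{RsyncdClientPort} = 8730;
```

This keeps the rsyncd transfer method while getting both encryption and compression from ssh, at the cost of having to manage the tunnel yourself.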
[BackupPC-users] Full Backup built from incrementals ?
I'm pretty sure that, when using rsync, this is done automatically. Secondly, there's a way to fill in your incrementals and make them no longer dependent on the full backups.
[BackupPC-users] Noted Observations & Complaints Using BackupPC for 5 months
I am listing my observations and complaints after using BackupPC for 5 months. My system runs Ubuntu Hardy and has been kept up with the latest updates and patches. I upgraded from 3.0.0 to 3.1.0 within my first few months and have kept this computer solely a BackupPC machine. I've since gone through a few hard drive configurations, which I will list here:
- 3x640GB in hardware RAID3 using an IDE RAID card with SATA ports
- 3x250GB in Linux software RAID5 using mdadm and onboard SATA ports
- 1x2TB using onboard SATA ports

All of those configurations have assured me that it is not the speed of the drives themselves giving me issues. The machine itself has a 3.4GHz P4 with HyperThreading, which I have tested both enabled and disabled, with no apparent change in usage. The Ethernet NIC is a gigabit Broadcom, and I have switched both the cables and the switches with no apparent change in performance as far as I can remember. Through my reading of the documentation and many articles about BackupPC found via Google, I was able to adjust my configuration files to fit the machines I had and their backup schedules. I was constantly tweaking and changing settings, but I want to note that I did not see a change in performance, just fewer error messages (from excluding /dev, for instance). Please feel free, from here on in, to question or challenge my observations and complaints so that we may all get a well-rounded understanding of the situation.

01. The first thing I noticed when using the program was that the defaults seemed a bit extreme for my system. I began my setup with two always-on Windows machines configured with Samba. I let one full backup go one night and the other the next. From there, I noticed the load of my computer consistently at or over 4, and I don't think I've ever seen processor usage go under 100% unless BackupPC isn't doing anything.
Either way, those backups went smoothly and did not cause me any problems after the initial fulls. I did have one very pertinent issue: the speed of the full backups and incrementals never went over 10MB/s. On gigabit switches with all gigabit-linked computers, this was very strange. The Windows machines could transfer anywhere from 50-75MB/s to each other, so I figured something else might be wrong. I understand many small files are being transferred, but I would expect the average to be higher when using Samba over gigabit.

02. I later added a few always-on Linux machines configured with rsyncd. This is where I started noticing things that were fishy. One of those machines was on the network via a USB Wi-Fi card, the other via an Intel gigabit Ethernet adapter. Contrary to what common sense would suggest, the computer on WLAN had a faster backup speed in MB/s than the computer on LAN. Both machines had similar setups, and both had their own public IP addresses hosting web pages on a different interface than the one used for backups. The one main difference between the two setups was the operating system: both ran Ubuntu, but one had been consistently upgraded to Karmic (WLAN) while the other was on Hardy (LAN). Either way, this should not have affected much of anything. The LAN computer also has twice as much RAM and a slightly faster processor, and both had IDE drives of similar speeds. I can only think the speed difference might have been caused by the LAN computer having 4GB more data, possibly many small files, than the WLAN machine, which had only 2.7GB in use at the time, but I cannot be sure.

03. I eventually ended up with 18 machines configured in BackupPC, with more being added as I configure them. Among those 18 are a few websites of mine on remote servers. Those websites I am backing up using the rsync configuration.
The rsync configuration with these websites works exactly as it should. One site had 11GB of data, and while that took forever to download on my connection, incremental updates are speedy, and the full updates are speedy too. Backup number 61 on this website took only 6.4 minutes for 2.5GB of files, because those 2.5GB were already backed up in the last full and therefore no longer needed to be redownloaded. It is my understanding that this is how BackupPC is supposed to work. While the very first full took around 160 minutes, the second was quite a difference, taking only a bit more than three times as long as the previous incrementals.

04. After some time, I noticed my Samba backups were missing files. I had begun to need older deleted files and noticed they were not in the last full. It appeared as though BackupPC would get to a point and just give up, while at least making sure to back up the root folder list first. There were times where certain folders were gone while adjacent ones were filled. This last sentence
[BackupPC-users] Noted Observations & Complaints Using BackupPC for 5 months
was already backed up and in the pool. I have 2GB of memory in this machine and enabled commit=60, which may actually have been a cause of the slowdown; I do not know. I've been meaning to upgrade or change this rig over to an Atom D510 or N330, but I cannot find one with more than 2 memory slots, leading me to believe it might be best to keep it as-is. Would HyperThreading actually make it faster at all, even though most of the work is I/O? Or is rsync that big of a culprit? Should I look around for deals on memory and get my system up to at least 4GB of RAM? Here is /etc/fstab:

# <file system> <mount point> <type> <options> <dump> <pass>
proc /proc proc defaults 0 0
# /
/dev/sda1 / ext3 noatime,errors=remount-ro,usrquota,grpquota,data=writeback,commit=60 0 1
# swap
/dev/sda5 none swap sw 0 0

It used to be that my swap was on another drive entirely, which I can do again if you suggest that for speed reasons. I'm unable to read iostat and need instructions to do so; this is why I use iotop, as it makes sense to me. I don't understand why it's faster to transfer all the content again than to use rsync to see which files need to be redownloaded. Elapsed time is what I'm having the most problems with, not the download speed itself; it could show 0.001MB/s and I'd be happy, albeit a little concerned, if it only took 5 minutes.

Windows permission locking? I'm sure that's wrong. I can hardly believe that C:/Program Files (x86)/Pidgin would be locked by the OS, and I can't see any reason not to back that folder up, as I believe I'm able to read it regardless of whether the files are in use. The XferLOG files are also very large and painstaking to navigate. I hope you have better tools to recommend than Notepad++ or another text editor with search functionality. It might be nice to understand regular expressions to assist my search, but I don't think it should really be that complicated to find this issue. What I mean is, I had a C: and a G: drive.
The G: drive was almost never in use, and it was months before I ever changed data on it, yet part-way through backing it up, Samba would just report errors for all the files in the XferLOG and none of them would transfer. I tried a few things and was unable to get it to fully back them up, neither in the fulls nor the incrementals. Were it not specifically for this, I would not have moved to rsyncd. I figured incrementals might take longer because of that. Are you suggesting I do full backups each night instead of incrementals, or should I do incremental levels 0 to 6 instead of, for example, 0 to 60? The logs show me all folder creations along with files, and it's difficult to distinguish "create d" entries from plain "create" entries. The error log also doesn't show which files were newly created, which could help me figure out which files are actually taking up a gigabyte's worth of backups that night.

For a while, I saw certain, but not all, excluded files appearing in backups. After much time and deliberation, I went to #rsync and man rsync to find out what is required for the exclude list. Through this, I made this fairly organized list of global excludes for both Windows and Linux machines:

$Conf{BackupFilesExclude} = {
  '*' => [
    '*.tmp', 'tmp/', 'temp/', 'Temp/', 'Temporary Internet Files/',
    '/dev', '/media', '/mnt', '/proc', '/sys', '/var/lib/backuppc',
    'pagefile.sys', 'hiberfil.sys', 'RECYCLER', '$RECYCLE.BIN', '$Recycle.Bin',
    'desktop.ini', 'Thumbs.db', 'thumbcache_*.db', 'IconCache.db', '*.edb',
    '/Windows/Prefetch', '/home/dan/excludes', '/home/ddr/logs/',
    '/home/sokg/logs/', '/home/saturn2888/logs/', '/umkcddr.com/extg',
    '/Program Files (x86)/Electronic Arts/Burnout(TM) Paradise The Ultimate Box/*',
    '/Folders/Learn Japanese', '/Folders/zhid-e'
  ]
};

I've already factored in growing databases, and that would explain issues on the Linux servers, but definitely not on my Windows systems, especially with indexing disabled.
Only one computer is running an e-mail program, and those files are not part of BackupPC's backups. I don't think my directory trees are getting that much larger; in fact, I'm someone who tries to stay organized, so I'm constantly getting rid of and simplifying things. The only directory that ever changes on my laptop or netbook is the Downloads folder, as I may occasionally add another directory there. I even copy my Pidgin IM logs back to my main rig when they begin to grow, to speed up backups, but that doesn't seem to have helped at all. What is --checksum-seed=32761? Yes, running more than 2 backups at once seems slower, but it wasn't back when I was using Samba. Since I'm not downloading many files, wouldn't you think it'd be okay to run more at the same time? I have had the best experiences running 4 at the same time because if I
[BackupPC-users] Noted Observations & Complaints Using BackupPC for 5 months
I've seen so many things on the mailing list. I don't know how to respond, what program to use, if any, or whether there's some special way to keep track. Think of it as if you'd never come across a forum before: someone would have to teach you how to use it. This is the same thing. Pointing me to a bunch of mailing-list links wouldn't help, because I already know about those. Instead of "mailing list", let's call it Communication Method X. If you speak to me about reading or writing to Communication Method X, how would I know what that means? Sure, you can point me to links I can click on, but there's no information about it. It's like saying "just post it in the forums" without telling me I need a user/pass, that after getting in you post topics or reply to topics already made by other people and have discussions about them, while also lacking an explanation of the fundamentals of threads and posts and the different methods of viewing and managing them.

Hmm... For my RAID5 setups, I've always ended up with at least the speed of one drive, never less. On my main rig using Intel ICH10R, I'm able to get close to 200MB/s using 5 hard drives. If each drive can write consistently at 75MB/s, I'm writing 53% of the total continuous write capacity, whereas you're trying to explain that it's normal for me to get anywhere from 5-20MB/s (under 10%) write speed using RAID5, with the same or faster drives, because of some massive overhead. I see this as a problem; if that were true, no one would ever use RAID5 with fewer than 10 drives, as it would be completely unusable. I do not think RAID5 was the issue anyway, especially since I just changed over to a single-drive setup, which is also slow, even slower than writing with the RAID5. I'll try to get hold of more memory within the next month; I believe I can probably swap in what I have from other machines and try.
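To put numbers on that RAID5 point (simple arithmetic; the 75MB/s per-drive figure is my own estimate from above):

```shell
# 5 drives at 75 MB/s each; measured array write speed 200 MB/s
drives=5
per_drive=75
measured=200

raw=$((drives * per_drive))          # 375 MB/s of raw spindle bandwidth
data=$(( (drives - 1) * per_drive )) # 300 MB/s once one drive's worth goes to parity

awk -v m="$measured" -v r="$raw" -v d="$data" \
  'BEGIN { printf "%.0f%% of raw, %.0f%% of data capacity\n", 100*m/r, 100*m/d }'
# prints: 53% of raw, 67% of data capacity
```

Either way you count it, 5-20MB/s would be well under 10%, which is the part that doesn't add up to me.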
I only saw, at most, 6MB of swap used, but I always figured it's a good idea to have a swap partition. I thought it would get used, but my RAM usage is normally around 380MB, which is why I've never felt I needed more. Ah yes, is the compression something that happens during the backup or after? I have no problems using smbclient or file managers in Windows. The files disappearing happened only after continued use. As far as I remember, the very first full backup did what I wanted; the ones since started having missing parts, which could be because I kept adding more hosts. The user was an admin user with full file access, apart from whatever the System user gets. I wonder how I missed that bit in the documentation on checksum caching; I'm assuming I just didn't understand its use.

Another issue has come up just recently. A machine has been in standby since before this server came back online, and both the virtual machine on that server and the server OS itself are reported as failed backups when they should have shown as having no ping. Since transitioning to my 2TB drive, I've also noticed the pool information vanished. My pool filesystem grew last night from 18% to 23% usage; that means an extra 100GB of files appeared. I think something is wrong. Is there a way to fix this?

+--
|This was sent by saturn2...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Since it was mentioned before, here's iostat:

Every 2.0s: iostat                    Fri Apr 23 05:34:51 2010
Linux 2.6.24-27-server (grubber)      04/23/2010

avg-cpu:  %user  %nice  %system  %iowait  %steal  %idle
          19.91   0.04     2.99    70.84    0.00   6.23

Device:   tps    Blk_read/s  Blk_wrtn/s  Blk_read   Blk_wrtn
sda       92.73  1937.06     2538.96     196266513  257252240
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
The web interface is the one thing that has always been fast; there's never a problem when I use it.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Les Mikesell wrote:
> Saturn2888 wrote:
> > Since it was mentioned before, here's iostat: [iostat output as above]
>
> The first set of values from iostat is a long accumulation that you normally should ignore. If you give a number on the command line, it will run continuously, reporting every so many seconds. A value of 1 to 5 should give you meaningful samples when the machine is busy. But that 70% iowait number pretty much tells you what the machine is doing with most of its time...
>
> -- Les Mikesell, lesmikesell at gmail.com

I did watch iostat and saw that consistently.
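Les's tip above can be put into practice with something like the following sketch; the awk field index assumes the default avg-cpu column order, and the sample numbers are simply the ones from the post:

```shell
# Run iostat with an interval so each report covers only that window,
# instead of the since-boot accumulation the first report shows:
#   iostat 5        # a new sample every 5 seconds
# Pulling the %iowait column out of an avg-cpu value line with awk
# (field 4 in the default %user %nice %system %iowait %steal %idle order):
sample='19.91 0.04 2.99 70.84 0.00 6.23'
echo "$sample" | awk '{printf "iowait: %s%%\n", $4}'
```

Sustained iowait around 70% means the CPU is mostly waiting on the disk, which is consistent with the slow-backup symptoms described in this thread.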
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
I was gonna put it on 2x400GB in RAID0 and have that data backed up on another rig, but didn't think that'd be such a good idea. As an update, while my graphs haven't updated, my pool info is back! I realized the extra 100GB was from the new files I wanted to start backing up, so that explains that part of the deal. Now I question whether I should add more spindles. I wanted to run the OS on a USB flash drive but feared that if /var/lib/backuppc were unmounted and disappeared, the USB drive would fill up, and that'd be a lot of writes I would not want, which could slow the drive down and slowly max out its write endurance. Since you noted RAID5 being bad for this, what sort of RAID did you have in mind? My original idea for this machine was RAID 10, but that's a lot of spinning disks and a lot of power draw; I'm trying the most power-efficient approach at the moment. Would switching to an Atom-based dual-core or single-core server speed this up at all, or be of equal speed? I know the Atom processors are slow, but since the processor is constantly at 100%, having two cores instead of one might not be such a bad idea.
[BackupPC-users] rsync on windows
You need rsync 3.0.6-1; 3.0.7-1 crashes in all my uses of it.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Use this as the root directory: http://badmarkup.com/backuppc/ Note: full backups of laptops and netbooks are done over 100Mbit wired Ethernet links; everything else is over 802.11g. Since I can't upload any, here's my most recent pic of the Host Summary screen. As you can see, the smallest ones seem to be working correctly, although the backups are still slow in my opinion, just far less noticeably. For example, Virtual-XP is a virtual machine on the machine Soyver. Looking at its summary page, we see the backups taking anywhere from a nice 2-5 minutes up to a terrible 15 minutes. I say terrible because even though the size is only a few gigabytes, imagine if it were a few tens of gigabytes; would it then take ten times as long? That actually seems to be the case. Faye-Mini is a netbook running Linux; it has been taking longer and longer each backup. Colissio's page is the most insightful: all backups since 78 were done using Rsync. Sadly, I have no older pictures with which to compare before and after shots now. I do have them for kevin-top, a laptop. If you see this, you can probably tell that it was after full backup 63 that I switched to using Rsync. It was also after backup 63 that my fan started whirring a lot and my battery life dropped 3 hours. I spent a long time trying to figure it out until I realized I only had these issues when rsync was backing up this laptop. I have other things I can show too. Look at Main and Xenon: after the date I began moving Windows machines to Rsyncd, their backup times increased a lot!
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Hey Les, after putting that checksum stuff in, I get a lot of errors which say "verified cached digest". To me that's not an error; that's a good thing, right?
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
If that's the case, other computers would show the error too, but they do not; it's only that one PC that has 'em. It shows 139 errors for the last full, whereas the ones before had around 24. The previous fulls used to have a lot of errors as well, but only because files were vanishing: it probably started backing up while I was using the machine, so cache files and log files were changing as it backed up.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
I've been reading a lot. Apparently rsync over Cygwin, or anything using those DLLs, hangs. I haven't yet felt like upgrading my memory for money reasons, but I'm wondering if I should include -W or --whole-file in the rsync options for backups. From what I read online, a lot of Windows server + rsync hangs were occurring, so I changed my exclude list to skip junction points and other things like them. So far, things seem to be moving as they used to, but certain machines are still taking a long time, in fact longer than before. I had a machine which was taking 2 hours now taking 20, haha. It seems like the more I do to fix it, the worse it gets. I hope that checksum thing eventually does help.

I can confirm some things work properly. I did a full backup of over 100GB of files yesterday, none of them operating system files, and was surprised to see it take 5 hours. The good thing is, when it did the incremental today, it only took 15 minutes. 15 minutes for an incremental of 100GB of unchanged files from a Windows machine seems decent enough. Then I looked closer: while it says 0MB and 0 files total were backed up, 53 files show up as backed up that already existed, plus 381 new files. Please look at http://badmarkup.com/backuppc/majin.png to see what I'm talking about. How is this possible? Is it a bug in BackupPC? Would it explain the issues I'm having? I'd still have to say backing up these files was notably faster than backing up 20GB of a C: drive with only the OS and some programs installed, so I dunno what to expect here.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Correction: it took 8 hours for the full, and as you can see in http://badmarkup.com/backuppc/majin.png, almost all of the files were already backed up before.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
I don't understand. Which DLLs, and what versions? I use DeltaCopy and can't imagine why I wouldn't need Rsyncd. Are you saying to use it over mounted Samba shares, or are you telling me to use SSH? In DeltaCopy, I'm able to just pop in compatible Cygwin DLL files. Cygwin itself is far too bulky and heavyweight for me to condone putting on 15 hosts, whereas DeltaCopy just acts as a GUI front-end to some Cygwin pieces and makes it so you only need a few files, not hundreds of other GNU programs I may or may not need. I hardly use the GUI anymore, but it's helpful and easier to use on an OS dominated by GUI front-ends.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Yeah, but which version of the DLL is the newest one that solves it? I refuse to install Cygwin, but I also can't seem to find that DLL on this page: ftp://sourceware.org/pub/cygwinports/. Would I still need Cygwin to run an SSH server, or would I just use some form of PuTTY? I've heard of cwRsync; maybe they keep up-to-date DLLs, unlike DeltaCopy. I did, at one point, switch out some DLLs and my EXE files, but that was back in February, so I'm sure I'm behind on my cygwin1.dll, and I don't see a reason to install a full app just for one file.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
So it's either $100 worth of writes lowering the life of my SSD for features I don't need and will never use in Windows, ooor I find someone who has it installed or can find the DLL and link me :P since I really only need cygwin1.dll. The cwRsync one is from 1.7.2, which I believe is the version I already have. So you still need Cygwin even if you don't have Rsyncd. At least the transfer is encrypted, but on my home network that's not much of a concern right now for the possible hassle involved.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Yeah, I know that. That's why I keep saying I need the DLL somehow, to test if it will fix this.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
So after installing Cygwin onto my RAID of hard drives and waiting the forever it took, I finally have a cygwin1.dll. I gave up looking through those tar.bz2 files for it. The cwRsync package had the same old 1.7.2 version of the DLL I already got in February, not the one being proposed. The old DLL was 2420KB; the new one is 2548KB, for anyone interested. I can also put this DLL up on my server if people just want the DLL to fix their Windows systems like I did. I'm gonna put the DLL around the network and see what happens in the afternoon, when almost all backups should be completed. Most of my backups' blackouts end at 2am, some at 1am. We'll see how this goes.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Well, I do have a Linux AD server. I've also had a WinServer2008R2 box for 6 months, but it's having issues connecting to the domain for whatever reason. I was referring to the SSH encryption; I don't recall Samba using encryption for file transfers, just for the user/pass, maybe even just the pass. Are you using the stock DeltaCopy files? Those are really aged. Here, I uploaded some newer files for you: http://badmarkup.com/deltacopy/. Stop the DeltaCopy service first and put these in; feel free to make a copy of the old files. Tell me if these are any faster. They should be. 22-23 hours to back up a server? That's a LONG time. The longest I've gone was 21 hours, and that was just recently during an incremental, because BackupPC was suddenly being stupid about backing up my Windows machines because it felt like it.
[BackupPC-users] suggestion for enhancement
Raman Gupta's suggestion was definitely the best. Should rsync copy files in alphabetical order, then you can estimate how far along you are by judging what's on the drive. Right now BackupPC is in the /Windows folder on one of my machines. Since there are a ton of itty-bitty files in there, usually new ones because of logs and such, this is what's going to take the longest. It's going to be especially long when using Rsyncd instead of Samba, for instance.
[BackupPC-users] Moved cable, backup kept going
I've had it happen to me; then 6 hours later it's like the backup failed, a disconnection or something. I think there's a timeout issue in some cases. Was yours really still working, or did it eventually time out?
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Sorin, with your setup, VLANs don't seem out of the question. In another case, I seem to be figuring out more and more why my machines are taking so long, Windows and Linux ones included. This PC started backing up over 8 hours ago. I looked at the PID, nothing. I looked at the Xfer PID, and here's what I found:

open(/var/lib/backuppc/pc/main/69/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/69/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/68/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/68/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/67/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/67/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/66/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/66/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/65/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/65/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/64/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/64/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/63/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/63/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/62/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/62/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/61/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/61/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
open(/var/lib/backuppc/pc/main/60/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266, O_RDONLY|O_NONBLOCK|O_LARGEFILE|O_DIRECTORY|0x8) = 3
open(/var/lib/backuppc/pc/main/60/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, O_RDONLY|O_LARGEFILE) = 3
stat64(/var/lib/backuppc/pc/main/new/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/%25gconf.xml, 0x81530c8) = -1 ENOENT (No such file or directory)
stat64(/var/lib/backuppc/pc/main/new//full-drive/RStmp, 0x81530c8) = -1 ENOENT (No such file or directory)
stat64(/var/lib/backuppc/cpool/b//5/bf50328e06a80bcea0293670cc726879, 0x81530c8) = -1 ENOENT (No such file or directory)
open(/var/lib/backuppc/pc/main/new/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/%25gconf.xml, O_WRONLY|O_CREAT|O_TRUNC|O_LARGEFILE, 0666) = 3
stat64(/var/lib/backuppc/pc/main/new/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/%25gconf.xml, {st_mode=S_IFREG|0644, st_size=203, ...}) = 0
stat64(/var/lib/backuppc/pc/main/new/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, 0x81530c8) = -1 ENOENT (No such file or directory)
stat64(/var/lib/backuppc/pc/main/new/full-drive/var/lib/ebox/gconf/ebox-ro/modules/events/configureEventTable/keys/conf6266/attrib, 0x81530c8) = -1 ENOENT (No such file or directory)
stat64(/var/lib/backuppc/cpool/5/e/c/5ec8ffd787cc5f2e88704a76cc3ec86b, 0x81530c8) = -1 ENOENT (No such
[BackupPC-users] rsync on windows
Ah, you used OpenVPN. I back up a host using that too, but over the Internet. Did the freezing stop?
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
So what's the maximum number of incrementals I should do then? For wireless devices, it's best to keep it going and let the server handle the rest; for my wired computers, I'm assuming anywhere from 6-8 is plenty, and I probably shouldn't do more than that.
[BackupPC-users] Noted Observations Complaints Using BackupPC for 5 mon
Yeah, that's what I meant: depth, not number. There are very few changes on most of my systems; what does change are the file folders I'm backing up on my main rig. Those files change daily, but not all of them, of course. What are your thoughts? Fewer levels are probably good if there aren't many changes, correct? Since changing from Samba, a lot of my BackupPC server's functionality has also needed to change. To update on the new DLL I used: it seems to have worked properly. One machine went from 450 minutes for the previous incremental to 86 this time around, but that was helped by moving 4GB of unchanging files off that drive. Still, 86 is too high for an incremental. Another machine spent 470 minutes on its previous incremental, and after the DLL, the full took only 380. Clearly I'm getting somewhere, but I haven't found a solid solution yet.
[BackupPC-users] Concrete proposal for feature extension for pooling
Meh, it's okay. I'd still like a feature like this, but sadly I've never gotten a DNS server to apply multiple IPs to one hostname, only the other way around. Oh well.
[BackupPC-users] What do I need for backing up the pool?
:: INTRO ::

I've been really curious about a few things regarding backing up the backup, because I've been doing it for a while, from an ext3 filesystem to a zfs one, and have noticed quite a few issues. Especially when my zfs drive suddenly filled up: rm -rv pc/blah would give immediate feedback that the directory was deleted, meaning there was nothing in it, yet somehow I had tons and tons of data in use. Then I noticed all the rsync cron jobs were running at once and nothing was finishing. So I took a further look at the cpool and noticed it was pretty full of data while nowhere else was, and that this was probably what was causing the issue. It also made clear to me why my BackupPC machine would suddenly start taking hours for backups that usually took 10 minutes.

:: INFORMATION ::

The BackupPC machine is running Ubuntu Hardy with BackupPC version 3.1.0. The other machine is running FreeNAS embedded. Both machines have the same 2TB drive in them, with the filesystems being ext3 and zfs respectively. The rsync command I cron to copy the /var/lib/backuppc directory looks like this:

rsync u...@ip::backuppc /mnt/backuppc -aH --exclude-from=/mnt/backuppc/exclude-list.txt --delete --inplace --checksum-seed=32761

I modified the command in my post to show more of what's actually going on rather than my user-specific settings. My exclude file now includes pool and cpool, as I don't yet know their importance.

:: QUESTIONS ::

My main question pertains to the pool and the cpool. What is the purpose of these directories? Are they recreated when the nightly job runs? For instance, I'm under the impression that the way to delete a host is to remove the pc/host folder and then go into the main configuration and remove the host from there as well. What happens to the pool in this instance?
Let's say the drive for my BackupPC server dies and I cannot recover the data because someone smashed the drive with a hammer. If I wanted to re-add hosts and had only backed up the pc folder, would that be sufficient, or would all the de-duplicated file processing be lost?
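To see why copying only pc/, or copying without rsync's -H (preserve hard links), loses the de-duplication, here's a small sketch. BackupPC stores each unique file once in the pool and hard-links every occurrence under pc/ to it. The file names below are made up for the demo, and `stat -c` is GNU/Linux syntax:

```shell
# Build a miniature pool/pc layout with hard links, the way BackupPC does.
tmp=$(mktemp -d)
mkdir -p "$tmp/pool" "$tmp/pc/hostA" "$tmp/pc/hostB"
echo "file payload" > "$tmp/pool/abc123"
ln "$tmp/pool/abc123" "$tmp/pc/hostA/f"   # same inode, no extra space
ln "$tmp/pool/abc123" "$tmp/pc/hostB/f"

# Link count is 3: the data exists on disk exactly once.
stat -c %h "$tmp/pool/abc123"             # prints 3

# A plain copy (cp -r, or rsync without -H) breaks the links, so every
# occurrence becomes an independent file and the space usage multiplies.
cp -r "$tmp/pc" "$tmp/copy"
stat -c %h "$tmp/copy/hostA/f"            # prints 1 (link broken)
rm -rf "$tmp"
```

This is why a pool copy is usually done with `rsync -aH` over the whole /var/lib/backuppc tree: with only the pc/ folder and no hard-link preservation, the files are all still there, but the pooling (and the space savings) would have to be rebuilt.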
[BackupPC-users] My backup method across VPN
I would do successive incrementals, but only if the same exact files aren't changing each time. For instance, successive incrementals don't benefit you much if file X changed today, changes tomorrow, and changes again the next day; that is, if file X is the file that changes the most. If file X is a text file, successive incrementals might be useful so long as the same blocks of text from the day before still exist in the file; this works in your favor for any file whose earlier content persists day to day, even as changes accumulate. The one time you for sure want successive incrementals is if you have the processing power, disk space, and number of daily-changing files to require them. Especially if you're on a limited line like a VPN over the Internet, it's beneficial to check for changes from the day before rather than, say, 6 days before, as you want to copy less information rather than more. The more successive incremental levels you add, the slower it goes. 6 will always yield decent speeds. I have mine set up at home for 8, but even that might be too much for a one-drive setup. I used to have it set for 60, and each incremental past level 18 would start to show slowdown; it would take 6-7 hours to back up the same host that, at lower incremental levels, took only a few tens of minutes. I would try successive incrementals and see where they take you. Test both for 2 different weeks, and that will tell you which to choose for sure. I like successive incrementals; I just don't like using them at higher levels, because it really, really begins to slow things down.
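In BackupPC 3.x the incremental depth being discussed is controlled by $Conf{IncrLevels} in config.pl. A sketch, with example values only (the periods shown are illustrative, not a recommendation):

```perl
# Example config.pl fragment (values are illustrative):
# Each incremental at level N transfers changes relative to the most
# recent backup of a lower level, so a deeper chain copies less data
# per run but takes longer to merge, matching the slowdown described
# above at high levels.
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];   # chain depth of 6
$Conf{IncrPeriod} = 0.97;                 # roughly daily incrementals
$Conf{FullPeriod} = 6.97;                 # a weekly full resets the chain
```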
[BackupPC-users] What do I need for backing up the pool?
I forgot to add: my rsync command also has --delete.
[BackupPC-users] My backup method across VPN
Your method is fine; I just don't think it will be fast enough. 5GB of daily changes, you said? You don't want to do those all at once. My recommendation: first, limit BackupPC to backing up a maximum of one host at a time. Second, spread the backups across multiple hosts. Instead of physical hosts, do what I do with my computer: I have specific folders, like Compositions, Documents, and Photoshop files, each as a different BackupPC host. It's one thing to transfer 100GB of files; it's another to transfer 500GB and have the entire transfer fail for every single file because I needed to restart my PC or something. A single error fails the whole transfer, which is why you want to space it out among different hosts. Since you're going over a 2Mb/s upload, make sure you set the blackout period to cover the time people are in the office; that way you won't be backing up while they're in session. The VPN should compress things as needed if you have it set to do so, meaning --compress won't be needed; plus, I think enabling that rsync option breaks BackupPC anyway. You also want to make sure your VPN is set up with UDP, not TCP. Are you using the rsync or rsyncd method? What types of files are you transferring, and to what file system?
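The advice above can be sketched as config.pl fragments. This is a rough illustration, not a tested config; the host name, module name, and paths are made up:

```perl
# Global config.pl: one backup at a time, and stay quiet during
# office hours (weekDays 1-5 = Monday-Friday).
$Conf{MaxBackups} = 1;
$Conf{BlackoutPeriods} = [
    { hourBegin => 8, hourEnd => 18, weekDays => [1, 2, 3, 4, 5] },
];

# Per-"host" override (e.g. pc/documents.pl): each logical host backs
# up one folder tree, so a failure only forces a retry of that slice.
$Conf{XferMethod}     = 'rsyncd';
$Conf{RsyncShareName} = ['documents'];  # hypothetical rsyncd module for that folder
```

Each logical host still needs an entry in the hosts file pointing at the same physical machine.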
[BackupPC-users] My backup method across VPN
Add --checksum-seed=32761 and --inplace to your rsync arguments. I believe those speed it up enough to notice.
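In config terms, the suggestion above means appending to `$Conf{RsyncArgs}`. A sketch (the base argument list varies by BackupPC version, and as the follow-up posts note, --inplace may simply be ignored by File::RsyncP):

```perl
# config.pl is plain Perl, so the existing argument list can be extended.
# --checksum-seed=32761 enables cached block checksums with BackupPC;
# --inplace is the poster's suggestion and may do nothing here.
push @{$Conf{RsyncArgs}}, '--checksum-seed=32761', '--inplace';
```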
[BackupPC-users] What do I need for backing up the pool?
@Josh Malone, what do you mean by that? @Tyler J. Wagner, thanks! I'm going to look at that article and see what I can do. I haven't seen any numbers on the speed difference with and without LVM, so I haven't used it yet. I want to make sure enabling LVM doesn't hurt my current speed; plus, it means I have to get the data off and back on somehow without dd, to make sure it doesn't muck up the newly formatted LVM-backed filesystem.
[BackupPC-users] What do I need for backing up the pool?
Can't I just dd the entire partition, or that one directory, into a file?
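Roughly: dd can image a whole partition into a file, but it has no notion of directories, so a single directory needs an archiver like tar instead. A small runnable sketch, using a plain file in /tmp to stand in for the raw device (with a real pool you'd use something like if=/dev/sdb1, and only on an unmounted filesystem, or the image can be inconsistent):

```shell
set -e
# Create a small sample standing in for a partition's raw bytes.
mkdir -p /tmp/pool-demo
echo "pool data" > /tmp/pool-demo/sample

# dd copies raw bytes; with a block device as if= this images the
# whole partition into a single file.
dd if=/tmp/pool-demo/sample of=/tmp/pool.img bs=4M

# A directory can't be dd'd; archive it with tar instead. GNU tar
# also records hardlinks when both names are in the archive, which
# matters for BackupPC's pool.
tar -czf /tmp/pool.tar.gz -C /tmp pool-demo
```

Note that an image of the partition captures the pool's hardlink structure for free, which is why dd of the device is the usual answer for pool copies.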
[BackupPC-users] What do I need for backing up the pool?
Well, I thought BackupPC only runs on Linux. I'm running BackupPC on Linux; the ZFS filesystem I'm copying this stuff to is on an embedded FreeBSD 7.3. I say embedded because the OS is read-only and most of the normal functionality, such as ports, is missing.
[BackupPC-users] What do I need for backing up the pool?
Over 4 hours for just hosts A through D, not even all the way to Z.
[BackupPC-users] What do I need for backing up the pool?
Things are changing. What I'm hoping is that I can capture whatever I can. I used to have a 700GB pool, which is now only 366GB after I got rid of things I don't need snapshots of (music, pictures, installer files). It still takes freakin' forever, though! Just deleting the files, which I'm doing now as a test, has taken over 4 hours.
[BackupPC-users] My backup method across VPN
Well, it might not be faster for BackupPC, that's for sure. I've gone ahead and removed that part of it.
[BackupPC-users] My backup method across VPN
It's disk write speed, which in my opinion has always plagued rsync. Normally almost none of my Gigabit network bandwidth is in use. I've even swapped all the CAT5 and CAT5e cables for CAT6 to no avail, so it's definitely disk performance. As was stated, though, --inplace might actually make transferring a bunch of small files slower, and BackupPC doesn't use rsync the way everything else does: I think it internally wants to create new files, so maybe it's best to leave this option off.
[BackupPC-users] My backup method across VPN
So would it just turn it off automatically, or did I screw things up?
[BackupPC-users] My backup method across VPN
Oh darn, you guys are right. I'm using the forums, so it's different. Let's see: we're allowed to add the --checksum-seed option, so either that option does nothing, or it does work and the Perl RsyncP module can accommodate more rsync options than those documented. Les Mikesell wrote: On 7/6/2010 11:57 AM, Saturn2888 wrote: So would it just turn it off automatically or did I screw things up? http://search.cpan.org/~cbarratt/File-RsyncP-0.68/lib/File/RsyncP.pm doesn't show --inplace as an option, so I'd guess it doesn't do anything. By the way, posting without quoting any context makes things pretty hard to understand from the mailing list side of the world. -- Les Mikesell lesmikesell at gmail.com
[BackupPC-users] My backup method across VPN
Ah, good point! Hmm... I wonder about -z (--compress) then, because when I turned that on for testing, it ran the same rsync job for days and did not finish on any client. I wonder if anyone's added it successfully, not that you'd ever need or want to.
[BackupPC-users] 5GB log files or more because of some weird issue
I shrunk an ext3 partition and dd'd it over onto an LVM2 volume on Linux software RAID1, then moved /var/lib/backuppc out and deleted the rest of the drive's contents because I'd already copied those manually. Then I upgraded the filesystem to ext4 with some kind of tune2fs commands and ran fsck on it to finalize the change. Since I started it back up, BackupPC keeps producing enormous log files, though backups seem smoother. Note that I also upgraded the machine from Ubuntu Hardy Server running BackupPC 3.1.0 from tormodvolden's repo to Ubuntu Lucid Server running BackupPC 3.1.0 from the official repo. After all of this, I get tons of errors like these:
2010-07-28 02:40:03 ERROR: opendir (/var/lib/backuppc/pc/main/174/ffull-drive/fusr/flib/fperl5/fauto/fText/fCSV_XS) failed
2010-07-28 02:40:03 ERROR: opendir (/var/lib/backuppc/pc/main/173/ffull-drive/fusr/flib/fperl5/fauto/fText/fCSV_XS) failed
The files don't appear, but I'm pretty sure my backups aren't all broken, so I don't know what's going on. I've checked a lot of past snapshots and have restored some data. I don't know what exactly is missing that causes all of these errors, which keep piling up; the next log will probably be even larger. I figure maybe I have to run something like this to hardlink everything again:
# Hardlink duplicate files recursively based on md5 hash comparison
find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
-- The Palm PDK Hot Apps Program offers developers who use the Plug-In Development Kit to bring their C/C++ apps to Palm for a share of $1 Million in cash or HP Products. 
Visit us here for more details: http://p.sf.net/sfu/dev2dev-palm ___ BackupPC-users mailing list BackupPC-users@lists.sourceforge.net List:https://lists.sourceforge.net/lists/listinfo/backuppc-users Wiki:http://backuppc.wiki.sourceforge.net Project: http://backuppc.sourceforge.net/
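For reference, the ext3-to-ext4 conversion alluded to above is usually done with the commands below, shown against a placeholder device name (/dev/vg0/backuppc is made up). They irreversibly modify the filesystem, so run them only on an unmounted volume you have backed up:

```shell
# Enable the ext4 on-disk features on the existing ext3 filesystem:
#   tune2fs -O extents,uninit_bg,dir_index /dev/vg0/backuppc
# A forced fsck is then required to rebuild the metadata:
#   e2fsck -fDp /dev/vg0/backuppc
# Mount as ext4 afterwards (and update /etc/fstab to match):
#   mount -t ext4 /dev/vg0/backuppc /var/lib/backuppc
```

One caveat: files written before the conversion keep their old block mapping; only new files use extents.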
[BackupPC-users] 5GB log files or more because of some weird issue
Not gigabytes, it was megabytes; but after searching this, either those files never got backed up or something else happened to them. Strangely, even through all of this, the pool has only grown 3GB since I first turned it back on. I'm very confused: missing files should lower the total space taken up by BackupPC's pool, and as it finds new files to add, each machine should take a lot longer to back up. I still don't know if there's a solution, but I'm more interested in why the files disappeared. Either they were gone before and I never noticed, or they disappeared during the conversion to ext4. My assumption is that a bunch of files in the pool got corrupted because too many things were going on at once.
[BackupPC-users] 5GB log files or more because of some weird issue
An update: it's now doing it for newer files too, and my log files are only growing. What's with ext4? Is there a way to check if something's amiss? fsck says the filesystem is clean.
[BackupPC-users] BackupPC 3.2.0 released
Can't wait to find a backport for Lucid :)
[BackupPC-users] 5GB log files or more because of some weird issue
To make it easier for people to know what I'm talking about, here are the errors I'm getting in my logs. There are a bunch like these, for different folders on different hosts:
2010-07-28 02:40:03 ERROR: opendir (/var/lib/backuppc/pc/main/174/ffull-drive/fusr/flib/fperl5/fauto/fText/fCSV_XS) failed
2010-07-28 02:40:03 ERROR: opendir (/var/lib/backuppc/pc/main/173/ffull-drive/fusr/flib/fperl5/fauto/fText/fCSV_XS) failed
[BackupPC-users] BackupPC 3.2.0 released
You wouldn't want a backport from me; I break way too much stuff. I was hoping someone else would do it, like Tolaris (http://www.tolaris.com/apt-repository/) or you. Tyler J. Wagner wrote: If you do, post it here please. If you haven't within a month, I'll repackage it myself. Regards, Tyler
[BackupPC-users] 5GB log files or more because of some weird issue
I don't think the RAM is bad, but I can run a check on it. The directories no longer exist, but I swear they must have. After a lot of observation: it's creating new directories, and after 2 full backups the errors stop (duh, because rsync is no longer looking back at the extra full backup). I also noticed that backup 0, or the lowest-numbered one, always seems to have the data even when newer ones do not, even on hosts where no data has changed in months. My suspicion is that the inodes got messed up and only one was kept, meaning all the old hardlinks went away and only the first (original) one survived. That's coming from someone who knows nothing about ext4, though, so I could be wrong. I'm running fsck -f; it's still on the first pass. I'll update you when it finishes. Les Mikesell wrote: Do those directories exist? Can you read them as the backuppc user? Have you tried an fsck on the partition? 'Clean' means it was unmounted before the last shutdown, not that the structure is correct. Use -f to force a check anyway. Bad RAM can cause this sort of problem too. -- Les Mikesell lesmikesell at gmail.com
[BackupPC-users] 5GB log files or more because of some weird issue
fsck finished with no problems. Here's the thing: those errors are coming up when they should not. In the specific case I was looking at, I had rsync excluding some directories and recently changed the excludes, thinking they weren't being applied (oops). So the errors shouldn't have appeared in the first place. Other data seems to show that stuff is missing, but I don't know for sure. The stuff is definitely there now, but I don't know whether it never got backed up, got deleted, or had some strange inode problem during the conversion to ext4.
[BackupPC-users] 5GB log files or more because of some weird issue
Craig Barratt wrote: Sorry, I haven't been following this thread, so my questions might be off base. These errors aren't directly from BackupPC; they come from rsync (i.e. the client), inside flist.c: rsyserr(FERROR_XFER, errno, "opendir %s failed", full_fname(fbuf)); Normally I wouldn't expect the rsync client to be reading the BackupPC data tree.
Craig Barratt wrote: Are you backing up the local host (i.e. the BackupPC server)? -- I am backing up localhost using rsyncd, but that's not the only host with errors like these.
Craig Barratt wrote: If so, are you accidentally recursively backing up the BackupPC store? That will take a long time. -- Nope. /var/lib/backuppc (Ubuntu Lucid) is excluded, and I even checked to make sure.
Craig Barratt wrote: Or are you trying to back up the BackupPC server onto another machine? -- I was doing that previously, which is why I set up this machine with RAID1.
Craig, I think you found the reason for the problem. BackupPC uses RsyncP. It's possible the version I was using on Hardy wasn't as new, although then it should have been working properly, I'd think. Maybe Tolaris compiled 3.1.0 for Hardy differently than the Ubuntu packagers did for Lucid; I'd been using Tolaris's backport of 3.1.0 before the upgrade, and afterward I believe it switched me to the Lucid repository's version.
[BackupPC-users] BackupPC 3.2.0 released
Trey Nolen wrote: And what about rsync 3 support in File::RsyncP? There's progress. Most of my BackupPC work recently has been on 4.x, the next major version. Several months ago there were various emails about the architecture. Yes, it will support rsync 3.x and extended attributes. I've taken a different approach to rsync support, and File::RsyncP is no longer used. The new setup uses a slightly modified native rsync 3.x on the server side, with a Perl emulation layer on all file system IO that maps to the BackupPC store. The advantage is that all the usual rsync options are available, and it's native C code (until it hits the Perl emulation layer, of course). And, yes, there will be no hardlinks in 4.x...

I should probably look back at the archives for this, but no hardlinks? I thought that was how the pool was able to save disk space -- an essential feature (at least for us). Is this going to be implemented in some other way? Trey Nolen

There were a few of us going on about BackupPC 4 in a thread early this year. There should be a lot of good information in there. I too have forgotten what was discussed, so if you find it, please link us.
[BackupPC-users] BackupPC 3.2.0 released
I'm also assuming this means it'll be harder to access the data outside of BackupPC itself as well.

Craig Barratt wrote: Trey writes: I should probably look back at the archives for this, but no hardlinks? I thought that was the way the pool was able to save disk space -- an essential feature (at least for us). Is this going to be implemented in some other way? Yes, pooling continues to work. Reference counting is done at the application level (using a simple database) rather than at the file system level (using hardlinks). This will make the BackupPC data store a lot easier to copy and back up. Craig
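Craig's contrast between filesystem-level and application-level reference counting can be seen in miniature with scratch files in a temp directory. This is only an illustration of the two mechanisms, not BackupPC's real on-disk layout, and it assumes GNU stat:

```shell
# 3.x-style pooling: each backup holds a hardlink to the pool file,
# and the filesystem itself maintains the reference count.
pool=$(mktemp -d)
printf 'file data\n' > "$pool/poolfile"
ln "$pool/poolfile" "$pool/backup1"
ln "$pool/poolfile" "$pool/backup2"
stat -c %h "$pool/poolfile"      # prints "3": the pool copy plus 2 backups
# 4.x-style (per Craig's description, sketched only): backups would record
# a digest naming the pool file, and a small application-level database
# would keep the reference count instead of the filesystem.
```

The practical upshot Craig notes follows from this: with no hardlinks, a plain recursive copy of the store preserves everything, whereas copying a hardlink forest without `-H` explodes the pool.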
[BackupPC-users] BackupPC 3.2.0 released
Sorry to be off-topic, but since you're bringing it up, can you link to it too, please?

Jeffrey J. Kosowsky wrote: Saturn2888 wrote at about 16:17:30 -0400 on Friday, August 6, 2010: I'm also assuming this means it'll be harder to access the data outside of BackupPC itself as well. Hopefully, a FUSE filesystem frontend could be used to transparently access individual backups. BTW, I find backuppc-fuse to be very useful even in the 3.x series, since it allows me to access any backup from the command line without file mangling, with proper ownership/permissions, and with incrementals all filled out.
[BackupPC-users] 5GB log files or more because of some weird issue
Logs are now anywhere from 16MB to 30MB. The problem is only growing, and I don't know why. If those are rsync errors, why am I seeing them only since the upgrade to Lucid? From what I've found, they're benign, but log sizes that large even compressed are pretty scary, since they were always about 2kB.
[BackupPC-users] BackupPC 3.2.0 released
Google is not my friend. Plus, I've been finding it harder to locate actually useful content on it in the last month; backuppc-fuse is one example, because I've searched for it time and time again. The reason I asked for a link is so that I and whoever else is interested know exactly which backuppc-fuse module you're referring to. Personally, all I have is backuppc_ls, and I don't have a link to it as I don't know where I got it.

Jeffrey J. Kosowsky wrote: I believe I found it on the Wiki somewhere. If not, Google is your friend... Saturn2888 wrote at about 03:42:24 -0400 on Monday, August 9, 2010: Sorry to be off-topic, but since you're bringing it up, can you link to it too please?
[BackupPC-users] 5GB log files or more because of some weird issue
Naw, the excludes are working just fine. The files popping up aren't excluded. In one case they were, but I fixed that; it was just one entry which was broken.

Les Mikesell wrote: On 8/9/2010 12:27 PM, Saturn2888 wrote: Logs are now anywhere from 16MB to 30MB. The problem is only growing, and I don't know why. Are the entries still like the ones where Craig responded that they were coming from the remote rsync but referring to the local BackupPC archive? Maybe your exclude syntax is wrong. -- Les Mikesell
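Les's "maybe your exclude syntax is wrong" is worth unpacking: rsync excludes are matched against paths relative to the transfer root, and a leading slash anchors the pattern there, so `/var/lib/backuppc` only works when `/` is the share root. A toy sketch of that anchored matching follows; the `matches_exclude` helper is invented for illustration and is not rsync's real matcher:

```shell
# Toy model of an anchored rsync exclude: the pattern must match the
# transfer-relative path itself or a parent directory of it.
matches_exclude() {  # $1 = anchored pattern, $2 = path relative to transfer root
  case "/$2" in
    "$1"|"$1"/*) return 0 ;;   # excluded: exact match or under the excluded dir
    *)           return 1 ;;   # not excluded
  esac
}
matches_exclude '/var/lib/backuppc' 'var/lib/backuppc/pool/0/1' && echo excluded
matches_exclude '/var/lib/backuppc' 'var/lib/backuppc2/x' || echo 'not excluded'
```

A quick real-world check is an rsync dry run of the share with the same exclude, grepping the file list for the path that should be absent.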
[BackupPC-users] 5GB log files or more because of some weird issue
I did an fsck on the entire filesystem; no fix. They are from any rsync or rsyncd operation. I don't know what you mean by "from remote rsync". Is that an application, or are you talking about the protocol? I think I'm missing some related understanding here.

Les Mikesell wrote: On 8/9/2010 2:59 PM, Saturn2888 wrote: Naw, the excludes are working just fine. Those files popping up aren't excluded. So what do some of the other log entries say? And if they are from the remote rsync, have you tried an fsck on the file system where they happen? -- Les Mikesell
[BackupPC-users] BackupPC 3.2.0 released
Your month's almost up. Be ready, Mr.!

Tyler J. Wagner wrote: If you do, post it here please. If you haven't within a month, I'll repackage it myself. Regards, Tyler On Wednesday 04 Aug 2010 14:53:38 Saturn2888 wrote: Can't wait to find a backport for Lucid :)
[BackupPC-users] BackupPC 3.2.0 released
I use Ubuntu Lucid on that machine, so I don't know if it'd be compatible. I'm also not really liking to test things when they're associated with my backup machine, haha. Test is bad, working is good.

B. Alexander wrote: Here is the one I built. Note that I have not had time to test it (i.e. install on my backup machine) yet. I sent email to the Debian dev who maintains backuppc, but have yet to receive a response. Given a few more days of silence, I may go ahead and do an NMU... But with squeeze in freeze, I'm not sure whether it will be accepted into sid... Test and let me know. --b

On Sun, Aug 29, 2010 at 8:42 AM, Saturn2888 (backuppc-forum at backupcentral.com) wrote: Your month's almost up. Be ready, Mr.! Tyler J. Wagner wrote: If you do, post it here please. If you haven't within a month, I'll repackage it myself. Regards, Tyler On Wednesday 04 Aug 2010 14:53:38 Saturn2888 wrote: Can't wait to find a backport for Lucid :)
[BackupPC-users] BackupPC 3.2.0 released
Ok then, how do I get a hold of this version?

B. Alexander wrote: From what I saw (I'm running real Debian) in my 30 seconds of research, I think they use the straight Debian version. Try to install it, and worst case, it will error out. --b

On Sun, Aug 29, 2010 at 4:34 PM, Saturn2888 (backuppc-forum at backupcentral.com) wrote: I use Ubuntu Lucid on that machine, so I don't know if it'd be compatible. I'm also not really liking to test things when they're associated with my backup machine, haha. Test is bad, working is good. B. Alexander wrote: Here is the one I built. Note that I have not had time to test it (i.e. install on my backup machine) yet. I sent email to the Debian dev who maintains backuppc, but have yet to receive a response. Given a few more days of silence, I may go ahead and do an NMU... But with squeeze in freeze, I'm not sure whether it will be accepted into sid... Test and let me know. --b
[BackupPC-users] BackupPC 3.2.0 released
I'm using the forums. I've checked back and forth and haven't seen any attachments.

B. Alexander wrote: It was attached to my last email in this thread. --b

On Mon, Aug 30, 2010 at 8:58 AM, Saturn2888 (backuppc-forum at backupcentral.com) wrote: Ok then, how do I get a hold of this version? B. Alexander wrote: From what I saw (I'm running real Debian) in my 30 seconds of research, I think they use the straight Debian version. Try to install it, and worst case, it will error out. --b
[BackupPC-users] Why is the BackupPC pool size 30GB less than the size on the
Pool is 388.93GB comprising 2139798 files and 4369 directories (as of 2010-09-19 02:38), Pool file system was recently at 77% (2010-09-19 08:23)

But df -h shows:

/dev/mapper/vg-backuppc 576G 420G 128G 77% /var/lib/backuppc

The volume group is only for BackupPC's pool, nothing else. There's a pretty large difference between 390GB and 420GB.
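The roughly 31 GB gap need not be pool data: the status page reports only pooled files, while df also counts everything else on the volume, such as per-host logs and attrib files under pc/ plus filesystem metadata. A quick sketch of the arithmetic, with candidate du targets shown as comments; the paths are the Debian/Ubuntu defaults and are assumptions:

```shell
# The web UI's pool figure vs. df's used figure, from the numbers above.
pool_gb=388.93
used_gb=420
gap=$(awk -v a="$used_gb" -v b="$pool_gb" 'BEGIN { printf "%.2f", a - b }')
echo "unaccounted: ${gap} GB"        # prints "unaccounted: 31.07 GB"
# Candidates for the missing space (paths assume a Debian/Ubuntu layout):
#   du -sh /var/lib/backuppc/cpool   # roughly what the status page reports
#   du -sh /var/lib/backuppc/pc     # per-host logs, attrib files, new backups
#   df -h  /var/lib/backuppc         # adds fs metadata and reserved blocks
```

Comparing those du figures against df's used column usually shows where the extra space lives.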
[BackupPC-users] Installing Backuppc
Try apt-get purge backuppc and see if that helps you. Can someone explain to me why the post box is only 1 line tall?
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
For some reason I get the error "unexpected response rsync version 3.0.7 protocol version 30". The first time I received this error was almost 6 days ago. I haven't restarted the machine; nothing's happened. I have no clue what's caused this.
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
Pedro M. S. Oliveira wrote: Hi, did you do an update on your server/client? If I'm not wrong, that has to do with the Perl rsync module. Cheers, Pedro Oliveira From my android HD2

On 8 Oct 2010 08:50, Saturn2888 (backuppc-forum at backupcentral.com) wrote: For some reason I get the error "unexpected response rsync version 3.0.7 protocol version 30". The first time I received this error was almost 6 days ago. I haven't restarted the machine; nothing's happened. I have no clue what's caused this.

Maybe a long time ago, but no. BackupPC didn't stop functioning, and I only got this error backing up localhost; I probably should've noted that. I haven't logged into the machine for 3 months and noticed this error 6 days ago in the e-mails.
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
I noticed this: *** System restart required *** so I restarted the system. We'll see what happens.
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
Still doing it, but I also saw this:

r...@computer:~# rsync localhost::
rsync: server sent "rsync version 3.0.7 protocol version 30" rather than greeting
rsync error: error starting client-server protocol (code 5) at main.c(1524) [Receiver=3.0.7]
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
I also found it might be some kind of protocol issue with SSH: http://www.gearhack.com/Forums/DisplayComments.php?file=Computer/Linux/rsync_with_server_on_non-standard_SSH_port. I tried -e rsh and it gave an error about needing a key.
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
Anyone know anything about this?

r...@name:~# rsync r...@localhost::
rsync: server sent "rsync version 3.0.7 protocol version 30" rather than greeting
rsync error: error starting client-server protocol (code 5) at main.c(1524) [Receiver=3.0.7]

I've been getting this same error for months now. I wish I knew what was causing it so I could fix it. I have two servers, both with this issue. Both have BackupPC and Zentyal 2.0 installed.
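For what it's worth, a quick way to tell what answered on port 873 is to look at the first line the server sends: a real rsync daemon opens with an "@RSYNCD:" greeting, while the error above suggests the client got rsync's version/usage banner instead, i.e. rsync ran but not in daemon mode. A small sketch of that check (the function and the probe command at the end are my own illustration, and the probe assumes nc/netcat is installed):

```shell
#!/bin/sh
# Classify the first line a server sends on TCP 873. A real rsync
# daemon greets with "@RSYNCD: <protocol>"; anything else (such as
# the "rsync  version 3.0.7  protocol version 30" banner in this
# thread) means rsync answered but was not running as a daemon.
classify_banner() {
  case "$1" in
    "@RSYNCD:"*) echo daemon ;;
    *)           echo not-a-daemon ;;
  esac
}

classify_banner "@RSYNCD: 30.0"                              # -> daemon
classify_banner "rsync  version 3.0.7  protocol version 30"  # -> not-a-daemon

# To probe a live host, feed in the first line from port 873, e.g.:
#   classify_banner "$(printf '' | nc -w 2 localhost 873 | head -n 1)"
```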
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
Saturn2888 wrote: I'm guessing that rsyncd there is the user? Am I thinking something wrong here? How in the world does this command even work? No no, root is the user, so that means rsyncd --daemon is somehow the command being used. Although, when typing on the command line I'd just use --daemon. Confusing!
[BackupPC-users] unexpected response rsync version 3.0.7 protocol version 30
Funny you posted this yesterday because I was actually looking in my e-mail logs for a link to the thread. Apparently it was a mistake in xinetd. In /etc/inetd.conf, I put

rsync stream tcp nowait root /usr/bin/rsync --daemon

instead of

rsync stream tcp nowait root /usr/bin/rsync rsyncd --daemon

because I thought having that rsyncd there was incorrect and was causing rsyncd not to start up properly. I'm guessing that rsyncd there is the user? Am I thinking something wrong here? How in the world does this command even work?
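To answer the question: no, that trailing rsyncd is not a user. In the classic inetd.conf format the fields are service name, socket type, protocol, wait flag, user, server path, and then the server's arguments, where the first argument is what the program sees as its own name (argv[0]). So with rsyncd removed, --daemon slides into the argv[0] slot, rsync effectively starts with no options at all, and that's exactly why it printed a version banner instead of the daemon greeting. A commented sketch of the working line:

```
# /etc/inetd.conf field layout:
# service  sock-type  proto  wait    user  server-path     argv[0]  args...
rsync      stream     tcp    nowait  root  /usr/bin/rsync  rsyncd   --daemon
```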
[BackupPC-users] Another BackupPC Fuse filesystem
How would I cpan install this one?
[BackupPC-users] Another BackupPC Fuse filesystem
I appreciate the warning, but then, what am I going to do about the error messages, having installed from aptitude?
[BackupPC-users] Another BackupPC Fuse filesystem
Let's see how they do :). I thought the script was something simple to put in there and run.
[BackupPC-users] Block-level rsync-like hashing dd?
Was the conclusion that you could or could not rsync a logical volume? And even if you could, what are the speeds to run checksums on a device and compare it with another device? I've got an iSCSI target setup and had a few ideas, all of which failed. Here are my findings on various methodologies or possible solutions to backing up BackupPC's data.

:: lvm2 :: I was hoping to do something like a pvmove but copy the logical volume; that one failed because there's no way to do it.

:: rsync :: I thought about rsyncing devices using --copy-devices, but you'd have to get a patched rsync to do it, and word on the street is it only works well for devices 2GB and under. Copying an lvm snapshot using rsync would still be just as bad, so this also won't work. The only way to get rsync to play nice is to break the job down into multiple operations and save the checksum table to disk instead of memory, or else you will lose all the hard linking. Either that would be another program entirely, like Unison, or it'd require some heavy additions to rsync's codebase. It's just not going to happen. Now, if you had a filesystem that supported block-level dedup on the backup end of all this, it would be more feasible to depend on that alongside rsync, provided it worked like I'm assuming.

:: ddsnap :: Then I thought maybe I could find something else, when I came upon the age-old ddsnap project harking back to the now-defunct Zumastor. Since that project's been out of commission since Hardy, I believe it's a no-go.

:: mdadm :: I could try adding the drive into a Linux Software RAID1 mirror, but that would slow down all transfers: every write would also have to go out to the iSCSI target, which would slow down writes of all these really small files, with hardlinks on top of that. No matter what, I can't imagine any sort of non-block-level sync associated with RAID1 being anywhere near useful over Ethernet while BackupPC is in use. And what if I restart the iSCSI target machine, etc.? Doesn't seem workable.

:: DRBD :: Here comes DRBD. I have absolutely no clue how to use it, and usage examples online are scarce. The upside to DRBD is that it acts like a RAID1 mirror but works at the block level, which speeds up transfers greatly. The next good thing is that it can work asynchronously. Right there I have a capable solution, which may or may not work for me. From what I've seen, it's really designed for high-availability clustering and requires both sides to have DRBD to be useful. Since I'm backing this up to an embedded FreeBSD 7.3 box which does not have DRBD, the iSCSI target comes to mind as a workaround; yet it almost seems impossible because this machine would have to be both the sending and receiving end. I don't know yet if this will really be worthwhile or if I can even get it working.

:: Final Thoughts :: I hope that helps anyone trying to figure out how insane this is. My last idea is to either wait for BackupPC 4, which uses a database, or write my own. I'm banking on the latter because that's the quickest way to get something worthwhile accomplished.
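Since usage examples are scarce, here is a rough sketch of what a DRBD resource definition looks like (DRBD 8.x syntax; the hostnames, devices, and addresses are made up for illustration, and protocol A is the asynchronous mode mentioned above):

```
# /etc/drbd.d/pool.res -- hypothetical names and addresses throughout.
resource pool {
  protocol A;                      # asynchronous replication
  on backuppc-server {             # must match `uname -n` on that node
    device    /dev/drbd0;
    disk      /dev/vg0/backuppc;   # the LV holding the BackupPC pool
    address   192.168.1.10:7789;
    meta-disk internal;
  }
  on mirror-target {
    device    /dev/drbd0;
    disk      /dev/vg0/backuppc;
    address   192.168.1.20:7789;
    meta-disk internal;
  }
}
```

Note that this only confirms the constraint above: both `on` sections describe machines running DRBD, so an embedded FreeBSD box with no DRBD port can't be a peer directly.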
[BackupPC-users] Zlib Compression not working
How do you know it's not working? And how did you install those packages? Did you configure BackupPC to have compression enabled?
[BackupPC-users] DeltaCopy Windows Server Enterprise
Does running cmd /c before the command fix it? And did you replace the DeltaCopy files with the newer ones? I did this myself: go to http://badmarkup.com/deltacopy/ and pick up the four non-setup.exe files from there and replace them in your DeltaCopy folder. That should update a lot of the functions and bring you up to a much newer version of rsync. Tell me if that fixes your issues.
[BackupPC-users] Block-level rsync-like hashing dd?
Have or use bandwidth?
[BackupPC-users] Block-level rsync-like hashing dd?
But none of that solves the issue we're having now. How in the world do we back up the current pool of data?
[BackupPC-users] Block-level rsync-like hashing dd?
Les, that's a pretty good idea, running two, but sadly I cannot do that with these systems. It'd be really nice not to have to take the machine down to do the backup, though. So I guess that's my question: I'm looking for a way to back up the pool over Ethernet while it's running. Doing the backup twice would be great, but getting it done even once is tough. The machine doing it can barely keep up because of the number of rsync operations and hardlinks, and my other machines are even lower-powered, so they'd be far less effective. While it's possible to dd an lvm snapshot, that's 1TB/day, which is a huge amount of time and bandwidth consumed. Is there no other solution like ddsnap? I mean, since my BackupPC pool is on an lvm, doing a dd itself isn't the problem; it's that it's 1TB/day or more, depending on my filesystem size. It'd be really nice to have a solution that syncs only the changes from that day, which I can then snapshot on the other machine. In my case, that other machine is the iSCSI target. I think DRBD is my best bet now, but it's more than a handful to figure out; that's for sure. Has anyone tried DRBD before?
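To put rough numbers on the 1TB/day problem, here's a back-of-envelope sketch (the throughput figures are assumptions, not measurements):

```shell
#!/bin/sh
# Hours needed to move 1 TB (~1,000,000 MB) at a sustained rate in MB/s.
# Integer arithmetic, so results round down.
hours_for_1tb() {
  echo $(( 1000000 / $1 / 3600 ))
}

hours_for_1tb 125   # gigabit Ethernet at wire speed -> about 2 hours
hours_for_1tb 30    # a loaded link with overhead    -> about 9 hours
```

So even at ideal gigabit wire speed, a full dd saturates the link for a couple of hours every day, and at realistic sustained rates it stretches toward a half-day, which is exactly why a changes-only sync looks necessary.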
[BackupPC-users] Block-level rsync-like hashing dd?
My goodness, that's a lot of replies; although almost all of them are from a post I made a while back, which I've elaborated on at least 3 times since. The main point I'm trying to get across, at least for my own setup, is that I cannot handle dd'ing 1TB/day. That would literally take 8-12 hours (at least half a day) easy, depending on network load, and it would cream the available bandwidth. The reason I'm advocating DRBD is that I'd have it going to an iSCSI target on a machine with ZFS, which snapshots the pool. The key there is that I'm taking snapshots, so even if the pool corrupts, I'm fine: all I'd have to do is mount the .img as EXT4 in an iSCSI target and point a client at that target to retrieve the files. Not necessarily the best idea, but I don't see pool corruption as a big deal because I've never experienced anything like that. Even so, I've already arranged the necessary precautions; I just need a method of getting only the changes over there without abusing the available bandwidth. The BackupPC_copyPcPool script and the tar copy are great for backing up the pool, but both suffer the same fate as dd: there's no way to transfer only the changed files, which destroys available bandwidth and leaves drives running almost all day long, slowing down crucial backups of machines. And please note, I did talk about ddsnap as well. While the project's out of commission, the software still exists somewhere.