Les Mikesell wrote:
> Richard Hansen wrote:
>> Apparently not -- both of my clients already have rsync 3.0.5 installed,
>> yet rsync is causing them to run out of memory.  The clients have 4 to 6
>> million (largely redundant) files each.  It appears that I need the
>> incremental-recursion feature of protocol 30 to back up this many files.
>
> If they are grouped in several subdirectories, you could break the
> backups into separate runs.  If they are all in one directory, even
> protocol 30 probably won't help.
The files are in several subdirectories.  I was contemplating breaking up
the run, but there's a complication: the set of subdirectories (underneath
the only directory where it makes sense to split up the run) changes over
time.  It's a slow change, so maybe I can keep a sharp eye out and manually
adjust RsyncShareName as subdirectories are added and removed (blech).

> Using tar as the xfer method would avoid the issue with the tradeoff
> that you use more bandwidth for full runs and don't reflect changes
> quite as accurately in increments.

I've switched to tar for now, and I'm hoping it will prove to be an
adequate solution.

Thanks for your help,
Richard

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
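P.S. The manual RsyncShareName bookkeeping could probably be scripted instead of eyeballed. A rough sketch (the data-root path and function name below are hypothetical, and feeding the output back into the BackupPC config would still need a wrapper of your own):

```shell
#!/bin/sh
# Sketch: list the current first-level subdirectories of a data root,
# one per line, so each can become its own rsync run (or its own
# RsyncShareName entry). Regular files at the top level are skipped.
list_backup_shares() {
    # -mindepth 1 excludes the root itself; -maxdepth 1 stops at one level.
    find "${1:?usage: list_backup_shares DIR}" -mindepth 1 -maxdepth 1 -type d | sort
}

# Example (placeholder path):
#   list_backup_shares /srv/data
```

Regenerating the list right before each backup run would pick up added and removed subdirectories automatically, rather than relying on someone noticing the change.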