Les Mikesell wrote:
> You still haven't said whether this is the first run where the files are
> actually copied or not - if not, you shouldn't expect much network
> activity. How long would it take the target to read all its files?
> Something like 'time tar -cf - / | cat >/dev/null' would be a reasonable
> test. (Don't do '-cf /dev/null' with GNU tar, because it will cheat and
> not read the files.) That would be the fastest an rsync run could
> possibly complete with the --ignore-times option, even if you don't
> transfer any data or create new files on the server.
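The read-speed test Les describes can be wrapped in a small script that also reports the byte count and elapsed time. This is a minimal sketch, not anything from BackupPC itself: the TARGET path is a placeholder (point it at the tree you actually back up), and the sample-tree builder exists only so the script runs standalone.

```shell
#!/bin/sh
# Time how fast tar can read a directory tree. This approximates the
# floor for any rsync/BackupPC run that must read every file, e.g.
# with --ignore-times in effect.
# TARGET is a placeholder -- substitute the real backup source path.
TARGET=${1:-/var/tmp/readtest}

# Build a small sample tree if the target does not exist yet, so the
# script is runnable as-is.
if [ ! -d "$TARGET" ]; then
    mkdir -p "$TARGET"
    i=0
    while [ "$i" -lt 50 ]; do
        dd if=/dev/zero of="$TARGET/file$i" bs=1k count=64 2>/dev/null
        i=$((i + 1))
    done
fi

# Pipe through cat so GNU tar cannot detect /dev/null and skip the reads.
start=$(date +%s)
tar -cf - "$TARGET" 2>/dev/null | cat >/dev/null
end=$(date +%s)

bytes=$(du -sk "$TARGET" | awk '{print $1 * 1024}')
elapsed=$((end - start))
[ "$elapsed" -eq 0 ] && elapsed=1   # avoid divide-by-zero on tiny trees

echo "read $bytes bytes in ${elapsed}s (~$((bytes / 1024 / elapsed)) KB/s)"
```

If the reported rate here is no better than what the backup achieves, the bottleneck is the client's disk reads, not the network or the BackupPC server.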
This is the initial backup of the server. I am only running one session at a
time until I solve this bandwidth problem.

>> Now if you tell me my hardware isn't fast enough, the BackupPC server
>> is a dual Opteron 2.2 GHz with 8 GB RAM and 24 300GB drives in a 3ware
>> RAID5 array, it isn't.
>
> Either end can limit the speed. How many concurrent runs do you do?
> Also, you should expect much faster rates if you have a few large files
> than if you have millions of tiny ones.

I am retrying the same server with --ignore-times, but it doesn't seem to
be any faster.

-- 
Jeremy Mann
jer...@biochem.uthscsa.edu

University of Texas Health Science Center
Bioinformatics Core Facility
http://www.bioinformatics.uthscsa.edu
Phone: (210) 567-2672

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/