There are products out there that take this one step further by storing the data in "chunks" rather than files, and storing only one copy of each chunk. They use an algorithm on the clients to identify the chunks, and claim excellent performance. I've been to a presentation from Avamar, but I don't know anyone using it, so I can't speak to its reliability or performance. http://www.avamar.com/
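Roughly, the idea is something like this (a minimal Python sketch of a chunk store; the fixed 64 KB chunk size and SHA-256 are just assumptions for illustration, not necessarily what Avamar actually does):

import hashlib

CHUNK_SIZE = 64 * 1024  # assumed fixed size; real products often use variable-size chunks

def backup_file(path, chunk_store, file_index):
    """Split a file into chunks; store each unique chunk only once, keyed by its hash."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            # The client only ships the chunk if no client has sent it before.
            if digest not in chunk_store:
                chunk_store[digest] = chunk
            hashes.append(digest)
    # The "backup" of the file is just the ordered list of chunk hashes.
    file_index[path] = hashes

The payoff is that two clients with mostly identical files send (and the server stores) each shared chunk exactly once, which is why these products can claim such good performance over slow links.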
Doug Thorneycroft
County Sanitation Districts of Los Angeles County
(562) 699-7411 Ext. 1058
FAX (562) 699-6756
[EMAIL PROTECTED]

-----Original Message-----
From: Coats, Jack [mailto:[EMAIL PROTECTED]
Sent: Monday, September 13, 2004 1:25 PM
To: [EMAIL PROTECTED]
Subject: Wishlist Item

Yes, I know I am dreaming, but...

An open source program on SourceForge (pcbackup or BackupPC, something like that) has a very nice feature: if a file is already backed up, it keeps only one copy of that file for ALL its clients!

What technique does it use to figure out that the files are identical without comparing them byte for byte? I didn't research it that far, but I assume it uses something like file size plus a checksum of some kind.

Anyway, if the client computers you are backing up are substantially identical, keeping one copy rather than N is better than any compression known to man! It would mean another field or so in the database for every file, but it might be worth it, at least as an option.

... Jack
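For what it's worth, the size-plus-checksum scheme Jack guesses at would look something like this in Python (the helper names are illustrative, and the MD5 choice is an assumption on my part, not BackupPC's actual code):

import hashlib
import os

def file_key(path):
    """Candidate identity key: file size plus a checksum of the full contents."""
    h = hashlib.md5()  # assumption: any strong-enough digest works for pooling
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return (os.path.getsize(path), h.hexdigest())

def store_once(path, pool):
    """Keep one copy per unique (size, checksum); later clients get a reference."""
    key = file_key(path)
    if key not in pool:
        pool[key] = path  # first copy seen becomes the pooled master
        return "stored new copy"
    return "already pooled as " + pool[key]

Checking the size first is the cheap filter; the checksum is what lets you skip the byte-for-byte comparison across N clients, at the cost of one extra indexed field per file in the database.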
