Just wondering if anyone has done testing, or seen documentation from any 
dedupe vendor, on enabling checkpoints for backups that are being 
deduplicated? I would think that injecting a checkpoint every X minutes into 
the data stream would interrupt the continuity of the data and make it look 
more unique, hurting dedupe ratios, but I'm wondering by how much. Most, if 
not all, of the variable-length guys have the ability to 're-align' 
themselves to the start of files, so I'd expect the effect to be more 
pronounced on large files than on your average server, but I'm just thinking 
out loud.
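
For what it's worth, here's a toy sketch of content-defined (variable-length) chunking that illustrates the re-alignment behavior. This is not any vendor's actual algorithm; the window size, boundary mask, and minimum chunk length are made-up parameters. Because cut points depend only on a rolling window of local content, an inserted blob (like a checkpoint marker) only disturbs the chunks around the insertion point, and the chunker re-synchronizes downstream:

```python
# Toy content-defined chunker -- a sketch, NOT any dedupe vendor's real
# algorithm -- showing why variable-length engines "re-align" after
# inserted data such as a checkpoint marker.
import random
import zlib

def chunk(data: bytes, window: int = 16, mask: int = 0x1FF,
          min_len: int = 64) -> list:
    """Cut a boundary wherever the CRC32 of the trailing `window` bytes
    matches `mask`, so cut points depend only on local content."""
    chunks, start = [], 0
    for i in range(len(data)):
        end = i + 1
        if end - start >= min_len and end >= window:
            if (zlib.crc32(data[end - window:end]) & mask) == mask:
                chunks.append(data[start:end])
                start = end
    if start < len(data):
        chunks.append(data[start:])
    return chunks

random.seed(1)
stream = bytes(random.randrange(256) for _ in range(20000))

# Simulate a checkpoint marker injected mid-stream.
modified = stream[:9000] + b"CHECKPOINT" * 4 + stream[9000:]

chunks_a, chunks_b = chunk(stream), chunk(modified)
shared = len(set(chunks_a) & set(chunks_b))
print(f"{shared} of {len(chunks_a)} original chunks still dedupe")
```

With fixed-block dedupe, by contrast, the 40-byte insertion would shift every block boundary after it, so nothing downstream of the marker would match.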

Anyone seen a recommendation or actually tested themselves?


Barclays             www.barclaycardus.com


_______________________________________________
Veritas-bu maillist  -  Veritas-bu@mailman.eng.auburn.edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu
