On Fri, 2006-03-10 at 14:41, Matt wrote:
> > I'm not convinced that there is much you can do to maintain any
> > kind of structured order over a long term when you are adding
> > files from multiple sources simultaneously and expiring them more
> > or less randomly.

> It's not really random!   The data are expiring because a backup of a
> host expires.

You might speed up access to the directory listing you are
expiring, but the data files are still going to be randomly
located.  They will probably have been accumulated by several
simultaneous runs with the first unique copy from any host
being the one that gets saved.
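
To make that concrete, here is a rough sketch of how a content-addressed
pool with hard links behaves (illustrative Python only, not BackupPC's
actual code; the pool path, the hash choice and the directory fan-out
are all assumptions):

    import hashlib
    import os

    POOL_DIR = "/var/lib/backuppc/pool"   # assumed pool location, for illustration

    def pool_file(src_path):
        # Hash the file content; the digest decides where it lives in the pool.
        with open(src_path, "rb") as f:
            digest = hashlib.md5(f.read()).hexdigest()
        pool_path = os.path.join(POOL_DIR, digest[:2], digest)
        os.makedirs(os.path.dirname(pool_path), exist_ok=True)
        if not os.path.exists(pool_path):
            # First unique copy seen from any host, in any run: it becomes
            # the pool copy, located wherever this run happens to write it.
            os.link(src_path, pool_path)
        else:
            # Duplicate content: drop the new copy and hard-link to the pool.
            os.unlink(src_path)
            os.link(pool_path, src_path)
        return pool_path

The point is that each pool entry lives wherever the first run that saw
that content happened to put it, so the on-disk layout bears no relation
to any one host's directory tree.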

> As I said, Dirvish's performance was more than an order of magnitude
> better.  It uses cross-links, but it keeps the original tree structure
> for each host.  To me this shows that there has to be a better way to
> do things, and Dave's proposal seems right on target.

Are you comparing uncompressed native rsync runs to the Perl code
that handles BackupPC's compressed files?  There are more variables
involved than the link structure.  Also, BackupPC may be running
several backups at once, which is a good thing if you have a fast
server and slow or remote clients.
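
On the compression point: matching an incoming file against a compressed
pool entry means decompressing the pool copy before it can be compared,
which plain uncompressed rsync never has to do.  A rough sketch of that
extra work, with zlib standing in for whatever compression the pool
really uses (the helper name and details are made up for illustration):

    import hashlib
    import zlib

    def same_content(plain_path, compressed_pool_path):
        # Digest of the plain candidate file.
        with open(plain_path, "rb") as f:
            plain_digest = hashlib.md5(f.read()).hexdigest()

        # The pool copy has to be decompressed before it can be compared,
        # which is CPU work a plain uncompressed rsync target never pays.
        d = zlib.decompressobj()
        h = hashlib.md5()
        with open(compressed_pool_path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(d.decompress(chunk))
        h.update(d.flush())
        return plain_digest == h.hexdigest()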

-- 
  Les Mikesell
   [EMAIL PROTECTED]
 


