Hi,

> > because you might run into an O(n^2) issue ...
>
> I don't really care; this is not a performance-critical task. You might
> run it once a month and it can take many hours, no problem, and the
> above algorithm might be somewhere in O(n), maybe a little worse.
I do not see a performance problem. One of the basic assumptions is that
duplicate files must be of the same size, so working from a size-sorted
list of files looks linear.

For an interesting look at the files of the same size:

bash# find /TheServers -type f -printf '\n%s\t%p' | sort -rn | less

best
ragnar
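A minimal sketch of that idea (my reading of the approach, not anything from the thread: the sample directory and file names below are made up for illustration). It lists files size-first, sorts numerically as in the command above, and then picks out only the sizes that occur more than once, i.e. the duplicate candidates. Requires GNU find for `-printf`.

```shell
#!/bin/sh
# Illustration only: build a throwaway directory with two same-size
# files and one unique-size file, then show the size-based filtering.
set -e
dir=$(mktemp -d)
printf 'aaaa' > "$dir/one"
printf 'bbbb' > "$dir/two"     # same size (4 bytes) as "one"
printf 'cc'   > "$dir/three"   # unique size, cannot be a duplicate

# Size-sorted listing, largest first, as in the thread's command.
find "$dir" -type f -printf '%s\t%p\n' | sort -rn

# One linear pass over the sorted sizes: keep only sizes seen more
# than once. Only files in these groups need byte-level comparison.
find "$dir" -type f -printf '%s\n' | sort -n | uniq -d

rm -rf "$dir"
```

The second pipeline is why the whole job stays roughly linear: sorting groups equal sizes together, and `uniq -d` emits each repeated size once, so expensive content comparison is confined to those small groups.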
