GrayHat wrote:
> > My thought is that the "TTL" would only be in effect for the purpose
> > of keeping BlockReporting working (for however many days or weeks
> > you wish the emails to be guaranteed resendable). After that time,
> > the TTL is null and the files are fair game for replacement. I
> > thought it a simple idea for working around the BlockReporting
> > problem Thomas mentioned.
>
> I see, but there's no need to store something along with the files;
> the regular filesystem timestamp for each file will work just fine:
> just remove all files if (today - filetime) > TTL.

That's what I was trying to say... I don't recall saying anything about
"storing something along with files". =)

> > On a low-to-medium traffic box, though, this would not be a problem.
> > We already deal with bunches of identical messages from time to time
> > (nothing new).
>
> there may be a solution for that too: assuming the spam and notspam
> folders get cleaned up using the TTL, the files may be saved using
> (e.g.) an MD5 hash (or the like) as the name, so that identical
> messages won't be stored more than once; by the way, that may have
> some side effects and may need some more thinking, but...

I think you're thinking of one thing and I'm thinking of another. I'm
not really suggesting the TTL be used to manage the corpus; I'm
suggesting the TTL be used only to fix the problem Thomas brought up:

Begin quote Thomas:
Sorry - I forgot to say - 1) and 2) will break the BlockReporting
"resend" function (!!!) - because we will delete files randomly.
End quote

So, going off of what Thomas said, he's talking about deleting files
randomly. But if a check were added before the random delete so that it
skips any file that has not yet passed the TTL, I think BlockReporting
could continue to work. Once those files do pass the TTL, they become
fair game for random deletion again. That's the way I see it.
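Just to illustrate what I'm picturing, here's a rough, untested sketch
(Python only to show the logic, not actual ASSP code; the folder path,
the 30-day TTL and the 20,000-file cap are example values I made up):

import os
import random
import time

def prune_corpus(folder, ttl_days=30, keep_at_most=20000):
    """Randomly delete corpus files, but never touch files younger
    than the TTL, so BlockReporting can still resend them."""
    now = time.time()
    ttl_seconds = ttl_days * 86400

    all_files = [os.path.join(folder, name) for name in os.listdir(folder)]
    all_files = [path for path in all_files if os.path.isfile(path)]

    # Files older than the TTL are expendable; newer ones are protected.
    expendable = [path for path in all_files
                  if now - os.path.getmtime(path) > ttl_seconds]

    excess = len(all_files) - keep_at_most
    if excess <= 0:
        return

    # Random delete, but only ever picking from the expired files.
    for path in random.sample(expendable, min(excess, len(expendable))):
        os.remove(path)

The point is simply that the random delete only ever chooses among the
files that have already outlived the TTL.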
> > Just an idea, but how do you "NOT" discard data while keeping
> > rebuild times low and maintaining free hard drive space
> > (realistically)?
>
> Using some kind of "digest" of the previous bases, stored in a more
> compact format

Ahh, I see. That way the actual email files would not necessarily need
to be kept; you would use the digest file (or files) to aid in the
newest compilation. This might even speed up the rebuild process. You
could have daily, weekly, bi-weekly or monthly digest files, but then
there would need to be a kind of TTL for the digest files too, so that
they do not grow too large in number. The only problem I see with that
is: how long do the digest files remain valid/pertinent?

Kind Regards,
Brett
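P.S. To make the "digest" idea a little more concrete, here is the kind
of thing I'm picturing. It's purely hypothetical (Python again, just to
show the shape of it); the JSON format, the file naming and the 90-day
cut-off are all invented and aren't meant to describe how the rebuild
works today:

import json
import os
import time
from collections import Counter

def write_daily_digest(token_counts, digest_dir):
    """Fold one day's token counts (a Counter of token -> count)
    into one small JSON digest file."""
    name = time.strftime("digest-%Y%m%d.json")
    with open(os.path.join(digest_dir, name), "w") as handle:
        json.dump(dict(token_counts), handle)

def merge_digests(digest_dir, max_age_days=90):
    """Merge all digests younger than max_age_days into one Counter,
    so older raw mails can be discarded without losing their stats."""
    merged = Counter()
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(digest_dir):
        path = os.path.join(digest_dir, name)
        if not name.endswith(".json") or os.path.getmtime(path) < cutoff:
            continue  # digest past its own TTL: skip (or delete) it
        with open(path) as handle:
            merged.update(json.load(handle))
    return merged

A rebuild could then start from merge_digests() and only re-tokenize
the mails that are still inside the TTL window. The open question still
stands, though: how long any one digest stays pertinent is really a
policy decision (the 90 days above is just a placeholder).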