> Generally, trying to manage memory yourself inside an application can
> work against you. See the ArchitectNotes for Varnish for a general
> overview of why and how:
Yeah, I'm kind of concerned about it too, since there are so many
details in the way (Python's internal memory buffers, the C library's
allocation behavior, the application's own use of objects, etc.).
Promising an auto-tweaking cache which behaves poorly would be worse
than advising developers to tweak their cache size to fit their needs.

> For something like Storm though, I suspect having an externally
> managed cache which has a proven track record would only do you good.
> For this reason, I would suggest creating/investigating a
> 'MemcachedCache' if such a thing doesn't exist yet.

That's something to be considered indeed, but let's please not mix
these two conversations together. The current cache mechanism prevents
objects from being deallocated, effectively saving the cost of
instantiation, and that's a very different scenario from preventing
the database from being hit by caching data in a memory-mapped
database.

-- 
Gustavo Niemeyer
http://niemeyer.net

-- 
storm mailing list
[email protected]
Modify settings or unsubscribe at: https://lists.ubuntu.com/mailman/listinfo/storm
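[Editorial note: the distinction drawn above can be sketched in code. The
following is a minimal illustration, not Storm's actual Cache implementation:
an in-process object cache of this kind only holds strong references to the
most recently used objects so the garbage collector cannot deallocate them,
saving re-instantiation cost. By itself it never avoids a database hit, which
is what an external data cache such as memcached would do. The `ObjectCache`
name and API are hypothetical.]

```python
from collections import OrderedDict

class ObjectCache(object):
    """Keep strong references to the N most recently used objects.

    This saves the cost of rebuilding them; it does not cache data
    and therefore does not by itself prevent database hits.
    """

    def __init__(self, size=100):
        self._size = size
        self._refs = OrderedDict()  # key -> strong reference

    def add(self, key, obj):
        # Re-adding moves the object to the "most recent" end.
        self._refs.pop(key, None)
        self._refs[key] = obj
        # Drop the least recently used reference; the object is only
        # actually deallocated if nothing else still references it.
        while len(self._refs) > self._size:
            self._refs.popitem(last=False)

    def get(self, key):
        obj = self._refs.get(key)
        if obj is not None:
            self.add(key, obj)  # refresh recency
        return obj
```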
