Duncan Booth wrote:
> I think you probably are correct. The only thing I can think of that might help is to catch all the situations where changes to the dependent values might change the hash and wrap them up: before changing the hash, pop the item out of the dict, then reinsert it after the change.

That would probably require a lot of uncomfortable signal handling, especially for a piece of functionality I'd like to keep as unobtrusive as possible in the application.
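
The wrapper itself would probably be simple enough; this is only a rough
sketch with made-up cache/key names, and the hard part would still be
finding every spot where the hash-relevant fields change:

from contextlib import contextmanager

@contextmanager
def rehashed(cache, key):
    # Pop the entry before its hash-relevant fields are mutated, then
    # reinsert it under the (possibly new) hash afterwards.
    value = cache.pop(key)
    try:
        yield value
    finally:
        cache[key] = value

# usage (the fields feeding the hash only change inside the block):
# with rehashed(fragment_cache, fragment):
#     fragment.text = "new contents"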

> Alternatively, give up on defining __hash__ and __eq__ for FragmentInfo and rely on object identity instead.

Object identity wouldn't work so well for caching: objects would always be drawn as they looked the first time they were cached, and no updates would be shown until the objects were flushed from the cache.

I've been experimenting with a list-based cache now, and I can't say I'm noticing any change in performance with a cache of 100 items. I'm still using the hash to "freeze" a sort of object tag in order to detect changes, and I require both hash equality and object equality for a cache hit, like so:

def index(self, key):
    # Compare against the hash frozen when the item was cached, so an
    # entry whose key has changed since insertion never counts as a hit.
    h = hash(key)
    for i, item in enumerate(self.items):
        if item.hash == h and item.key == key:
            return i
    raise KeyError(key)

This seems to do what I want and does OK performance-wise.
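
For context, each cached item just stores the key alongside the hash that
was frozen when it was inserted; roughly like this (the class name is
illustrative, not my exact code):

class CacheItem(object):
    # Remember the hash computed at insertion time; if the key's fields
    # change later, the stored hash no longer matches hash(key) and the
    # lookup above treats the entry as stale.
    def __init__(self, key):
        self.key = key
        self.hash = hash(key)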

Thanks again!

/Joel