Status: New
Owner: ----
Labels: Type-Defect Priority-Medium

New issue 1660 by alleonik: djblets.utils.misc.cache_memoize mishandles large data in some cases
http://code.google.com/p/reviewboard/issues/detail?id=1660

*NOTE: Do not post confidential information in this bug report.*

What version are you running?
1.0.5.1
Djblets-0.5.5-py2.5

If a review contains many changes in a big file, I get the following strings in the logs:

2010-05-20 19:57:30,058 - INFO - Cache miss for key
reviewboard.xxx.ru:diff-sidebyside-hl-8309-0.
2010-05-20 19:57:30,059 - WARNING - Failed to fetch large data from cache
for key reviewboard.xxx.ru:diff-sidebyside-hl-8309: .

I looked into the code and found that the following code from Djblets-0.5.5-
py2.5.egg/djblets/util/misc.py contains several issues:

def _cache_fetch_large_data(cache, key):
    chunk_count = cache.get(key)
    data = []

    chunk_keys = ['%s-%d' % (key, i) for i in range(int(chunk_count))]
    chunks = cache.get_many(chunk_keys)
    for chunk_key in chunk_keys:
        try:
            # 1st issue: reads the first element of a list
            data.append(chunks[chunk_key][0])
        except KeyError:
            logging.info('Cache miss for key %s.' % chunk_key)
            raise MissingChunkError

    data = ''.join(data)

    data = zlib.decompress(data)
    try:
        unpickler = pickle.Unpickler(StringIO(data))
        data = unpickler.load()
    except Exception, e:
        logging.warning("Unpickle error for cache key %s: %s." % (key, e))
        raise e

    return data

def _cache_store_large_data(cache, key, data, expiration):
......
    file = StringIO()
    pickler = pickle.Pickler(file)
    pickler.dump(data)
    data = file.getvalue()
    data = zlib.compress(data)

    i = 0
    while len(data) > CACHE_CHUNK_SIZE:
        chunk = data[0:CACHE_CHUNK_SIZE]
        data = data[CACHE_CHUNK_SIZE:]
        # 2nd issue: saves the chunk as a string, not a list
        cache.set('%s-%d' % (key, i), chunk, expiration)
        i += 1
    # 3rd issue: saves the final chunk as a list, not a string
    cache.set('%s-%d' % (key, i), [data], expiration)

    cache.set(key, '%d' % (i + 1), expiration)

As we can see, the types used when saving and reading the data are not
compatible. But that is not all: in the 3rd case, if len(data) is close to
CACHE_CHUNK_SIZE, wrapping the final chunk in a list means cache.set may try
to store an object larger than one megabyte, and fail.
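
To make the size overhead concrete, here is a small hypothetical sketch (the
CACHE_CHUNK_SIZE value is a stand-in for the real constant, and it assumes a
memcached-style backend that stores str values as-is but pickles everything
else):

import pickle

CACHE_CHUNK_SIZE = 2 ** 20 - 1024  # stand-in for the real constant

chunk = 'x' * CACHE_CHUNK_SIZE             # final chunk near the limit
raw_size = len(chunk)                      # a str value is stored as-is
wrapped_size = len(pickle.dumps([chunk]))  # a list is pickled first

# wrapped_size > raw_size, so a chunk that fits on its own can exceed
# memcached's one-megabyte value limit once wrapped in a list.
print raw_size, wrapped_size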

I changed this code so that it saves and reads plain strings: in the 1st case
I removed the [0], and in the 3rd I removed the square brackets.
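
For reference, here is a minimal sketch of the two functions with that fix
applied. It assumes the module's existing imports and definitions (logging,
zlib, pickle, StringIO, CACHE_CHUNK_SIZE, MissingChunkError); the elided head
of _cache_store_large_data is left elided, and only the commented lines
change:

def _cache_fetch_large_data(cache, key):
    chunk_count = cache.get(key)
    data = []

    chunk_keys = ['%s-%d' % (key, i) for i in range(int(chunk_count))]
    chunks = cache.get_many(chunk_keys)
    for chunk_key in chunk_keys:
        try:
            # Fixed: every chunk is now stored as a plain string.
            data.append(chunks[chunk_key])
        except KeyError:
            logging.info('Cache miss for key %s.' % chunk_key)
            raise MissingChunkError

    data = ''.join(data)

    data = zlib.decompress(data)
    try:
        unpickler = pickle.Unpickler(StringIO(data))
        data = unpickler.load()
    except Exception, e:
        logging.warning("Unpickle error for cache key %s: %s." % (key, e))
        raise e

    return data

def _cache_store_large_data(cache, key, data, expiration):
    # ... (elided setup as in the original) ...
    file = StringIO()
    pickler = pickle.Pickler(file)
    pickler.dump(data)
    data = file.getvalue()
    data = zlib.compress(data)

    i = 0
    while len(data) > CACHE_CHUNK_SIZE:
        chunk = data[0:CACHE_CHUNK_SIZE]
        data = data[CACHE_CHUNK_SIZE:]
        cache.set('%s-%d' % (key, i), chunk, expiration)
        i += 1

    # Fixed: store the final chunk as a plain string, like the others.
    cache.set('%s-%d' % (key, i), data, expiration)

    cache.set(key, '%d' % (i + 1), expiration)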
