Hello community,

here is the log from the commit of package python-pytools for openSUSE:Factory checked in at 2017-10-18 10:54:17
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-pytools (Old)
 and      /work/SRC/openSUSE:Factory/.python-pytools.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-pytools"

Wed Oct 18 10:54:17 2017 rev:3 rq:534412 version:2017.6

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-pytools/python-pytools.changes    2017-06-12 15:34:56.963055381 +0200
+++ /work/SRC/openSUSE:Factory/.python-pytools.new/python-pytools.changes       2017-10-18 10:54:17.878131936 +0200
@@ -1,0 +2,6 @@
+Tue Oct 17 11:25:40 UTC 2017 - mplus...@suse.com
+
+- Update to version 2017.6
+  * No changelog available
+
+-------------------------------------------------------------------

Old:
----
  pytools-2017.3.tar.gz

New:
----
  pytools-2017.6.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-pytools.spec ++++++
--- /var/tmp/diff_new_pack.BKQIhm/_old  2017-10-18 10:54:18.970080709 +0200
+++ /var/tmp/diff_new_pack.BKQIhm/_new  2017-10-18 10:54:18.974080521 +0200
@@ -16,8 +16,9 @@
 #
 
 
+%{?!python_module:%define python_module() python-%{**} python3-%{**}}
 Name:           python-pytools
-Version:        2017.3
+Version:        2017.6
 Release:        0
 Summary:        A collection of tools for Python
 License:        MIT
@@ -64,7 +65,7 @@
 
 %files %{python_files}
 %defattr(-,root,root,-)
-%doc README LICENSE
+%doc README.rst LICENSE
 %{python_sitelib}/*
 
 %changelog

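A note on the spec change above: the added line
"%{?!python_module:%define python_module() python-%{**} python3-%{**}}" is the
standard openSUSE singlespec fallback. If python-rpm-macros does not already
provide the %python_module macro, this defines it so that, for example,
"BuildRequires: %{python_module setuptools}" expands to
"BuildRequires: python-setuptools python3-setuptools", letting a single spec
file build both the python2 and python3 flavors of the package.
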
++++++ pytools-2017.3.tar.gz -> pytools-2017.6.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/PKG-INFO new/pytools-2017.6/PKG-INFO
--- old/pytools-2017.3/PKG-INFO 2017-06-03 20:04:28.000000000 +0200
+++ new/pytools-2017.6/PKG-INFO 2017-09-26 02:09:29.000000000 +0200
@@ -1,25 +1,30 @@
 Metadata-Version: 1.1
 Name: pytools
-Version: 2017.3
+Version: 2017.6
 Summary: A collection of tools for Python
 Home-page: http://pypi.python.org/pypi/pytools
 Author: Andreas Kloeckner
 Author-email: inf...@tiker.net
 License: MIT
-Description: 
-              Pytools is a big bag of things that are "missing" from the Python standard
-              library. This is mainly a dependency of my other software packages, and is
-              probably of little interest to you unless you use those. If you're curious
-              nonetheless, here's what's on offer:
+Description: Pytools is a big bag of things that are "missing" from the Python standard
+        library. This is mainly a dependency of my other software packages, and is
+        probably of little interest to you unless you use those. If you're curious
+        nonetheless, here's what's on offer:
+        
+        * A ton of small tool functions such as `len_iterable`, `argmin`,
+          tuple generation, permutation generation, ASCII table pretty printing,
+          GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
+        * Michele Simionato's decorator module
+        * A time-series logging module, `pytools.log`.
+        * Batch job submission, `pytools.batchjob`.
+        * A lexer, `pytools.lex`.
+        
+        Links:
+        
+        * `Documentation <https://documen.tician.de/pytools>`_
+        
+        * `Github <https://github.com/inducer/pytools>`_
         
-              * A ton of small tool functions such as `len_iterable`, `argmin`,
-                tuple generation, permutation generation, ASCII table pretty printing,
-                GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
-              * Michele Simionato's decorator module
-              * A time-series logging module, `pytools.log`.
-              * Batch job submission, `pytools.batchjob`.
-              * A lexer, `pytools.lex`.
-              
 Platform: UNKNOWN
 Classifier: Development Status :: 4 - Beta
 Classifier: Intended Audience :: Developers
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/README new/pytools-2017.6/README
--- old/pytools-2017.3/README   2015-10-17 21:26:02.000000000 +0200
+++ new/pytools-2017.6/README   1970-01-01 01:00:00.000000000 +0100
@@ -1,6 +0,0 @@
-Miscellaneous Python lifesavers.
-
-Andreas Kloeckner <inf...@tiker.net>
-
-Includes Michele Simionato's decorator module, from
-http://www.phyast.pitt.edu/~micheles/python/
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/README.rst new/pytools-2017.6/README.rst
--- old/pytools-2017.3/README.rst       1970-01-01 01:00:00.000000000 +0100
+++ new/pytools-2017.6/README.rst       2017-06-14 23:42:47.000000000 +0200
@@ -0,0 +1,18 @@
+Pytools is a big bag of things that are "missing" from the Python standard
+library. This is mainly a dependency of my other software packages, and is
+probably of little interest to you unless you use those. If you're curious
+nonetheless, here's what's on offer:
+
+* A ton of small tool functions such as `len_iterable`, `argmin`,
+  tuple generation, permutation generation, ASCII table pretty printing,
+  GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
+* Michele Simionato's decorator module
+* A time-series logging module, `pytools.log`.
+* Batch job submission, `pytools.batchjob`.
+* A lexer, `pytools.lex`.
+
+Links:
+
+* `Documentation <https://documen.tician.de/pytools>`_
+
+* `Github <https://github.com/inducer/pytools>`_
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools/__init__.py new/pytools-2017.6/pytools/__init__.py
--- old/pytools-2017.3/pytools/__init__.py      2017-05-25 03:19:47.000000000 +0200
+++ new/pytools-2017.6/pytools/__init__.py      2017-09-25 05:04:33.000000000 +0200
@@ -41,6 +41,96 @@
 else:
     my_decorator = decorator_module.decorator
 
+__doc__ = """
+A Collection of Utilities
+=========================
+
+Math
+----
+
+.. autofunction:: levi_civita
+.. autofunction:: perm
+.. autofunction:: comb
+
+Assertive accessors
+-------------------
+
+.. autofunction:: one
+.. autofunction:: is_single_valued
+.. autofunction:: all_roughly_equal
+.. autofunction:: single_valued
+
+Memoization
+-----------
+
+.. autofunction:: memoize
+.. autofunction:: memoize_on_first_arg
+.. autofunction:: memoize_method
+.. autofunction:: memoize_method_with_uncached
+.. autofunction:: memoize_in
+
+Argmin/max
+----------
+
+.. autofunction:: argmin2
+.. autofunction:: argmax2
+.. autofunction:: argmin
+.. autofunction:: argmax
+
+Cartesian products
+------------------
+.. autofunction:: cartesian_product
+.. autofunction:: distinct_pairs
+
+Permutations, Tuples, Integer sequences
+---------------------------------------
+
+.. autofunction:: wandering_element
+.. autofunction:: indices_in_shape
+.. autofunction:: generate_nonnegative_integer_tuples_below
+.. autofunction:: generate_nonnegative_integer_tuples_summing_to_at_most
+.. autofunction:: generate_all_nonnegative_integer_tuples
+.. autofunction:: generate_all_integer_tuples_below
+.. autofunction:: generate_all_integer_tuples
+.. autofunction:: generate_permutations
+.. autofunction:: generate_unique_permutations
+
+Graph Algorithms
+----------------
+
+.. autofunction:: a_star
+
+Formatting
+----------
+
+.. autoclass:: Table
+.. autofunction:: string_histogram
+.. autofunction:: word_wrap
+
+Debugging
+---------
+
+.. autofunction:: typedump
+.. autofunction:: invoke_editor
+
+Progress bars
+-------------
+
+.. autoclass:: ProgressBar
+
+Name generation
+---------------
+
+.. autofunction:: generate_unique_names
+.. autofunction:: generate_numbered_unique_names
+.. autofunction:: UniqueNameGenerator
+
+Functions for dealing with (large) auxiliary files
+--------------------------------------------------
+
+.. autofunction:: download_from_web_if_not_present
+"""
+
 
 # {{{ math --------------------------------------------------------------------
 
@@ -1163,7 +1253,7 @@
 
 
 def generate_permutations(original):
-    """Generate all permutations of the list `original'.
+    """Generate all permutations of the list *original*.
 
     Nicked from http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/252178
     """
@@ -1177,7 +1267,7 @@
 
 
 def generate_unique_permutations(original):
-    """Generate all unique permutations of the list `original'.
+    """Generate all unique permutations of the list *original*.
     """
 
     had_those = set()
@@ -1314,7 +1404,13 @@
 # {{{ table formatting
 
 class Table:
-    """An ASCII table generator."""
+    """An ASCII table generator.
+
+    .. automethod:: add_row
+    .. automethod:: __str__
+    .. automethod:: latex
+    """
+
     def __init__(self):
         self.rows = []
 
@@ -1402,10 +1498,10 @@
 
 def word_wrap(text, width, wrap_using="\n"):
     # http://code.activestate.com/recipes/148061-one-liner-word-wrap-function/
-    """
+    r"""
     A word-wrap function that preserves existing line breaks
     and most spaces in the text. Expects that existing line
-    breaks are posix newlines (\n).
+    breaks are posix newlines (``\n``).
     """
     space_or_break = [" ", wrap_using]
     return reduce(lambda line, word, width=width: '%s%s%s' %
@@ -1601,6 +1697,14 @@
 # {{{ progress bars
 
 class ProgressBar:
+    """
+    .. automethod:: draw
+    .. automethod:: progress
+    .. automethod:: set_progress
+    .. automethod:: finished
+    .. automethod:: __enter__
+    .. automethod:: __exit__
+    """
     def __init__(self, descr, total, initial=0, length=40):
         import time
         self.description = descr
@@ -1767,6 +1871,12 @@
 
 
 class UniqueNameGenerator(object):
+    """
+    .. automethod:: is_name_conflicting
+    .. automethod:: add_name
+    .. automethod:: add_names
+    .. automethod:: __call__
+    """
     def __init__(self, existing_names=set(), forced_prefix=""):
         self.existing_names = existing_names.copy()
         self.forced_prefix = forced_prefix
@@ -1817,6 +1927,45 @@
 
 # }}}
 
+
+# {{{ recursion limit
+
+class MinRecursionLimit(object):
+    def __init__(self, min_rec_limit):
+        self.min_rec_limit = min_rec_limit
+
+    def __enter__(self):
+        self.prev_recursion_limit = sys.getrecursionlimit()
+        new_limit = max(self.prev_recursion_limit, self.min_rec_limit)
+        sys.setrecursionlimit(new_limit)
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        sys.setrecursionlimit(self.prev_recursion_limit)
+
+# }}}
+
+
+# {{{ download from web if not present
+
+def download_from_web_if_not_present(url, local_name=None):
+    """
+    .. versionadded:: 2017.5
+    """
+
+    from os.path import basename, exists
+    if local_name is None:
+        local_name = basename(url)
+
+    if not exists(local_name):
+        from six.moves.urllib.request import urlopen
+        with urlopen(url) as inf:
+            contents = inf.read()
+
+            with open(local_name, "wb") as outf:
+                outf.write(contents)
+
+# }}}
+
 
 def _test():
     import doctest
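
Besides the new Sphinx-style docstrings, the __init__.py diff above adds two
small runtime utilities: the MinRecursionLimit context manager and
download_from_web_if_not_present() (new in 2017.5, per its docstring). A
minimal usage sketch based only on the code shown above -- the recursion
depth and the URL are purely illustrative:

    from pytools import MinRecursionLimit, download_from_web_if_not_present

    def depth(n):
        # a deliberately deep recursion to exercise the raised limit
        return 0 if n == 0 else 1 + depth(n - 1)

    # Raises (never lowers) the interpreter's recursion limit inside the
    # block; the previous limit is restored on exit.
    with MinRecursionLimit(10000):
        print(depth(5000))

    # Fetches a file only if it is not already present; the local name
    # defaults to basename(url).
    download_from_web_if_not_present("https://example.org/some-data.tar.gz")
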
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools/obj_array.py new/pytools-2017.6/pytools/obj_array.py
--- old/pytools-2017.3/pytools/obj_array.py     2016-12-06 02:09:53.000000000 +0100
+++ new/pytools-2017.6/pytools/obj_array.py     2017-06-15 00:04:28.000000000 +0200
@@ -2,6 +2,28 @@
 import numpy as np
 from pytools import my_decorator as decorator, MovedFunctionDeprecationWrapper
 
+__doc__ = """
+Handling :mod:`numpy` Object Arrays
+===================================
+
+.. autofunction:: oarray_real
+.. autofunction:: oarray_imag
+.. autofunction:: oarray_real_copy
+.. autofunction:: oarray_imag_copy
+
+Creation
+--------
+
+.. autofunction:: join_fields
+.. autofunction:: make_obj_array
+
+Mapping
+-------
+
+.. autofunction:: with_object_array_or_scalar
+.. autofunction:: with_object_array_or_scalar_n_args
+"""
+
 
 def gen_len(expr):
     from pytools.obj_array import is_obj_array
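
The obj_array.py change is documentation-only, but for orientation: these
helpers manipulate numpy arrays of dtype=object, e.g. vectors whose
components are themselves arrays. A rough sketch of two of the newly
documented entry points, assuming their semantics match the (pre-existing)
implementations:

    import numpy as np
    from pytools.obj_array import make_obj_array, with_object_array_or_scalar

    # An object array holding component arrays of different shapes --
    # something a rectangular ndarray could not represent.
    vec = make_obj_array([np.zeros(3), np.ones(5)])

    # Apply a function to each component of an object array (or directly
    # to the argument, if it is a scalar).
    doubled = with_object_array_or_scalar(lambda x: 2*x, vec)
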
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools/persistent_dict.py new/pytools-2017.6/pytools/persistent_dict.py
--- old/pytools-2017.3/pytools/persistent_dict.py       2017-06-03 20:03:51.000000000 +0200
+++ new/pytools-2017.6/pytools/persistent_dict.py       2017-09-26 02:09:20.000000000 +0200
@@ -1,10 +1,11 @@
 """Generic persistent, concurrent dictionary-like facility."""
 
-from __future__ import division, with_statement
-from __future__ import absolute_import
-import six
+from __future__ import division, with_statement, absolute_import
 
-__copyright__ = "Copyright (C) 2011,2014 Andreas Kloeckner"
+__copyright__ = """
+Copyright (C) 2011,2014 Andreas Kloeckner
+Copyright (C) 2017 Matt Wala
+"""
 
 __license__ = """
 Permission is hereby granted, free of charge, to any person obtaining a copy
@@ -30,10 +31,28 @@
 logger = logging.getLogger(__name__)
 
 
+import collections
+import six
 import sys
 import os
+import shutil
 import errno
 
+__doc__ = """
+Persistent Hashing
+==================
+
+This module contains functionality that allows hashing with keys that remain
+valid across interpreter invocations, unlike Python's built-in hashes.
+
+.. autoexception:: NoSuchEntryError
+.. autoexception:: ReadOnlyEntryError
+
+.. autoclass:: KeyBuilder
+.. autoclass:: PersistentDict
+.. autoclass:: WriteOncePersistentDict
+"""
+
 try:
     import hashlib
     new_hash = hashlib.sha256
@@ -43,17 +62,13 @@
     new_hash = sha.new
 
 
-def _erase_dir(dir):
-    from os import listdir, unlink, rmdir
-    from os.path import join, isdir
-    for name in listdir(dir):
-        sub_name = join(dir, name)
-        if isdir(sub_name):
-            _erase_dir(sub_name)
-        else:
-            unlink(sub_name)
-
-    rmdir(dir)
+def _make_dir_recursively(dir):
+    try:
+        os.makedirs(dir)
+    except OSError as e:
+        from errno import EEXIST
+        if e.errno != EEXIST:
+            raise
 
 
 def update_checksum(checksum, obj):
@@ -86,34 +101,33 @@
 
 
 class LockManager(CleanupBase):
-    def __init__(self, cleanup_m, container_dir):
-        if container_dir is not None:
-            self.lock_file = os.path.join(container_dir, "lock")
+    def __init__(self, cleanup_m, lock_file):
+        self.lock_file = lock_file
 
-            attempts = 0
-            while True:
-                try:
-                    self.fd = os.open(self.lock_file,
-                            os.O_CREAT | os.O_WRONLY | os.O_EXCL)
-                    break
-                except OSError:
-                    pass
+        attempts = 0
+        while True:
+            try:
+                self.fd = os.open(self.lock_file,
+                        os.O_CREAT | os.O_WRONLY | os.O_EXCL)
+                break
+            except OSError:
+                pass
 
-                from time import sleep
-                sleep(1)
+            from time import sleep
+            sleep(1)
 
-                attempts += 1
+            attempts += 1
 
-                if attempts > 10:
-                    from warnings import warn
-                    warn("could not obtain lock--delete '%s' if necessary"
-                            % self.lock_file)
-                if attempts > 3 * 60:
-                    raise RuntimeError("waited more than three minutes "
-                            "on the lock file '%s'"
-                            "--something is wrong" % self.lock_file)
+            if attempts > 10:
+                from warnings import warn
+                warn("could not obtain lock--delete '%s' if necessary"
+                        % self.lock_file)
+            if attempts > 3 * 60:
+                raise RuntimeError("waited more than three minutes "
+                        "on the lock file '%s'"
+                        "--something is wrong" % self.lock_file)
 
-            cleanup_m.register(self)
+        cleanup_m.register(self)
 
     def clean_up(self):
         import os
@@ -125,34 +139,26 @@
 
 
 class ItemDirManager(CleanupBase):
-    def __init__(self, cleanup_m, path):
-        from os import mkdir
-        import errno
+    def __init__(self, cleanup_m, path, delete_on_error):
+        from os.path import isdir
 
+        self.existed = isdir(path)
         self.path = path
-        try:
-            mkdir(self.path)
-        except OSError as e:
-            if e.errno != errno.EEXIST:
-                raise
-            self.existed = True
-        else:
-            cleanup_m.register(self)
-            self.existed = False
+        self.delete_on_error = delete_on_error
 
-    def sub(self, n):
-        from os.path import join
-        return join(self.path, n)
+        cleanup_m.register(self)
 
     def reset(self):
         try:
-            _erase_dir(self.path)
+            shutil.rmtree(self.path)
         except OSError as e:
             if e.errno != errno.ENOENT:
                 raise
 
+    def mkdir(self):
+        from os import mkdir
         try:
-            os.mkdir(self.path)
+            mkdir(self.path)
         except OSError as e:
             if e.errno != errno.EEXIST:
                 raise
@@ -161,7 +167,8 @@
         pass
 
     def error_clean_up(self):
-        _erase_dir(self.path)
+        if self.delete_on_error:
+            self.reset()
 
 # }}}
 
@@ -257,20 +264,139 @@
 # }}}
 
 
+# {{{ lru cache
+
+class _LinkedList(object):
+    """The list operates on nodes of the form [value, leftptr, rightpr]. To 
create a
+    node of this form you can use `LinkedList.new_node().`
+
+    Supports inserting at the left and deleting from an arbitrary location.
+    """
+    def __init__(self):
+        self.count = 0
+        self.head = None
+        self.end = None
+
+    @staticmethod
+    def new_node(element):
+        return [element, None, None]
+
+    def __len__(self):
+        return self.count
+
+    def appendleft_node(self, node):
+        self.count += 1
+
+        if self.head is None:
+            self.head = self.end = node
+            return
+
+        self.head[1] = node
+        node[2] = self.head
+
+        self.head = node
+
+    def pop_node(self):
+        end = self.end
+        self.remove_node(end)
+        return end
+
+    def remove_node(self, node):
+        self.count -= 1
+
+        if self.head is self.end:
+            assert node is self.head
+            self.head = self.end = None
+            return
+
+        left = node[1]
+        right = node[2]
+
+        if left is None:
+            self.head = right
+        else:
+            left[2] = right
+
+        if right is None:
+            self.end = left
+        else:
+            right[1] = left
+
+        node[1] = node[2] = None
+
+
+class _LRUCache(collections.MutableMapping):
+    """A mapping that keeps at most *maxsize* items with an LRU replacement 
policy.
+    """
+    def __init__(self, maxsize):
+        self.lru_order = _LinkedList()
+        self.maxsize = maxsize
+        self.cache = {}
+
+    def __delitem__(self, item):
+        node = self.cache[item]
+        self.lru_order.remove_node(node)
+        del self.cache[item]
+
+    def __getitem__(self, item):
+        node = self.cache[item]
+        self.lru_order.remove_node(node)
+        self.lru_order.appendleft_node(node)
+        # A linked list node contains a tuple of the form (item, value).
+        return node[0][1]
+
+    def __contains__(self, item):
+        return item in self.cache
+
+    def __iter__(self):
+        return iter(self.cache)
+
+    def __len__(self):
+        return len(self.cache)
+
+    def clear(self):
+        self.cache.clear()
+        self.lru_order = _LinkedList()
+
+    def __setitem__(self, item, value):
+        if self.maxsize < 1:
+            return
+
+        try:
+            node = self.cache[item]
+            self.lru_order.remove_node(node)
+        except KeyError:
+            if len(self.lru_order) >= self.maxsize:
+                # Make room for new elements.
+                end_node = self.lru_order.pop_node()
+                del self.cache[end_node[0][0]]
+
+            node = self.lru_order.new_node((item, value))
+            self.cache[item] = node
+
+        self.lru_order.appendleft_node(node)
+
+        assert len(self.cache) == len(self.lru_order), \
+                (len(self.cache), len(self.lru_order))
+        assert len(self.lru_order) <= self.maxsize
+
+        return node[0]
+
+# }}}
+
+
 # {{{ top-level
 
 class NoSuchEntryError(KeyError):
     pass
 
 
-class PersistentDict(object):
-    def __init__(self, identifier, key_builder=None, container_dir=None):
-        """
-        :arg identifier: a file-name-compatible string identifying this
-            dictionary
-        :arg key_builder: a subclass of :class:`KeyBuilder`
-        """
+class ReadOnlyEntryError(KeyError):
+    pass
+
 
+class _PersistentDictBase(object):
+    def __init__(self, identifier, key_builder=None, container_dir=None):
         self.identifier = identifier
 
         if key_builder is None:
@@ -291,52 +417,267 @@
 
         self._make_container_dir()
 
+    def store_if_not_present(self, key, value):
+        self.store(key, value, _skip_if_present=True)
+
+    def store(self, key, value, _skip_if_present=False):
+        raise NotImplementedError()
+
+    def fetch(self, key):
+        raise NotImplementedError()
+
+    def _read(self, path):
+        from six.moves.cPickle import load
+        with open(path, "rb") as inf:
+            return load(inf)
+
+    def _write(self, path, value):
+        from six.moves.cPickle import dump, HIGHEST_PROTOCOL
+        with open(path, "wb") as outf:
+            dump(value, outf, protocol=HIGHEST_PROTOCOL)
+
+    def _item_dir(self, hexdigest_key):
+        from os.path import join
+        return join(self.container_dir, hexdigest_key)
+
+    def _key_file(self, hexdigest_key):
+        from os.path import join
+        return join(self._item_dir(hexdigest_key), "key")
+
+    def _contents_file(self, hexdigest_key):
+        from os.path import join
+        return join(self._item_dir(hexdigest_key), "contents")
+
+    def _lock_file(self, hexdigest_key):
+        from os.path import join
+        return join(self.container_dir, str(hexdigest_key) + ".lock")
+
     def _make_container_dir(self):
-        # {{{ ensure container directory exists
+        _make_dir_recursively(self.container_dir)
+
+    def _collision_check(self, key, stored_key):
+        if stored_key != key:
+            # Key collision, oh well.
+            from warnings import warn
+            warn("%s: key collision in cache at '%s' -- these are "
+                    "sufficiently unlikely that they're often "
+                    "indicative of a broken implementation "
+                    "of equality comparison"
+                    % (self.identifier, self.container_dir))
+            # This is here so we can debug the equality comparison
+            stored_key == key
+            raise NoSuchEntryError(key)
+
+    def __getitem__(self, key):
+        return self.fetch(key)
+
+    def __setitem__(self, key, value):
+        self.store(key, value)
 
+    def __delitem__(self, key):
+        raise NotImplementedError()
+
+    def clear(self):
         try:
-            os.makedirs(self.container_dir)
+            shutil.rmtree(self.container_dir)
         except OSError as e:
-            from errno import EEXIST
-            if e.errno != EEXIST:
+            if e.errno != errno.ENOENT:
                 raise
 
-        # }}}
+        self._make_container_dir()
+
+
+class WriteOncePersistentDict(_PersistentDictBase):
+    def __init__(self, identifier, key_builder=None, container_dir=None,
+             in_mem_cache_size=256):
+        """
+        :arg identifier: a file-name-compatible string identifying this
+            dictionary
+        :arg key_builder: a subclass of :class:`KeyBuilder`
+        :arg in_mem_cache_size: retain an in-memory cache of up to
+            *in_mem_cache_size* items
+
+        .. automethod:: __getitem__
+        .. automethod:: __setitem__
+        .. automethod:: clear
+        .. automethod:: store_if_not_present
+        """
+        _PersistentDictBase.__init__(self, identifier, key_builder, container_dir)
+        self._cache = _LRUCache(in_mem_cache_size)
+
+    def _spin_until_removed(self, lock_file):
+        from os.path import exists
+
+        attempts = 0
+        while exists(lock_file):
+            from time import sleep
+            sleep(1)
+
+            attempts += 1
+
+            if attempts > 10:
+                from warnings import warn
+                warn("waiting until unlocked--delete '%s' if necessary"
+                        % lock_file)
+
+            if attempts > 3 * 60:
+                raise RuntimeError("waited more than three minutes "
+                        "on the lock file '%s'"
+                        "--something is wrong" % lock_file)
 
-    def store(self, key, value, info_files={}):
+    def store(self, key, value, _skip_if_present=False):
         hexdigest_key = self.key_builder(key)
 
         cleanup_m = CleanupManager()
         try:
             try:
-                LockManager(cleanup_m, self.container_dir)
-
-                from os.path import join
-                item_dir_m = ItemDirManager(cleanup_m,
-                        join(self.container_dir, hexdigest_key))
+                LockManager(cleanup_m, self._lock_file(hexdigest_key))
+                item_dir_m = ItemDirManager(
+                        cleanup_m, self._item_dir(hexdigest_key),
+                        delete_on_error=False)
 
                 if item_dir_m.existed:
-                    item_dir_m.reset()
+                    if _skip_if_present:
+                        return
+                    raise ReadOnlyEntryError(key)
 
-                for info_name, info_value in six.iteritems(info_files):
-                    info_path = item_dir_m.sub("info_"+info_name)
+                item_dir_m.mkdir()
 
-                    with open(info_path, "wt") as outf:
-                        outf.write(info_value)
+                key_path = self._key_file(hexdigest_key)
+                value_path = self._contents_file(hexdigest_key)
 
-                from six.moves.cPickle import dump, HIGHEST_PROTOCOL
-                value_path = item_dir_m.sub("contents")
-                with open(value_path, "wb") as outf:
-                    dump(value, outf, protocol=HIGHEST_PROTOCOL)
+                self._write(value_path, value)
+                self._write(key_path, key)
 
-                logger.debug("%s: cache store [key=%s]" % (
+                logger.debug("%s: disk cache store [key=%s]" % (
+                        self.identifier, hexdigest_key))
+            except:
+                cleanup_m.error_clean_up()
+                raise
+        finally:
+            cleanup_m.clean_up()
+
+    def fetch(self, key):
+        hexdigest_key = self.key_builder(key)
+
+        # {{{ in memory cache
+
+        try:
+            stored_key, stored_value = self._cache[hexdigest_key]
+        except KeyError:
+            pass
+        else:
+            logger.debug("%s: in mem cache hit [key=%s]" % (
+                    self.identifier, hexdigest_key))
+            self._collision_check(key, stored_key)
+            return stored_value
+
+        # }}}
+
+        # {{{ check path exists and is unlocked
+
+        item_dir = self._item_dir(hexdigest_key)
+
+        from os.path import isdir
+        if not isdir(item_dir):
+            logger.debug("%s: disk cache miss [key=%s]" % (
+                    self.identifier, hexdigest_key))
+            raise NoSuchEntryError(key)
+
+        lock_file = self._lock_file(hexdigest_key)
+        self._spin_until_removed(lock_file)
+
+        # }}}
+
+        key_file = self._key_file(hexdigest_key)
+        contents_file = self._contents_file(hexdigest_key)
+
+        # Note: Unlike PersistentDict, this doesn't autodelete invalid entires,
+        # because that would lead to a race condition.
+
+        # {{{ load key file and do equality check
+
+        try:
+            read_key = self._read(key_file)
+        except:
+            from warnings import warn
+            warn("pytools.persistent_dict.WriteOncePersistentDict(%s) "
+                    "encountered an invalid "
+                    "key file for key %s. Remove the directory "
+                    "'%s' if necessary."
+                    % (self.identifier, hexdigest_key, item_dir))
+            raise NoSuchEntryError(key)
+
+        self._collision_check(key, read_key)
+
+        # }}}
+
+        logger.debug("%s: disk cache hit [key=%s]" % (
+                self.identifier, hexdigest_key))
+
+        # {{{ load contents
+
+        try:
+            read_contents = self._read(contents_file)
+        except:
+            warn("pytools.persistent_dict.WriteOncePersistentDict(%s) "
+                    "encountered an invalid "
+                    "key file for key %s. Remove the directory "
+                    "'%s' if necessary."
+                    % (self.identifier, hexdigest_key, item_dir))
+            raise NoSuchEntryError(key)
+
+        # }}}
+
+        self._cache[hexdigest_key] = (key, read_contents)
+        return read_contents
+
+    def clear(self):
+        _PersistentDictBase.clear(self)
+        self._cache.clear()
+
+
+class PersistentDict(_PersistentDictBase):
+    def __init__(self, identifier, key_builder=None, container_dir=None):
+        """
+        :arg identifier: a file-name-compatible string identifying this
+            dictionary
+        :arg key_builder: a subclass of :class:`KeyBuilder`
+
+        .. automethod:: __getitem__
+        .. automethod:: __setitem__
+        .. automethod:: __delitem__
+        .. automethod:: clear
+        .. automethod:: store_if_not_present
+        """
+        _PersistentDictBase.__init__(self, identifier, key_builder, container_dir)
+
+    def store(self, key, value, _skip_if_present=False):
+        hexdigest_key = self.key_builder(key)
 
-                # Write key last, so that if the reader below
-                key_path = item_dir_m.sub("key")
-                with open(key_path, "wb") as outf:
-                    dump(key, outf, protocol=HIGHEST_PROTOCOL)
+        cleanup_m = CleanupManager()
+        try:
+            try:
+                LockManager(cleanup_m, self._lock_file(hexdigest_key))
+                item_dir_m = ItemDirManager(
+                        cleanup_m, self._item_dir(hexdigest_key),
+                        delete_on_error=True)
+
+                if item_dir_m.existed:
+                    if _skip_if_present:
+                        return
+                    item_dir_m.reset()
+
+                item_dir_m.mkdir()
 
+                key_path = self._key_file(hexdigest_key)
+                value_path = self._contents_file(hexdigest_key)
+
+                self._write(value_path, value)
+                self._write(key_path, key)
+
+                logger.debug("%s: cache store [key=%s]" % (
+                        self.identifier, hexdigest_key))
             except:
                 cleanup_m.error_clean_up()
                 raise
@@ -345,38 +686,29 @@
 
     def fetch(self, key):
         hexdigest_key = self.key_builder(key)
+        item_dir = self._item_dir(hexdigest_key)
 
-        from os.path import join, isdir
-        item_dir = join(self.container_dir, hexdigest_key)
+        from os.path import isdir
         if not isdir(item_dir):
             logger.debug("%s: cache miss [key=%s]" % (
-                self.identifier, hexdigest_key))
+                    self.identifier, hexdigest_key))
             raise NoSuchEntryError(key)
 
         cleanup_m = CleanupManager()
         try:
             try:
-                LockManager(cleanup_m, self.container_dir)
-
-                item_dir_m = ItemDirManager(cleanup_m, item_dir)
-                key_path = item_dir_m.sub("key")
-                value_path = item_dir_m.sub("contents")
+                LockManager(cleanup_m, self._lock_file(hexdigest_key))
+                item_dir_m = ItemDirManager(
+                        cleanup_m, item_dir, delete_on_error=False)
 
-                from six.moves.cPickle import load
+                key_path = self._key_file(hexdigest_key)
+                value_path = self._contents_file(hexdigest_key)
 
-                # {{{ load key file
-
-                exc = None
+                # {{{ load key
 
                 try:
-                    with open(key_path, "rb") as inf:
-                        read_key = load(inf)
-                except IOError as e:
-                    exc = e
-                except EOFError as e:
-                    exc = e
-
-                if exc is not None:
+                    read_key = self._read(key_path)
+                except:
                     item_dir_m.reset()
                     from warnings import warn
                     warn("pytools.persistent_dict.PersistentDict(%s) "
@@ -385,36 +717,18 @@
                             % (self.identifier, hexdigest_key))
                     raise NoSuchEntryError(key)
 
-                # }}}
+                self._collision_check(key, read_key)
 
-                if read_key != key:
-                    # Key collision, oh well.
-                    from warnings import warn
-                    warn("%s: key collision in cache at '%s' -- these are "
-                            "sufficiently unlikely that they're often "
-                            "indicative of a broken implementation "
-                            "of equality comparison"
-                            % (self.identifier, self.container_dir))
-                    # This is here so we can debug the equality comparison
-                    read_key == key
-                    raise NoSuchEntryError(key)
+                # }}}
 
                 logger.debug("%s: cache hit [key=%s]" % (
-                    self.identifier, hexdigest_key))
+                        self.identifier, hexdigest_key))
 
                 # {{{ load value
 
-                exc = None
-
                 try:
-                    with open(value_path, "rb") as inf:
-                        read_contents = load(inf)
-                except IOError as e:
-                    exc = e
-                except EOFError as e:
-                    exc = e
-
-                if exc is not None:
+                    read_contents = self._read(value_path)
+                except:
                     item_dir_m.reset()
                     from warnings import warn
                     warn("pytools.persistent_dict.PersistentDict(%s) "
@@ -423,10 +737,10 @@
                             % (self.identifier, hexdigest_key))
                     raise NoSuchEntryError(key)
 
-                # }}}
-
                 return read_contents
 
+                # }}}
+
             except:
                 cleanup_m.error_clean_up()
                 raise
@@ -436,17 +750,36 @@
     def remove(self, key):
         hexdigest_key = self.key_builder(key)
 
-        from os.path import join, isdir
-        item_dir = join(self.container_dir, hexdigest_key)
+        item_dir = self._item_dir(hexdigest_key)
+        from os.path import isdir
         if not isdir(item_dir):
             raise NoSuchEntryError(key)
 
         cleanup_m = CleanupManager()
         try:
             try:
-                LockManager(cleanup_m, self.container_dir)
+                LockManager(cleanup_m, self._lock_file(hexdigest_key))
+                item_dir_m = ItemDirManager(
+                        cleanup_m, item_dir, delete_on_error=False)
+                key_file = self._key_file(hexdigest_key)
+
+                # {{{ load key
+
+                try:
+                    read_key = self._read(key_file)
+                except:
+                    item_dir_m.reset()
+                    from warnings import warn
+                    warn("pytools.persistent_dict.PersistentDict(%s) "
+                            "encountered an invalid "
+                            "key file for key %s. Entry deleted."
+                            % (self.identifier, hexdigest_key))
+                    raise NoSuchEntryError(key)
+
+                self._collision_check(key, read_key)
+
+                # }}}
 
-                item_dir_m = ItemDirManager(cleanup_m, item_dir)
                 item_dir_m.reset()
 
             except:
@@ -455,24 +788,9 @@
         finally:
             cleanup_m.clean_up()
 
-    def __getitem__(self, key):
-        return self.fetch(key)
-
-    def __setitem__(self, key, value):
-        return self.store(key, value)
-
     def __delitem__(self, key):
         self.remove(key)
 
-    def clear(self):
-        try:
-            _erase_dir(self.container_dir)
-        except OSError as e:
-            if e.errno != errno.ENOENT:
-                raise
-
-        self._make_container_dir()
-
 # }}}
 
 # vim: foldmethod=marker
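
The persistent_dict.py rewrite is the substance of this update: locking moves
from one lock on the whole container directory to a per-entry lock file,
reads and writes are factored into _PersistentDictBase, and the new
WriteOncePersistentDict layers an in-memory LRU cache (the
_LinkedList/_LRUCache machinery above) over the on-disk store. A minimal
usage sketch mirroring the new tests further down -- the identifiers and
directories are illustrative:

    import tempfile
    from pytools.persistent_dict import (
            PersistentDict, WriteOncePersistentDict, ReadOnlyEntryError)

    tmpdir = tempfile.mkdtemp()

    # Mutable variant: entries may be overwritten and deleted.
    pdict = PersistentDict("demo", container_dir=tmpdir + "/rw")
    pdict[(1, "a", None)] = 42
    pdict[(1, "a", None)] = 43          # overwriting is allowed
    del pdict[(1, "a", None)]

    # Write-once variant: storing to an existing key raises
    # ReadOnlyEntryError; repeated fetches hit the in-memory LRU cache.
    wdict = WriteOncePersistentDict("demo-wo",
            container_dir=tmpdir + "/wo", in_mem_cache_size=256)
    wdict[0] = 1
    try:
        wdict[0] = 2
    except ReadOnlyEntryError:
        pass
    wdict.store_if_not_present(0, 2)    # silently skipped; wdict[0] stays 1
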
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools/version.py new/pytools-2017.6/pytools/version.py
--- old/pytools-2017.3/pytools/version.py       1970-01-01 01:00:00.000000000 +0100
+++ new/pytools-2017.6/pytools/version.py       2017-09-26 02:09:20.000000000 +0200
@@ -0,0 +1,3 @@
+VERSION = (2017, 6)
+VERSION_STATUS = ""
+VERSION_TEXT = ".".join(str(x) for x in VERSION) + VERSION_STATUS
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools.egg-info/PKG-INFO new/pytools-2017.6/pytools.egg-info/PKG-INFO
--- old/pytools-2017.3/pytools.egg-info/PKG-INFO        2017-06-03 20:04:28.000000000 +0200
+++ new/pytools-2017.6/pytools.egg-info/PKG-INFO        2017-09-26 02:09:29.000000000 +0200
@@ -1,25 +1,30 @@
 Metadata-Version: 1.1
 Name: pytools
-Version: 2017.3
+Version: 2017.6
 Summary: A collection of tools for Python
 Home-page: http://pypi.python.org/pypi/pytools
 Author: Andreas Kloeckner
 Author-email: inf...@tiker.net
 License: MIT
-Description: 
-              Pytools is a big bag of things that are "missing" from the Python standard
-              library. This is mainly a dependency of my other software packages, and is
-              probably of little interest to you unless you use those. If you're curious
-              nonetheless, here's what's on offer:
+Description: Pytools is a big bag of things that are "missing" from the Python standard
+        library. This is mainly a dependency of my other software packages, and is
+        probably of little interest to you unless you use those. If you're curious
+        nonetheless, here's what's on offer:
+        
+        * A ton of small tool functions such as `len_iterable`, `argmin`,
+          tuple generation, permutation generation, ASCII table pretty printing,
+          GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
+        * Michele Simionato's decorator module
+        * A time-series logging module, `pytools.log`.
+        * Batch job submission, `pytools.batchjob`.
+        * A lexer, `pytools.lex`.
+        
+        Links:
+        
+        * `Documentation <https://documen.tician.de/pytools>`_
+        
+        * `Github <https://github.com/inducer/pytools>`_
         
-              * A ton of small tool functions such as `len_iterable`, `argmin`,
-                tuple generation, permutation generation, ASCII table pretty printing,
-                GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
-              * Michele Simionato's decorator module
-              * A time-series logging module, `pytools.log`.
-              * Batch job submission, `pytools.batchjob`.
-              * A lexer, `pytools.lex`.
-              
 Platform: UNKNOWN
 Classifier: Development Status :: 4 - Beta
 Classifier: Intended Audience :: Developers
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools.egg-info/SOURCES.txt new/pytools-2017.6/pytools.egg-info/SOURCES.txt
--- old/pytools-2017.3/pytools.egg-info/SOURCES.txt     2017-06-03 20:04:28.000000000 +0200
+++ new/pytools-2017.6/pytools.egg-info/SOURCES.txt     2017-09-26 02:09:29.000000000 +0200
@@ -1,6 +1,6 @@
 LICENSE
 MANIFEST.in
-README
+README.rst
 setup.cfg
 setup.py
 pytools/__init__.py
@@ -22,6 +22,7 @@
 pytools/spatial_btree.py
 pytools/stopwatch.py
 pytools/test.py
+pytools/version.py
 pytools.egg-info/PKG-INFO
 pytools.egg-info/SOURCES.txt
 pytools.egg-info/dependency_links.txt
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/pytools.egg-info/requires.txt new/pytools-2017.6/pytools.egg-info/requires.txt
--- old/pytools-2017.3/pytools.egg-info/requires.txt    2017-06-03 20:04:28.000000000 +0200
+++ new/pytools-2017.6/pytools.egg-info/requires.txt    2017-09-26 02:09:29.000000000 +0200
@@ -1,4 +1,4 @@
-decorator>=3.2.0
 appdirs>=1.4.0
-six>=1.8.0
+decorator>=3.2.0
 numpy>=1.6.0
+six>=1.8.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/setup.cfg new/pytools-2017.6/setup.cfg
--- old/pytools-2017.3/setup.cfg        2017-06-03 20:04:28.000000000 +0200
+++ new/pytools-2017.6/setup.cfg        2017-09-26 02:09:29.000000000 +0200
@@ -9,5 +9,4 @@
 [egg_info]
 tag_build = 
 tag_date = 0
-tag_svn_revision = 0
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/setup.py new/pytools-2017.6/setup.py
--- old/pytools-2017.3/setup.py 2017-06-03 19:59:17.000000000 +0200
+++ new/pytools-2017.6/setup.py 2017-06-14 23:43:06.000000000 +0200
@@ -3,23 +3,19 @@
 
 from setuptools import setup
 
+ver_dic = {}
+version_file = open("pytools/version.py")
+try:
+    version_file_contents = version_file.read()
+finally:
+    version_file.close()
+
+exec(compile(version_file_contents, "pytools/version.py", 'exec'), ver_dic)
+
 setup(name="pytools",
-      version="2017.3",
+      version=ver_dic["VERSION_TEXT"],
       description="A collection of tools for Python",
-      long_description="""
-      Pytools is a big bag of things that are "missing" from the Python standard
-      library. This is mainly a dependency of my other software packages, and is
-      probably of little interest to you unless you use those. If you're curious
-      nonetheless, here's what's on offer:
-
-      * A ton of small tool functions such as `len_iterable`, `argmin`,
-        tuple generation, permutation generation, ASCII table pretty printing,
-        GvR's mokeypatch_xxx() hack, the elusive `flatten`, and much more.
-      * Michele Simionato's decorator module
-      * A time-series logging module, `pytools.log`.
-      * Batch job submission, `pytools.batchjob`.
-      * A lexer, `pytools.lex`.
-      """,
+      long_description=open("README.rst", "r").read(),
       classifiers=[
           'Development Status :: 4 - Beta',
           'Intended Audience :: Developers',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pytools-2017.3/test/test_persistent_dict.py new/pytools-2017.6/test/test_persistent_dict.py
--- old/pytools-2017.3/test/test_persistent_dict.py     2017-06-03 16:44:02.000000000 +0200
+++ new/pytools-2017.6/test/test_persistent_dict.py     2017-09-26 02:09:20.000000000 +0200
@@ -2,39 +2,316 @@
 
 import pytest  # noqa
 import sys  # noqa
+import tempfile
+import shutil
+
 from six.moves import range
 from six.moves import zip
 
+from pytools.persistent_dict import (
+        PersistentDict, WriteOncePersistentDict, NoSuchEntryError,
+        ReadOnlyEntryError)
+
+
+# {{{ type for testing
+
+class PDictTestingKeyOrValue(object):
+
+    def __init__(self, val, hash_key=None):
+        self.val = val
+        if hash_key is None:
+            hash_key = val
+        self.hash_key = hash_key
+
+    def __getstate__(self):
+        return {"val": self.val, "hash_key": self.hash_key}
+
+    def __eq__(self, other):
+        return self.val == other.val
+
+    def __ne__(self, other):
+        return not self.__eq__(other)
+
+    def update_persistent_hash(self, key_hash, key_builder):
+        key_builder.rec(key_hash, self.hash_key)
+
+    def __repr__(self):
+        return "PDictTestingKeyOrValue(val=%r,hash_key=%r)" % (
+                (self.val, self.hash_key))
+
+    __str__ = __repr__
+
+# }}}
+
+
+def test_persistent_dict_storage_and_lookup():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = PersistentDict("pytools-test", container_dir=tmpdir)
+
+        from random import randrange
+
+        def rand_str(n=20):
+            return "".join(
+                    chr(65+randrange(26))
+                    for i in range(n))
+
+        keys = [(randrange(2000), rand_str(), None) for i in range(20)]
+        values = [randrange(2000) for i in range(20)]
+
+        d = dict(list(zip(keys, values)))
+
+        # {{{ check lookup
+
+        for k, v in zip(keys, values):
+            pdict[k] = v
+
+        for k, v in d.items():
+            assert d[k] == pdict[k]
+
+        # }}}
+
+        # {{{ check updating
+
+        for k, v in zip(keys, values):
+            pdict[k] = v + 1
+
+        for k, v in d.items():
+            assert d[k] + 1 == pdict[k]
+
+        # }}}
+
+        # {{{ check store_if_not_present
+
+        for k, v in zip(keys, values):
+            pdict.store_if_not_present(k, d[k] + 2)
+
+        for k, v in d.items():
+            assert d[k] + 1 == pdict[k]
+
+        pdict.store_if_not_present(2001, 2001)
+        assert pdict[2001] == 2001
+
+        # }}}
+
+        # check not found
+
+        with pytest.raises(NoSuchEntryError):
+            pdict[3000]
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_persistent_dict_deletion():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = PersistentDict("pytools-test", container_dir=tmpdir)
+
+        pdict[0] = 0
+        del pdict[0]
+
+        with pytest.raises(NoSuchEntryError):
+            pdict[0]
+
+        with pytest.raises(NoSuchEntryError):
+            del pdict[1]
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_persistent_dict_synchronization():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict1 = PersistentDict("pytools-test", container_dir=tmpdir)
+        pdict2 = PersistentDict("pytools-test", container_dir=tmpdir)
+
+        # check lookup
+        pdict1[0] = 1
+        assert pdict2[0] == 1
+
+        # check updating
+        pdict1[0] = 2
+        assert pdict2[0] == 2
+
+        # check deletion
+        del pdict1[0]
+        with pytest.raises(NoSuchEntryError):
+            pdict2[0]
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_persistent_dict_cache_collisions():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = PersistentDict("pytools-test", container_dir=tmpdir)
+
+        key1 = PDictTestingKeyOrValue(1, hash_key=0)
+        key2 = PDictTestingKeyOrValue(2, hash_key=0)
+
+        pdict[key1] = 1
+
+        # check lookup
+        with pytest.warns(UserWarning):
+            with pytest.raises(NoSuchEntryError):
+                pdict[key2]
+
+        # check deletion
+        with pytest.warns(UserWarning):
+            with pytest.raises(NoSuchEntryError):
+                del pdict[key2]
+
+        # check presence after deletion
+        assert pdict[key1] == 1
+
+        # check store_if_not_present
+        pdict.store_if_not_present(key2, 2)
+        assert pdict[key1] == 1
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_persistent_dict_clear():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = PersistentDict("pytools-test", container_dir=tmpdir)
+
+        pdict[0] = 1
+        pdict[0]
+        pdict.clear()
+
+        with pytest.raises(NoSuchEntryError):
+            pdict[0]
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+@pytest.mark.parametrize("in_mem_cache_size", (0, 256))
+def test_write_once_persistent_dict_storage_and_lookup(in_mem_cache_size):
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = WriteOncePersistentDict(
+                "pytools-test", container_dir=tmpdir,
+                in_mem_cache_size=in_mem_cache_size)
+
+        # check lookup
+        pdict[0] = 1
+        assert pdict[0] == 1
+        # do two lookups to test the cache
+        assert pdict[0] == 1
+
+        # check updating
+        with pytest.raises(ReadOnlyEntryError):
+            pdict[0] = 2
+
+        # check not found
+        with pytest.raises(NoSuchEntryError):
+            pdict[1]
+
+        # check store_if_not_present
+        pdict.store_if_not_present(0, 2)
+        assert pdict[0] == 1
+        pdict.store_if_not_present(1, 1)
+        assert pdict[1] == 1
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_write_once_persistent_dict_lru_policy():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = WriteOncePersistentDict(
+                "pytools-test", container_dir=tmpdir, in_mem_cache_size=3)
+
+        pdict[1] = PDictTestingKeyOrValue(1)
+        pdict[2] = PDictTestingKeyOrValue(2)
+        pdict[3] = PDictTestingKeyOrValue(3)
+        pdict[4] = PDictTestingKeyOrValue(4)
+
+        val1 = pdict[1]
+
+        assert pdict[1] is val1
+        pdict[2]
+        assert pdict[1] is val1
+        pdict[2]
+        pdict[3]
+        assert pdict[1] is val1
+        pdict[2]
+        pdict[3]
+        pdict[2]
+        assert pdict[1] is val1
+        pdict[2]
+        pdict[3]
+        pdict[4]
+        assert pdict[1] is not val1
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_write_once_persistent_dict_synchronization():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict1 = WriteOncePersistentDict("pytools-test", container_dir=tmpdir)
+        pdict2 = WriteOncePersistentDict("pytools-test", container_dir=tmpdir)
+
+        # check lookup
+        pdict1[1] = 0
+        assert pdict2[1] == 0
+
+        # check updating
+        with pytest.raises(ReadOnlyEntryError):
+            pdict2[1] = 1
+
+    finally:
+        shutil.rmtree(tmpdir)
+
+
+def test_write_once_persistent_dict_cache_collisions():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = WriteOncePersistentDict("pytools-test", container_dir=tmpdir)
 
-def test_persistent_dict():
-    from pytools.persistent_dict import PersistentDict
-    pdict = PersistentDict("pytools-test")
-    pdict.clear()
+        key1 = PDictTestingKeyOrValue(1, hash_key=0)
+        key2 = PDictTestingKeyOrValue(2, hash_key=0)
+        pdict[key1] = 1
 
-    from random import randrange
+        # check lookup
+        with pytest.warns(UserWarning):
+            with pytest.raises(NoSuchEntryError):
+                pdict[key2]
 
-    def rand_str(n=20):
-        return "".join(
-                chr(65+randrange(26))
-                for i in range(n))
+        # check update
+        with pytest.raises(ReadOnlyEntryError):
+            pdict[key2] = 1
 
-    keys = [(randrange(2000), rand_str(), None) for i in range(20)]
-    values = [randrange(2000) for i in range(20)]
+        # check store_if_not_present
+        pdict.store_if_not_present(key2, 1)
+        assert pdict[key1] == 1
 
-    d = dict(list(zip(keys, values)))
+    finally:
+        shutil.rmtree(tmpdir)
 
-    for k, v in zip(keys, values):
-        pdict[k] = v
-        pdict.store(k, v, info_files={"hey": str(v)})
 
-    for k, v in list(d.items()):
-        assert d[k] == pdict[k]
+def test_write_once_persistent_dict_clear():
+    try:
+        tmpdir = tempfile.mkdtemp()
+        pdict = WriteOncePersistentDict("pytools-test", container_dir=tmpdir)
 
-    for k, v in zip(keys, values):
-        pdict.store(k, v+1, info_files={"hey": str(v)})
+        pdict[0] = 1
+        pdict[0]
+        pdict.clear()
 
-    for k, v in list(d.items()):
-        assert d[k] + 1 == pdict[k]
+        with pytest.raises(NoSuchEntryError):
+            pdict[0]
+    finally:
+        shutil.rmtree(tmpdir)
 
 
 if __name__ == "__main__":

