Changes by Serhiy Storchaka storch...@gmail.com:
--
resolution:  -> fixed
stage: patch review -> resolved
status: open -> closed
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue23191
___
Roundup Robot added the comment:
New changeset fe12c34c39eb by Serhiy Storchaka in branch '2.7':
Issue #23191: fnmatch functions that use caching are now threadsafe.
https://hg.python.org/cpython/rev/fe12c34c39eb
--
nosy: +python-dev
___
Changes by Serhiy Storchaka storch...@gmail.com:
--
assignee:  -> serhiy.storchaka
___
New submission from M. Schmitzer:
The way the fnmatch module uses its regex cache is not threadsafe. When
multiple threads use the module in parallel, a race condition can occur between
retrieving a (presumed present) item from the cache and clearing the cache
(because the maximum size has been exceeded), which can raise a spurious KeyError.
STINNER Victor added the comment:
I guess that a lot of stdlib modules are not thread safe :-/ A workaround is to
protect calls to fnmatch with your own lock.
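The suggested workaround could look like the following sketch (the wrapper and lock names are illustrative, not part of any API): every call goes through one application-level lock, so the cache check, the possible clear, and the insert never interleave across threads.

```python
import fnmatch
import threading

# One process-wide lock guarding all fnmatch calls in the application.
_fnmatch_lock = threading.Lock()

def locked_fnmatch(name, pattern):
    # Serializing the calls makes the module's internal cache handling
    # effectively single-threaded.
    with _fnmatch_lock:
        return fnmatch.fnmatch(name, pattern)
```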
--
nosy: +haypo
___
M. Schmitzer added the comment:
Ok, if that is the attitude in such cases, feel free to close this.
--
___
STINNER Victor added the comment:
It would be nice to fix the issue, but I don't know how it is handled in other
stdlib modules.
--
___
Serhiy Storchaka added the comment:
It is easy to make fnmatch caching thread safe without locks. Here is a patch.
The problem with fnmatch is that the caching is implicit, so a user doesn't know
that any lock is needed. Either the need for a lock should be explicitly
documented, or the caching should be made thread safe.
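A hedged sketch of the lock-free idea (the actual patch lives in the tracker; the names here are illustrative): replace the check-then-lookup pattern with a single lookup plus a KeyError fallback, and keep a local reference to the compiled pattern so a concurrent clear() of the shared dict can no longer invalidate it. Each individual dict operation is atomic under the GIL.

```python
import re

_cache = {}
_MAXCACHE = 100

def _translate(pat):
    # Stand-in for fnmatch.translate(); only '*' is handled in this sketch.
    return '(?s)' + re.escape(pat).replace(r'\*', '.*') + r'\Z'

def fnmatchcase_safe(name, pat):
    try:
        re_pat = _cache[pat]        # one atomic dict lookup
    except KeyError:
        re_pat = re.compile(_translate(pat))
        if len(_cache) >= _MAXCACHE:
            _cache.clear()
        _cache[pat] = re_pat
    # We match against the local reference, so even if another thread
    # clears _cache right now, this call still succeeds.
    return re_pat.match(name) is not None
```

The worst case under contention is a redundant recompilation of the same pattern, which is harmless, rather than a spurious KeyError.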
M. Schmitzer added the comment:
@serhiy.storchaka: My thoughts exactly, especially regarding the caching being
implicit. From the outside, fnmatch really doesn't look like it could have
threading issues.
The patch also looks exactly like what I had in mind.
--