On 06/09/2013 08:14, Nick Coghlan wrote:
> On 6 September 2013 15:50, Chris Withers <ch...@simplistix.co.uk> wrote:
>> Continuous testing is a wonderful thing when it comes to finding weird
>> edge case problems, like this one:
>> [... snip ...]
>>     os.rmdir(path)
>> OSError: [WinError 145] The directory is not empty:
>> 'c:\\users\\jenkins\\appdata\\local\\temp\\tmpkeg4d7\\a'
>
> This feels a lot like an issue we were seeing on the Windows
> buildbots, which we ended up working around in the test support
> library: http://bugs.python.org/issue15496
>
> That would be some awfully ugly code to upgrade from "hack in the test
> support library" to "this is just how Python unlinks files on
> Windows", though :P

I think that any kind of delay-retry loop falls squarely within the remit of the calling application, not core Python. This isn't a problem of Python's making: IIUC, you would see the same effect from any other language, or if you simply deleted the folder from within Explorer. (I don't know whether Explorer itself does anything canny under the covers to retry.) Obviously our test suite *is* a calling application, so it makes perfect sense to put a workaround in place there.

The trouble with this class of problem, where a share-delete handle allows operations to succeed now which would normally fail early, and to fail later instead, is that the bad effect is at one remove from its cause. Here, by the time rmtree has reached the point of removing a parent directory, it has long since left behind the file which still has an open handle: the DeleteFile succeeded and the code moved on. You can't even tell which file it was.

A related problem arises when a DeleteFile succeeds but a subsequent CreateFile fails for the same filepath, again because a share-delete handle is still open for a file at that path. This is another one which hits our test suite, because of an overuse of one temp filename.

What should Python do? With some effort it could look for open file handles against the file it's trying to delete, but what then? Wait until they're all closed? That could leave it hanging. And even with a timeout it would introduce a delay which might be unnecessary.
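For illustration, here is a minimal sketch of the kind of delay-retry loop that calling code (such as a test suite) might use. The helper name `rmtree_with_retry` and the attempt/back-off parameters are my own invention, not from the support library referenced above; the idea is simply to retry the whole removal with an increasing pause until the lingering share-delete handle has gone away.

```python
import os
import shutil
import tempfile
import time

def rmtree_with_retry(path, attempts=5, delay=0.1):
    """Remove a directory tree, retrying on transient OSError.

    On Windows, DeleteFile only marks a file for deletion while a
    share-delete handle is still open; a later os.rmdir of the parent
    can then fail with WinError 145 ("directory is not empty").
    Retrying with a short exponential back-off gives the other handle
    time to close. Hypothetical helper for illustration only.
    """
    for attempt in range(attempts):
        try:
            shutil.rmtree(path)
            return
        except OSError:
            if attempt == attempts - 1:
                raise  # still failing after all attempts: give up
            time.sleep(delay)
            delay *= 2

# Usage: behaves like shutil.rmtree, but tolerant of transient failures.
d = tempfile.mkdtemp()
os.mkdir(os.path.join(d, "a"))
rmtree_with_retry(d)
print(os.path.exists(d))  # False
```

Note that this only papers over the timing issue; it cannot identify *which* file still has an open handle, which is exactly the diagnostic difficulty described above.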
A lot of the time, no harm will come of the file existing a few seconds after the DeleteFile has succeeded.

In short, I don't believe there's any mileage in introducing extra code into Python's os or io modules. It falls on the shoulders of the calling code to implement retry loops or whatever other logic it needs.

TJG

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com