[issue32937] Multiprocessing worker functions not terminating with a large number of processes and a manager

2018-02-24 Thread Eric Gorr

New submission from Eric Gorr <ericg...@gmail.com>:

I have the following code:

import multiprocessing
from multiprocessing import Pool, Manager
import time
import random

def worker_function( index, messages ):

    print( "%d: Entered" % index )
    time.sleep( random.randint( 3, 15 ) )
    messages.put( "From: %d" % index )
    print( "%d: Exited" % index )

manager = Manager()
messages = manager.Queue()

with Pool( processes = None ) as pool:

    for x in range( 30 ):
        pool.apply_async( worker_function, [ x, messages ] )

    pool.close()
    pool.join()

It does not terminate -- all entered messages are printed, but not all exited 
messages are printed.

If I remove all the code related to the Manager and Queue, it will terminate 
properly with all messages printed.

If I assign processes explicitly, I can keep increasing the value and have it 
continue to work until I reach 20 or 21: with a value greater than 20 it fails 
every time, with a value of exactly 20 it fails some of the time, and with a 
value below 20 it always succeeds.

multiprocessing.cpu_count() returns 24 for my MacPro.
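One way to sidestep the Manager entirely, assuming the goal is simply to collect one message per worker, is to return the message from the worker and gather the AsyncResult objects. This is a sketch of a workaround, not a confirmed fix for the underlying hang; collect_messages and its parameters are illustrative names, and get(timeout=...) turns a silent hang into a visible error:

```python
import multiprocessing
from multiprocessing import Pool

def worker_function(index):
    # Return the message instead of routing it through a manager.Queue().
    return "From: %d" % index

def collect_messages(n, processes=None):
    # Gather results via AsyncResult.get() rather than a managed queue;
    # get() raises multiprocessing.TimeoutError if a worker never finishes,
    # so a hang becomes a visible error instead of a stuck join().
    with Pool(processes=processes) as pool:
        results = [pool.apply_async(worker_function, [x]) for x in range(n)]
        return [r.get(timeout=30) for r in results]

if __name__ == "__main__":
    print(collect_messages(8, processes=4))
```

Because the results list preserves submission order, the messages come back in order even though the workers run concurrently.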

--
components: Library (Lib), macOS
messages: 312718
nosy: Eric Gorr, ned.deily, ronaldoussoren
priority: normal
severity: normal
status: open
title: Multiprocessing worker functions not terminating with a large number of 
processes and a manager
versions: Python 3.6

___
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue32937>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue24301] gzip module failing to decompress valid compressed file

2015-05-27 Thread Eric Gorr

New submission from Eric Gorr:

I have a file whose first four bytes are 1F 8B 08 00 and if I use gunzip from 
the command line, it outputs:

gzip: zImage_extracted.gz: decompression OK, trailing garbage ignored

and correctly decompresses the file. However, if I use the gzip module to read 
and decompress the data, I get the following exception thrown:

  File "/usr/lib/python3.4/gzip.py", line 360, in read
    while self._read(readsize):
  File "/usr/lib/python3.4/gzip.py", line 433, in _read
    if not self._read_gzip_header():
  File "/usr/lib/python3.4/gzip.py", line 297, in _read_gzip_header
    raise OSError('Not a gzipped file')

I believe the problem I am facing is the same one described here in this SO 
question and answer:

http://stackoverflow.com/questions/4928560/how-can-i-work-with-gzip-files-which-contain-extra-data


This would appear to be a serious bug in the gzip module that needs to be fixed.
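The common workaround from that SO thread is to decompress the first gzip member manually with zlib, which simply ignores anything after the end of the compressed stream. This is a sketch assuming the file holds one valid member followed by trailing padding; decompress_first_member is an illustrative name, and the payload here is simulated rather than the actual zImage_extracted.gz:

```python
import gzip
import zlib

def decompress_first_member(data):
    # wbits = 32 + 15 makes zlib auto-detect the gzip/zlib header and stop
    # at the end of the first compressed stream, so any trailing garbage
    # after the member is ignored (it ends up in d.unused_data).
    d = zlib.decompressobj(wbits=32 + 15)
    return d.decompress(data)

# Simulated payload: a valid gzip member followed by trailing garbage,
# like the file described above.
payload = gzip.compress(b"hello world") + b"\x00" * 16
print(decompress_first_member(payload))
```

Unlike gzip.open(), this never looks at the bytes past the first member, so the 'Not a gzipped file' header check is never triggered by the garbage.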

--
components: Extension Modules
messages: 244188
nosy: Eric Gorr
priority: normal
severity: normal
status: open
title: gzip module failing to decompress valid compressed file
type: crash
versions: Python 3.4

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue24301
___