Hi Anthony,
Thank you very much for your answer (it works). I will try to remodel my
code around this trick, but I'm not sure it's possible because I use a
framework that needs arrays.
Can somebody explain what is going on? I was thinking that PyTables keeps a
weakref to the file for lazy
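The advice given below in this thread (open a new file handle inside each worker process, and pass only the filename between processes) can be sketched with standard-library pieces alone. Since PyTables may not be installed everywhere, this hedged illustration substitutes a plain one-column text file for the HDF5 file; the `column_average` helper and the temp-file setup are illustrative, not from the original messages.

```python
import multiprocessing
import os
import tempfile

def column_average(filename):
    # Each worker opens its OWN handle: open file handles should not be
    # shared across process boundaries (the same rule applies to PyTables
    # File objects), so we pass the picklable filename instead and open
    # inside the worker.
    with open(filename) as fh:
        values = [float(line) for line in fh]
    return sum(values) / len(values)

if __name__ == "__main__":
    # Create a small one-column data file (values 0..9, one per line).
    fd, path = tempfile.mkstemp(suffix=".txt")
    with os.fdopen(fd, "w") as fh:
        fh.write("\n".join(str(i) for i in range(10)))
    # map() sends only the filename to each worker, never a handle.
    with multiprocessing.Pool(2) as pool:
        print(pool.map(column_average, [path] * 4))  # [4.5, 4.5, 4.5, 4.5]
    os.remove(path)
```

The key design point is that filenames are picklable while file handles are not, so each process independently opens and closes its own handle.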
On Fri, Jul 12, 2013 at 1:51 AM, Mathieu Dubois duboismathieu_g...@yahoo.fr
wrote:
Hi Anthony,
Thank you very much for your answer (it works). I will try to remodel my
code around this trick, but I'm not sure it's possible because I use a
framework that needs arrays.
I think that this
On 11/07/2013 21:56, Anthony Scopatz wrote:
On Thu, Jul 11, 2013 at 2:49 PM, Mathieu Dubois
duboismathieu_g...@yahoo.fr wrote:
Hello,
I wanted to use PyTables in conjunction with multiprocessing for some
embarrassingly parallel tasks.
Hi Mathieu,
I think you should try opening a new file handle per process. The
following works for me on v3.0:
import tables
import random
import multiprocessing
# Reload the data
# Use multiprocessing to perform a simple computation (column average)
def f(filename):
    h5file =