Hi Quincey,

On Tuesday 27 November 2007, Quincey Koziol wrote:
[snip]
> > While I understand that this approach is suboptimal
> > (2*2*600*100 = 240,000 chunks have to be updated for each update
> > operation in the loop), I don't completely understand why the user
> > reports that the script is consuming so much memory (the script
> > crashes, but it is perhaps asking for several GB).  My guess is
> > that HDF5 is trying to load all the affected chunks into memory
> > before updating them, but I thought it best to report this here in
> > case this is a bug or, if not, in case the huge memory demand can
> > be somewhat alleviated.
>
>       Is this with the 1.6.x library code?  If so, it would be worthwhile
> checking with the 1.8.0 code, which is designed to do all the I/O on
> each chunk at once and then proceed to the next chunk.  However, it
> does build information about the selection on each chunk to be
> updated, and if the I/O operation will update 240,000 chunks, that
> could be a large amount of memory...
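
For list readers without the original script, the pattern under
discussion is roughly the sketch below.  The file name, dataset shape
and chunk shape are my own guesses (the user's script was not
posted); I have only picked them so that one slice assignment spans
2*2*600*100 = 240,000 chunks and writes 2*2*1200*300 = 1,440,000
elements, matching the figures in this thread:

import numpy as np
import tables

# A minimal sketch of the assumed access pattern; every shape here is
# a guess.  Each chunk holds only 1*1*1*2*3 = 6 float32 elements, so
# updating one slab touches a huge number of chunks.
fileh = tables.openFile("demo.h5", mode="w")
carr = fileh.createCArray(fileh.root, "data",
                          atom=tables.Float32Atom(),
                          shape=(10, 2, 2, 1200, 300),  # assumed shape
                          chunkshape=(1, 1, 1, 2, 3))   # assumed, tiny
block = np.zeros((2, 2, 1200, 300), dtype="float32")
for i in range(10):
    # Each assignment is a single H5Dwrite() whose selection
    # intersects 2*2*600*100 = 240,000 chunks (1,440,000 elements).
    carr[i] = block
fileh.close()

Even modest per-chunk bookkeeping adds up at this scale: at, say, 100
bytes of selection information per chunk that would be ~24 MB per
write call, and at 1 KB per chunk it would already be ~240 MB, which
could explain a crash on a small machine.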

Yes, this was using the 1.6.x library.  I've directed the user to
compile PyTables against the latest 1.8.0 (beta5) library (configured
with the "--with-default-api-version=v16" flag), but he is reporting
problems.  Here is the relevant excerpt of the traceback:

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread -1210186064 (LWP 12304)]
0xb7b578b1 in H5S_close (ds=0xbfb11178) at H5S.c:464
464         H5S_SELECT_RELEASE(ds);
(gdb) bt
#0  0xb7b578b1 in H5S_close (ds=0xbfb11178) at H5S.c:464
#1  0xb7a0ab4e in H5D_destroy_chunk_map (fm=0xbfb0fff8) at H5Dio.c:2651
#2  0xb7a0b04c in H5D_create_chunk_map (fm=0xbfb0fff8, 
    io_info=<value optimized out>, nelmts=1440000, file_space=0x84bd140, 
    mem_space=0x84b40f0, mem_type=0x8363000) at H5Dio.c:2556
#3  0xb7a0cd1a in H5D_chunk_write (io_info=0xbfb13c24, nelmts=1440000, 
    mem_type=0x8363000, mem_space=0x84b40f0, file_space=0x84bd140, 
    tpath=0x8363e30, src_id=50331970, dst_id=50331966, buf=0xb57b8008)
    at H5Dio.c:1765
#4  0xb7a106f9 in H5D_write (dataset=0x840a418, mem_type_id=50331970, 
    mem_space=0x84b40f0, file_space=0x84bd140, dxpl_id=167772168, 
    buf=0xb57b8008) at H5Dio.c:732
#5  0xb7a117aa in H5Dwrite (dset_id=83886080, mem_type_id=50331970, 
    mem_space_id=67108874, file_space_id=67108875, plist_id=167772168, 
    buf=0xb57b8008) at H5Dio.c:434

We don't have time right now to look into it, but it could be a
problem in the PyTables code (although, if
the "--with-default-api-version=v16" flag is working properly, this
should not be the case).  Judging from the backtrace, the segfault
happens in H5S_close() while H5D_destroy_chunk_map() (called from
H5D_create_chunk_map(), presumably on an error path) is releasing the
per-chunk dataspaces for a 1,440,000-element write.  It is strange,
because PyTables used to work perfectly with HDF5 up to 1.8.0 beta3
(i.e. all tests passed).
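
In case anyone wants to reproduce this: before chasing the segfault,
it is worth double-checking which HDF5 library the PyTables extension
actually linked against, with something like:

import tables
# Prints the versions of PyTables and of the libraries it was built
# against (HDF5 among them); handy for confirming that the 1.8.0 beta,
# and not a stale 1.6.x install, is the one being loaded.
tables.print_versions()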

If we make more progress on this issue, I'll let you know.

Thanks!

-- 
>0,0<   Francesc Altet     http://www.carabos.com/
V   V   Cárabos Coop. V.   Enjoy Data
 "-"
