Hi Pauli,

This is Quincey's new answer in the 'wrong' chunksize thread.  
If you can come up with a standalone C program that reproduces the problem 
with HDF5 1.8.0-beta5, that would be great.  If not, I'll try to do it 
later today.

Thanks!

----------  Forwarded message  ----------

Subject: Re: Writing to a dataset with 'wrong' chunksize
Date: Thursday 29 November 2007
From: Quincey Koziol <[EMAIL PROTECTED]>
To: Francesc Altet <[EMAIL PROTECTED]>

Hi Francesc,

On Nov 28, 2007, at 10:44 AM, Francesc Altet wrote:

> Hi Quincey,
>
> On Tuesday 27 November 2007, Quincey Koziol wrote:
> [snip]
>>> While I understand that this approach is suboptimal
>>> (2*2*600*100 = 240,000 chunks have to be 'updated' for each update
>>> operation in the loop), I don't completely understand why the user
>>> reports that the script is consuming so much memory (the script
>>> crashes, but perhaps it is asking for several GB).  My guess is that
>>> HDF5 is trying to load all the affected chunks in memory before
>>> updating them, but I thought it best to report this here in case it
>>> is a bug or, if not, in case the huge memory demand can be somewhat
>>> alleviated.
>>
>>      Is this with the 1.6.x library code?  If so, it would be worthwhile
>> checking with the 1.8.0 code, which is designed to do all the I/O on
>> each chunk at once and then proceed to the next chunk.  However, it
>> does build information about the selection on each chunk to update
>> and if the I/O operation will update 240,000 chunks, that could be a
>> large amount of memory...
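A quick back-of-envelope sketch of the scenario above (the per-chunk overhead figure is purely an assumption for illustration; the actual size of HDF5's per-chunk selection metadata isn't stated in this thread):

```python
# Chunks touched by one update operation, from the numbers above.
n_chunks = 2 * 2 * 600 * 100

# HYPOTHETICAL per-chunk cost of the selection info HDF5 builds;
# chosen only to show how quickly the total grows with chunk count.
assumed_bytes_per_chunk = 512

total_mb = n_chunks * assumed_bytes_per_chunk / 2**20

print(n_chunks)         # 240000
print(round(total_mb))  # 117 (MB, under this assumption)
```

Even a modest few hundred bytes of bookkeeping per chunk adds up to a nontrivial allocation when a single write touches 240,000 chunks.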
>
> Yes, this was using 1.6.x library.  I've directed the user to compile
> PyTables with the latest 1.8.0 (beta5) library (with
> the "--with-default-api-version=v16" flag) but he is reporting
> problems.  Here is the relevant excerpt of the traceback:
>
> Program received signal SIGSEGV, Segmentation fault.
> [Switching to Thread -1210186064 (LWP 12304)]
> 0xb7b578b1 in H5S_close (ds=0xbfb11178) at H5S.c:464
> 464         H5S_SELECT_RELEASE(ds);
> (gdb) bt
> #0  0xb7b578b1 in H5S_close (ds=0xbfb11178) at H5S.c:464
> #1  0xb7a0ab4e in H5D_destroy_chunk_map (fm=0xbfb0fff8) at H5Dio.c:2651
> #2  0xb7a0b04c in H5D_create_chunk_map (fm=0xbfb0fff8,
>     io_info=<value optimized out>, nelmts=1440000, file_space=0x84bd140,
>     mem_space=0x84b40f0, mem_type=0x8363000) at H5Dio.c:2556
> #3  0xb7a0cd1a in H5D_chunk_write (io_info=0xbfb13c24, nelmts=1440000,
>     mem_type=0x8363000, mem_space=0x84b40f0, file_space=0x84bd140,
>     tpath=0x8363e30, src_id=50331970, dst_id=50331966, buf=0xb57b8008)
>     at H5Dio.c:1765
> #4  0xb7a106f9 in H5D_write (dataset=0x840a418, mem_type_id=50331970,
>     mem_space=0x84b40f0, file_space=0x84bd140, dxpl_id=167772168,
>     buf=0xb57b8008) at H5Dio.c:732
> #5  0xb7a117aa in H5Dwrite (dset_id=83886080, mem_type_id=50331970,
>     mem_space_id=67108874, file_space_id=67108875, plist_id=167772168,
>     buf=0xb57b8008) at H5Dio.c:434
>
> We don't have time right now to look into it, but it could be a
> problem with PyTables code (although, if the
> "--with-default-api-version=v16" flag is working properly, this should
> not be the case).  It is strange, because PyTables used to work
> perfectly up to HDF5 1.8.0 beta3 (i.e. all tests passed).

        Hmm, I have been working on that section of code a lot; it's
certainly possible that I've introduced a bug. :-/

> If we make more progress on this issue, I'll let you know.

        If you can characterize it in a standalone program, that would be  
really great!

        Thanks,
                Quincey

>
> Thanks!
>
> -- 
>> 0,0<   Francesc Altet     http://www.carabos.com/
> V   V   Cárabos Coop. V.   Enjoy Data
>  "-"
>
> ----------------------------------------------------------------------
> This mailing list is for HDF software users discussion.
> To subscribe to this list, send a message to hdf-forum- 
> [EMAIL PROTECTED]
> To unsubscribe, send a message to [EMAIL PROTECTED]
>
>



-- 
>0,0<   Francesc Altet     http://www.carabos.com/
V   V   Cárabos Coop. V.   Enjoy Data
 "-"

_______________________________________________
Pytables-users mailing list
Pytables-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/pytables-users
