Hello,

I have ~100 512x512x3 NumPy arrays that I currently save on disk as PNG
images. If I were to put all this data into a single HDF5 file, would
reading only 1-3 of these arrays incur a full read of the entire HDF5
file from disk?

My problem is that the file server has trouble serving up many small files.
If I pack all these small files into a single HDF5 file, the read overhead
should be reduced. However, if reading a single array causes the entire
dataset to be read, then what I am doing is counter-productive.
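For what it's worth, here is a minimal sketch of what I have in mind, using PyTables' chunked CArray so that each image lands in its own chunk (the file name `images.h5` and the small 8x8x3 stand-in shape are just placeholders for illustration; HDF5 should then read only the chunks for the requested rows, not the whole dataset):

```python
import numpy as np
import tables

# Stand-ins for the real 512x512x3 images, kept small for the example.
rng = np.random.default_rng(0)
arrays = rng.integers(0, 256, size=(100, 8, 8, 3), dtype=np.uint8)

with tables.open_file("images.h5", mode="w") as f:
    # chunkshape=(1, ...) makes each array its own HDF5 chunk, so a
    # read of one row should touch only that chunk on disk.
    carr = f.create_carray(f.root, "images",
                           atom=tables.UInt8Atom(),
                           shape=arrays.shape,
                           chunkshape=(1,) + arrays.shape[1:])
    carr[:] = arrays

with tables.open_file("images.h5", mode="r") as f:
    subset = f.root.images[0:3]  # pulls in only the first three chunks
```

Is this roughly the right approach, or is there a better layout for this access pattern?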

With warm regards,

Dat Chu
_______________________________________________
Pytables-users mailing list
Pytables-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/pytables-users
