I tried a few different tests:

1. Adding some filters (SHUF, GZIP) appears to have failed silently.
>$ h5repack -f SHUF -f GZIP=1 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.shuf.gzip1.h5
No error was reported, but the data was not accessible. The h5dump output is in the attachment.


2. I attempted to change the layout to CONTI (contiguous), but it also failed silently.
>$ h5repack -l CONTI h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.conti.h5
Same as test #1: no error reported, but the data was not accessible. The h5dump output is in the attachment.


3. I attempted to change the layout to CHUNK=100, but the resulting file appears identical to the original. h5dump doesn't indicate that the layout is now chunked, and the dataset is still virtual:
>$ h5repack -l CHUNK=100 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.chunk100.h5
No errors, but the h5dump output is identical to the original.


4. Same result with CHUNK=100 and the SHUF, GZIP filters.
>$ h5repack -l CHUNK=100 -f SHUF -f GZIP=1 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.chunk100.shuf.gzip.h5
Same result as test #3: no errors, but the h5dump output is identical to the original.


5. I also (unwisely) attempted to change the layout to CONTI and add the SHUF, GZIP filters, but realized that you can't add these filters with a contiguous layout. I suppose this error is expected.

HDF5-DIAG: Error detected in HDF5 (1.10.0) thread 0:
  #000: H5Pdcpl.c line 2009 in H5Pset_chunk(): chunk dimensionality must be positive
    major: Invalid arguments to routine
    minor: Out of range
h5repack error: <h5g_output_parallel.global.100.10.h5>: Could not copy data to: h5g_output_parallel.conti.shuf.gzip1.h5

The resulting files and "h5dump -p" output are in the attachment (with the exception of the CONTI/SHUF/GZIP test).

The original file was created with HDF5 1.10.0 and contains a single VDS whose source datasets are spread across several files (attached).
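For context, here is a minimal sketch (not the actual generator, which is in the attachment) of how a VDS of this shape might be created with the HDF5 1.10 virtual dataset API. The dimensions, source file names, and dataset names below are illustrative assumptions, not the ones in the attached files. It needs to be compiled against libhdf5 (e.g. with h5cc).

```c
/* Hypothetical sketch: map 10 source datasets of 10 elements each,
 * one per source file, into a 100-element virtual dataset. */
#include <hdf5.h>
#include <stdio.h>

int main(void)
{
    hsize_t vdims[1] = {100};  /* full virtual extent (assumed) */
    hsize_t sdims[1] = {10};   /* extent of each source dataset (assumed) */
    hid_t vspace = H5Screate_simple(1, vdims, NULL);
    hid_t sspace = H5Screate_simple(1, sdims, NULL);
    hid_t dcpl   = H5Pcreate(H5P_DATASET_CREATE);

    /* One mapping per source file: consecutive 10-element blocks. */
    char srcfile[64];
    for (int i = 0; i < 10; i++) {
        hsize_t start[1] = {(hsize_t)(i * 10)};
        hsize_t count[1] = {10};
        snprintf(srcfile, sizeof srcfile, "source_%d.h5", i); /* names assumed */
        H5Sselect_hyperslab(vspace, H5S_SELECT_SET, start, NULL, count, NULL);
        H5Pset_virtual(dcpl, vspace, srcfile, "/data", sspace);
    }

    hid_t file = H5Fcreate("vds.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t dset = H5Dcreate2(file, "/vds", H5T_NATIVE_DOUBLE, vspace,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    H5Dclose(dset); H5Fclose(file);
    H5Pclose(dcpl); H5Sclose(sspace); H5Sclose(vspace);
    return 0;
}
```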

>$ h5repack --version
h5repack: Version 1.10.0
>$ h5dump --version
h5dump: Version 1.10.0

My suspicion is that what I'm attempting to do is not (yet?) supported.
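If that's the case, a manual workaround may be to read through the VDS (the library resolves the source mappings on read) and rewrite the data as an ordinary dataset in a new file. This is only a hedged sketch: the dataset name "/vds", the double element type, and a one-dimensional extent are assumptions about the attached file, and it needs libhdf5 to build.

```c
/* Hypothetical "un-virtualize" workaround: read the virtual dataset
 * into memory, then write it to a plain contiguous dataset. */
#include <hdf5.h>
#include <stdlib.h>

int main(void)
{
    hid_t in    = H5Fopen("h5g_output_parallel.global.100.10.h5",
                          H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t dset  = H5Dopen2(in, "/vds", H5P_DEFAULT); /* name assumed */
    hid_t space = H5Dget_space(dset);
    hsize_t dims[1];
    H5Sget_simple_extent_dims(space, dims, NULL);    /* 1-D assumed */

    double *buf = malloc(dims[0] * sizeof *buf);
    H5Dread(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    /* Default DCPL gives a plain (non-virtual) contiguous layout. */
    hid_t out   = H5Fcreate("unvirtualized.h5", H5F_ACC_TRUNC,
                            H5P_DEFAULT, H5P_DEFAULT);
    hid_t odset = H5Dcreate2(out, "/data", H5T_NATIVE_DOUBLE, space,
                             H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Dwrite(odset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    free(buf);
    H5Dclose(odset); H5Fclose(out);
    H5Sclose(space); H5Dclose(dset); H5Fclose(in);
    return 0;
}
```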

Jarom

From: Hdf-forum [mailto:[email protected]] On Behalf Of 
Miller, Mark C.
Sent: Tuesday, April 05, 2016 3:17 PM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] h5repack on files with VDS

I honestly don't know. But if you have a small file with VDS datasets in it, maybe give it a try and see what happens.

Mark


From: Hdf-forum <[email protected]> on behalf of "Nelson, Jarom" <[email protected]>
Reply-To: HDF Users Discussion List <[email protected]>
Date: Tuesday, April 5, 2016 3:06 PM
To: "[email protected]" <[email protected]>
Subject: [Hdf-forum] h5repack on files with VDS

Can h5repack be used to un-virtualize a VDS?

Jarom Nelson
Lawrence Livermore National Lab

Attachment: h5repack.vds.tar.gz

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
