Hi Nathanael,

Thank you for the report! I entered a bug report (HDFFV-8629) and we will 
investigate. 

Elena
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal  The HDF Group  http://hdfgroup.org   
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~




On Nov 22, 2013, at 9:49 AM, huebbe <[email protected]> wrote:

> Hello Elena,
> I have just finished adapting our MAFISC plugin to the new plugin scheme
> (it's available at the old site:
> http://wr.informatik.uni-hamburg.de/research/projects/icomex/mafisc).
> However, during my testing I came across this little problem:
> 
> I can compress a file with
>       h5repack --filter=UD=32002,1,0 source.h5 dest.h5
> 
> I can also read it with
>       h5dump dest.h5 | less
> 
> I can copy it with
>       h5repack dest.h5 dest2.h5
> 
> But I cannot uncompress it with
>       h5repack --filter=NONE dest.h5 unpacked.h5
> 
> because --filter=NONE _deactivates_ dynamically loaded filters while
> reading the data. The following error is displayed:
>       warning: dataset </faked and compressed data> cannot be read,
>       user defined filter is not available
> 
> I believe the semantics of --filter=NONE should be changed so that
> h5repack uses any filters necessary to read the data, but writes the
> data unfiltered.
> 
> Cheers,
> Nathanael
> 
> 
> 
> On 11/20/2013 03:36 AM, Elena Pourmal wrote:
>> Hi Richard and Nathanael,
>> 
>> Please check this document 
>> http://www.hdfgroup.org/HDF5/doc/Advanced/DynamicallyLoadedFilters/HDF5DynamicallyLoadedFilters.pdf
>> 
>> It points to the bzip2 example that you could follow. Let [email protected] 
>> know if you have any questions after reading the document.
>> 
>> Richard,
>> 
>> You should use the filter number for LZF that is registered with The HDF 
>> Group http://www.hdfgroup.org/services/contributions.html#filters 
>> 
>> Good luck :-)
>> 
>> Elena
>> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>> Elena Pourmal  The HDF Group  http://hdfgroup.org   
>> 1800 So. Oak St., Suite 203, Champaign IL 61820
>> 217.531.6112
>> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>> 
>> 
>> 
>> 
>> On Nov 18, 2013, at 10:47 AM, Richard van Hees <[email protected]> wrote:
>> 
>>> I have the same question, but a bit more specific: how can I create a 
>>> plugin for LZF compression and install it so that h5ls and h5dump 
>>> can use it?
>>> 
>>> Greetings, Richard
>>> 
>>> On 11/18/2013 05:30 PM, huebbe wrote:
>>>> I understand that you have now added a plugin mechanism, but I can't
>>>> find documentation for it. Is there already documentation on that
>>>> feature (detailing what to do to write a plugin, for instance), and
>>>> where can I find it?
>>>> 
>>>> Cheers,
>>>> Nathanael
>>>> 
>>>> 
>>>> 
>>> 
>>> 
>>> _______________________________________________
>>> Hdf-forum is for HDF software users discussion.
>>> [email protected]
>>> http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
>> 
>> 
> 
> 

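For anyone following along, the workflow Nathanael describes can be sketched as below. This assumes the MAFISC plugin has already been built, and that HDF5_PLUGIN_PATH (the documented search-path variable for dynamically loaded HDF5 filter plugins) points at the directory containing it; the file names and plugin path are illustrative only.

```shell
# Point the HDF5 tools at the directory holding the compiled filter plugin
# (HDF5_PLUGIN_PATH is the documented plugin search-path variable).
export HDF5_PLUGIN_PATH=/path/to/mafisc/plugin

# Compress with the MAFISC filter (its registered filter id is 32002)
h5repack --filter=UD=32002,1,0 source.h5 dest.h5

# Reading and copying work, because the plugin is loaded on demand
h5dump dest.h5 | less
h5repack dest.h5 dest2.h5

# This step fails as reported: --filter=NONE also disables the
# dynamically loaded filter that is needed to *read* the compressed data
h5repack --filter=NONE dest.h5 unpacked.h5
```

The last command is the bug being reported (HDFFV-8629): removing filters on output should not prevent the tool from using a user-defined filter on input.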
