Adding support for runtime plugins may impose its own portability issues:
one needs to handle library incompatibilities (was the plugin compiled
against the same HDF5 library as the one trying to load it, etc.) and
some additional management schemes (such as a search path of directories
for plugins). All solvable, but it certainly requires some time. But with
HDF5 there might also be the option to embed a filter into the file
itself, as binary, platform-dependent code stored in an "unfiltered"
HDF5 dataset at a predefined location. This could allow an HDF5 file
to bring its own filter with it, precompiled for a set of supported
platforms. Maybe this would make things easier? Just an idea. Of course,
it would require some kind of certification scheme to avoid opening the
path for viruses and trojans, but the same is true of runtime plugins
as well.
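
Roughly, the reading side of such a scheme could look like the sketch
below. This is only to illustrate the idea: the dataset path, the
"register_filter" entry point and the whole layout are made-up
placeholders, and the certification question is ignored entirely.

#include <hdf5.h>
#include <dlfcn.h>
#include <stdlib.h>
#include <unistd.h>

/* Load a filter that the file carries along as raw bytes in a
 * predefined, platform-specific dataset (hypothetical convention,
 * e.g. "/_filters/linux-x86_64"). */
int load_embedded_filter(hid_t file, const char *platform_dset)
{
    /* 1. read the raw plugin bytes from the predefined dataset */
    hid_t dset = H5Dopen2(file, platform_dset, H5P_DEFAULT);
    if (dset < 0) return -1;
    hid_t space = H5Dget_space(dset);
    hssize_t n = H5Sget_simple_extent_npoints(space);
    unsigned char *blob = malloc((size_t)n);
    herr_t rc = H5Dread(dset, H5T_NATIVE_UCHAR, H5S_ALL, H5S_ALL,
                        H5P_DEFAULT, blob);
    H5Sclose(space);
    H5Dclose(dset);
    if (rc < 0) { free(blob); return -1; }

    /* 2. dlopen() wants a file on disk, so spill the bytes out first */
    char path[] = "/tmp/h5embeddedXXXXXX";
    int fd = mkstemp(path);
    if (fd < 0 || write(fd, blob, (size_t)n) != (ssize_t)n) {
        free(blob);
        return -1;
    }
    close(fd);
    free(blob);

    /* 3. load it and let it register itself, just like an ordinary
     *    runtime plugin (see the dlopen discussion below) */
    void *handle = dlopen(path, RTLD_NOW | RTLD_LOCAL);
    unlink(path);                 /* mapping stays valid after unlink */
    if (!handle) return -1;
    int (*reg)(void) = (int (*)(void))dlsym(handle, "register_filter");
    return reg ? reg() : -1;
}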

        Werner


On Thu, 13 Oct 2011 08:11:47 +0200, Mark Miller <[email protected]> wrote:

Well, my recollection is of something that I think Elena Pourmal was
trying to start, where source code for (open source) 3rd party filters
would be submitted to the HDF Group's web pages along with example data
files containing the 'filtered' data and information about the filter,
such as point of contact, purpose, performance characteristics, etc.

I like the idea of making it 'easy' to install such 3rd party filters
with (or next to) the main HDF5 library so they can be loaded as needed.
But, that is a step beyond what I was trying to open a dialog about
here ;)

Mark

On Wed, 2011-10-12 at 22:51 -0700, Ger van Diepen wrote:
I think it would be nice if a 3rd party filter could be loaded
dynamically using dlopen.

The name of the filter needs to be reflected in the library name in
some standard way. The init function of the library can register the
filter in HDF5's filter registry. I think it only requires a small
addition to the way HDF5 finds a filter.
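
For instance, the plugin side could look roughly like this sketch; the
filter ID, the filter name and the "register_filter" entry point are
only placeholders for illustration, not an agreed convention:

#include <hdf5.h>

/* 256-511 is the range HDF5 reserves for testing filters; a real
 * filter would use an identifier assigned by the HDF Group. */
#define MY_FILTER_ID 256

/* the filter callback; a real plugin would call the compression
 * library here instead of passing the data through unchanged */
static size_t my_filter(unsigned int flags, size_t cd_nelmts,
                        const unsigned int cd_values[], size_t nbytes,
                        size_t *buf_size, void **buf)
{
    (void)cd_nelmts; (void)cd_values; (void)buf_size; (void)buf;
    if (flags & H5Z_FLAG_REVERSE) {
        /* decompress *buf here */
    } else {
        /* compress *buf here */
    }
    return nbytes;       /* number of valid bytes; 0 signals failure */
}

static const H5Z_class2_t my_filter_class = {
    H5Z_CLASS_T_VERS,    /* version of this struct definition */
    MY_FILTER_ID,        /* unique filter identifier */
    1, 1,                /* encoder present, decoder present */
    "example filter",    /* human-readable name */
    NULL, NULL,          /* optional can_apply / set_local callbacks */
    my_filter            /* the filter function itself */
};

/* the init function the loader resolves by a naming convention */
int register_filter(void)
{
    return H5Zregister(&my_filter_class) < 0 ? -1 : 0;
}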

In this way the filter repository can be a set of shared libraries
or DLLs; one does not need to change HDF5 to use a new filter. Also,
all HDF5 tools can then work with all filters.
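
And the loader on the HDF5 side then only needs something small like
this, mapping the filter name to a library name and calling the init
function (the "libhdf5filter_<name>" naming convention is of course
just an example):

#include <dlfcn.h>
#include <stdio.h>

typedef int (*register_fn)(void);

/* look up a filter by name, dlopen the matching shared library and
 * let its init function register the filter with HDF5 */
int load_hdf5_filter(const char *name)
{
    char libname[256];
    snprintf(libname, sizeof libname, "libhdf5filter_%s.so", name);

    void *handle = dlopen(libname, RTLD_NOW | RTLD_LOCAL);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return -1;
    }

    register_fn reg = (register_fn)dlsym(handle, "register_filter");
    if (!reg) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return -1;
    }
    return reg();    /* the plugin calls H5Zregister() internally */
}

So when HDF5 encounters an unknown filter ID, it could try such a
lookup once before giving up.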


This is the way Python loads extension modules and how query languages
execute user-defined functions.

I've used it myself successfully several times.


Cheers,

Ger

>>> Mark Miller <[email protected]> 10/13/2011 4:32 AM >>>
Sorry my last email was a bit off topic. I guess I latched on to the
"...we just need to know (in)formal requirements to integrate a new
filter" part.

So, related to this: I thought I recalled a move afoot a couple of
years back to formalize the submission and repository of 3rd party
HDF5 'filters' of various sorts, compression in particular. What
happened with that? I don't recall seeing much response to it on the
forum, but this sounds like a good test case ;)

Mark


On Tue, 2011-10-11 at 08:18 -0700, Nathanael Huebbe wrote:
> Hello all,
>
> here at the DKRZ (German Climate Computing Center) we have to store
> large volumes of climate data, some of which is stored in HDF5
> files. So, during the last few months we have been doing some
> research into climate data compression.
>
> With very interesting results: we have been able to shrink our test
> data set to 38.76% of the original file size. This is a compression
> factor of more than 2.5, and it is significantly better than the
> performance of all the standard methods we tested (bzip2 = 54%,
> gzip = 58%, sldc = 66% and lzma = 46%). We have also seen that the
> lzma algorithm performs much better than the other standard
> algorithms.
>
> Even though we have constructed our methods to fit climate data, the
> features we exploited for compression are very general and likely to
> apply to other scientific data as well. This is why we are confident
> that many of you could profit from these methods too, and we would
> be happy to share our results with the rest of the community.
>
> The filtering mechanism in HDF5 makes it the natural first place for
> us to share our algorithms. But first we would be very interested to
> see the lzma algorithm integrated as an optional filtering method,
> something that should be very easy to do and would offer large
> benefits to all users.
>
> We would be willing to do the necessary work; we just need to know
> the (in)formal requirements to integrate a new filter. And, of
> course, we would be very interested to hear about other recent work
> that addresses compression in HDF5, and to get in touch with whoever
> works on it.
>
> Best regards,
> Nathanael Hübbe
>
> http://wr.informatik.uni-hamburg.de/
> http://wr.informatik.uni-hamburg.de/people/nathanael_huebbe
>
>
--
Mark C. Miller, Lawrence Livermore National Laboratory
================!!LLNL BUSINESS ONLY!!================
[email protected]      urgent: [email protected]
T:8-6 (925)-423-5901    M/W/Th:7-12,2-7 (530)-753-8511


--
___________________________________________________________________________
Dr. Werner Benger                Visualization Research
Laboratory for Creative Arts and Technology (LCAT)
Center for Computation & Technology at Louisiana State University (CCT/LSU)
211 Johnston Hall, Baton Rouge, Louisiana 70803
Tel.: +1 225 578-4809                        Fax.: +1 225 578-5362
