Friedhelm,

   I’m not familiar with the specifics of your web app, but another possibility 
is to just have the app call h5serv directly.

   Anika @ NASA Goddard wrote a nice blog article on this approach: 
https://www.hdfgroup.org/2017/04/the-gfed-analysis-tool-an-hdf-server-implementation/.

John

From: Hdf-forum <hdf-forum-boun...@lists.hdfgroup.org> on behalf of Thomas 
Caswell <tcasw...@gmail.com>
Reply-To: HDF Users Discussion List <hdf-forum@lists.hdfgroup.org>
Date: Saturday, August 5, 2017 at 9:48 AM
To: HDF Users Discussion List <hdf-forum@lists.hdfgroup.org>, 
"friedhelm.mat...@iscad-it.de" <friedhelm.mat...@iscad-it.de>
Subject: Re: [Hdf-forum] hdf5 parallel h5py

I would also look at h5serv (https://github.com/HDFGroup/h5serv), which puts a 
server in front of your HDF5 file.  A single process owns the file and acts as 
the serialization point, which side-steps almost all of the multiple-client 
issues.  h5pyd (https://github.com/HDFGroup/h5pyd) is a client for h5serv with 
a high-level API identical to h5py's.
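As a minimal sketch of that client side (the endpoint, domain, and dataset 
names below are placeholders for whatever your h5serv instance serves; the 
import is kept inside the function so the sketch loads even without h5pyd 
installed):

```python
def read_remote_dataset(domain, endpoint="http://127.0.0.1:5000"):
    """Read a dataset through h5serv using h5pyd.

    `domain`, `endpoint`, and the dataset name "mydata" are all
    hypothetical placeholders for your own h5serv setup.
    """
    import h5pyd  # pip install h5pyd

    # h5pyd.File mirrors h5py.File; the "filename" is an h5serv domain.
    with h5pyd.File(domain, "r", endpoint=endpoint) as f:
        return f["mydata"][...]
```

Because the API mirrors h5py, the rest of the application code stays the same 
whether it reads a local file or goes through the server.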

Tom

On Fri, Aug 4, 2017 at 12:01 PM Nelson, Jarom <nelso...@llnl.gov> wrote:
It doesn’t sound like parallel HDF5 is what you want here. Parallel HDF5 is 
for applications where all processes write in a tightly coordinated manner. 
All processes must write the same metadata to the file in “collective” calls 
to the library, i.e. each process makes the same calls with the same arguments 
in the same order whenever it modifies file metadata (creating files, 
datasets, or groups, writing attributes, etc.; see 
https://support.hdfgroup.org/HDF5/doc/RM/CollectiveCalls.html).
It sounds like you have separate applications that execute somewhat 
independently. That will not work with parallel HDF5.
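For contrast, here is a rough sketch of what that coordinated, collective 
pattern looks like with h5py's MPI driver (file and dataset names are 
hypothetical; this requires h5py built against parallel HDF5, and the imports 
are lazy so the sketch loads without mpi4py installed):

```python
def write_collectively(path="parallel.h5"):
    """Every rank makes the same metadata calls in the same order.

    `path` and the dataset name "data" are illustrative placeholders.
    Requires an MPI-enabled h5py build; run under mpiexec.
    """
    from mpi4py import MPI
    import h5py

    comm = MPI.COMM_WORLD
    # Collective: all ranks open the same file together.
    with h5py.File(path, "w", driver="mpio", comm=comm) as f:
        # Collective: all ranks create the same dataset with the same
        # shape and dtype, in the same order.
        dset = f.create_dataset("data", (comm.size,), dtype="f8")
        # Independent: each rank then writes only its own slice.
        dset[comm.rank] = comm.rank
```

Note how the metadata operations (open, create_dataset) are identical on every 
rank; only the element writes differ per process. Loosely coupled applications 
cannot satisfy that requirement.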

Using the serial library, I can think of at least one approach that might work 
well for you. HDF5 1.10 introduced a single-writer/multiple-reader (SWMR) mode 
for opening a file. Using one SWMR file per process, each process would open 
its own file as the writer in SWMR mode, and open the files from all the other 
processes as read-only in SWMR mode.

http://docs.h5py.org/en/latest/swmr.html
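A rough sketch of the two sides of that pattern (file and dataset names are 
placeholders; needs h5py against HDF5 1.10, and the imports are lazy so the 
sketch loads without h5py installed):

```python
def open_as_writer(path):
    """Create a file and enter SWMR mode as its single writer."""
    import h5py  # lazy import; sketch loads without h5py installed

    f = h5py.File(path, "w", libver="latest")  # SWMR needs libver="latest"
    # Unlimited first axis so the writer can append while readers poll.
    f.create_dataset("data", shape=(0,), maxshape=(None,), dtype="f8")
    f.swmr_mode = True  # readers may attach from this point on
    return f


def open_as_reader(path):
    """Attach to another process's file, read-only, in SWMR mode."""
    import h5py

    return h5py.File(path, "r", libver="latest", swmr=True)


def append_value(f, value):
    """Writer-side append: resize, write, then flush for readers."""
    d = f["data"]
    d.resize((d.shape[0] + 1,))
    d[-1] = value
    d.flush()  # make the new element visible to SWMR readers
```

Each of your processes would call open_as_writer on its own file and 
open_as_reader on everyone else's; readers call refresh() on a dataset to see 
newly flushed data.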

Jarom

From: Hdf-forum [mailto:hdf-forum-boun...@lists.hdfgroup.org] On Behalf Of 
ISCaD GmbH
Sent: Thursday, August 3, 2017 1:19 AM
To: hdf-forum@lists.hdfgroup.org
Subject: [Hdf-forum] hdf5 parallel h5py

Dear all,

I work on a web application that should store and retrieve data from an HDF5 
file.

Because several people work with this file, and there are long-running 
processes, I would like to use mpi4py, h5py, and HDF5.

I work on Debian Linux Stretch, 64-bit.

What is the way to use h5py in parallel?

Thanks and regards

Friedhelm Matten

_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
