Re: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing

2013-04-15 Thread Dana Robinson
The legacy Windows build scripts have been deprecated and removed.  We now
only support using CMake to construct Visual Studio solutions.
 Instructions for building with CMake can be found in the release_docs
directory.
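
For anyone new to the CMake process, an out-of-source build on Windows typically looks something like the sketch below (the generator name and options are illustrative; the authoritative steps are in the release_docs files):

```shell
# From an unpacked HDF5 source tree (generator/options are examples only):
mkdir build
cd build

# Generate a Visual Studio solution; pick the generator matching your VS version.
cmake -G "Visual Studio 9 2008" -DBUILD_SHARED_LIBS:BOOL=ON ..

# Build the generated solution from the command line.
cmake --build . --config Release
```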

Dana


On Mon, Apr 15, 2013 at 4:59 PM, Bernd Rinn br...@ethz.ch wrote:

 Hello Albert,

 The legacy Windows build scripts (subdirectory windows\) seem to be
 missing in hdf5-1.8.11-pre1.tar.gz. When I copy this directory over from
 hdf5-1.8.10-patch1.tar.bz2 and compile with VisualStudio 2008 on Windows
 XP (32bit), then I get the following fatal errors:


 -
 Error   113 error C2054: expected '(' to follow 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76  hdf5

 Error   114 error C2085: 'H5PLget_plugin_type' : not in formal
 parameter list  c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76
  hdf5

 Error   115 error C2061: syntax error : identifier 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 77  hdf5

 Error   123 error C2054: expected '(' to follow 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76  hdf5

 Error   124 error C2085: 'H5PLget_plugin_type' : not in formal
 parameter list  c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76
  hdf5

 Error   125 error C2061: syntax error : identifier 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 77  hdf5

 -

 Any idea what I need to do in order to fix those errors?

 Best regards,

 Bernd

 On 2013-04-12 22:59, Albert Cheng wrote:
  Hello everyone,
 
  A pre-release candidate version of HDF5 1.8.11 is available for testing
  and can be downloaded at the following link:
 
 
 http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz
 
 
  If you have some time to test this pre-release, we would greatly
  appreciate it. We try to test on a wide variety of platforms and
  environments but are unable to test everywhere so feedback from the user
  community is always welcome.
 
  Please note that while the release notes contained in the pre-release
  are reflective of the changes and additions present in this release, the
  'platforms tested' and 'tested configurations' sections have yet to be
  updated for this version of HDF5.
 
  We plan to release HDF5 1.8.11 in mid-May barring the discovery of any
  critical issues.
 
  Thank you!
 
  The HDF Group
 
 
  ___
  Hdf-forum is for HDF software users discussion.
  Hdf-forum@hdfgroup.org
  http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org

 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing

2013-04-15 Thread Dana Robinson
Hi Bernd,


On Mon, Apr 15, 2013 at 5:21 PM, Bernd Rinn br...@ethz.ch wrote:

 Thanks for the info. I'd suggest that you adapt
 release_docs/INSTALL_Windows.txt accordingly. It still reads:

 
 The old solutions and projects found in the windows\ folder will be
 maintained for legacy users until HDF5 1.10.
 


Thanks.  I'll pass that along.


 BTW: Are the compile errors I get due to using the legacy build scripts?


Yes.  The legacy build scripts don't know about any new files, and it
looks like they are failing on the new dynamic filter plugin functionality.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] Query regarding HDF5

2013-03-21 Thread Dana Robinson


 C is binary compatible in the sense that there is no name mangling, but
 different compilers may need different C runtimes; especially if there is
 memory allocation/deallocation across the libraries it will be a disaster.


This is especially true if you are on Windows.  Each version of Visual
Studio will link against a different runtime.  Even debug and release are
different runtimes.

The most important thing to ensure is that memory allocation and freeing
are done in the same runtime; the same goes for file opens and closes.
Otherwise you will get bugs that are very difficult to track down.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDFds - proposed conventions for using HDF5 for data sharing

2013-01-30 Thread Dana Robinson
Hi Jeff,

I had a look at your document and I have some comments:

(page 2) It looks like you intend for your schema to be a human-readable
description of how the data are organized and not a formal specification
that can be used to validate an HDF5 file via a check tool of some sort (as
in an XML schema).  Although I'm sure this is informally useful to you,
this lack of a machine-verifiable formal specification would be a major
weakness of HDF5ds.

(page 3) HDF5 not specifying how user metadata should be structured is not
really a limitation of HDF5.  Different users will have differing ideas
about what metadata is important so we don't lock people into a particular
arrangement.

(page 3) You can store mixed types in an attribute using a compound type.

(page 3) Encoding your metadata as a JSON object is similar to storing
parseable strings in database tables - you aren't leveraging the strength
of the platform.  In the grand scheme of things, it probably isn't a big
deal to store your metadata as JSON strings (especially if they are small
and infrequently accessed) and maybe that fits well into your code, but the
more HDF5-centered way to store that metadata would be as several
independent attributes.

(page 4) HDF5 specifies references to nodes as absolute paths.  You can use
region references to refer to subsets of a dataset.  HDF5 also supports
external links to other files.

(page 4)  The term "settings" is probably too experimentally oriented for
general HDF5 use.  What does "settings" mean in a file that stores
phylogenetic tree data or patient history data?

(page 5)  The idea of associating attributes to collections of objects in
the HDF5 file is an interesting one, though I'm not sure how to cleanly
handle that off the top of my head.  Definitely something to keep in mind.
 I would want to handle that inside the library, though, and not via easily
broken parseable string attributes.

Unfortunately, I can't really weigh in on what sort of similar work has
been done in this area (others at THG will have to do that) but there's
clearly a need for some sort of formal, verifiable HDF5 schema.

Cheers,

Dana

On Tue, Jan 22, 2013 at 1:42 PM, Jeff Teeters jteet...@berkeley.edu wrote:

 I'm part of a group which is working to develop standards for sharing
 neuroscience electrophysiology data using HDF5.  As part of this effort we
 developed proposed conventions for using HDF5 for data sharing which are
 independent of any domain.  A main goal of these conventions is to provide
 a standard way of specifying schemata that describe data and metadata
 within an HDF5 file.  We named these proposed conventions, HDFds - (ds for
 data sharing).

 I'm concerned that some of what is proposed might overlap with previous
 work or that there might be better ways of achieving the desired
 functionality.  I would very much appreciate any feedback about these
 proposed conventions, either to the mailing list or to me directly.

 Thanks,
 Jeff


 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] h5fc on Linux

2013-01-05 Thread Dana Robinson
Hi Pradeep,

Did you configure with --enable-fortran?  Fortran is not built by default.
 You can see the configure options and their defaults with configure
--help.
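
For instance, a from-source build with the Fortran wrappers enabled might look like this (the install prefix is just a placeholder):

```shell
# Configure HDF5 with Fortran support; /opt/hdf5 is an example prefix.
./configure --enable-fortran --prefix=/opt/hdf5
make
make check    # optional, but a good idea
make install  # h5fc will then be under the install prefix's bin/ directory
```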

Cheers,

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] Fwd: Testing large dataset write *FAILED*

2012-12-12 Thread Dana Robinson
Hi Ahsan,

On Wed, Dec 12, 2012 at 12:32 AM, Syed Ahsan Ali Bokhari 
ahsan@gmail.com wrote:

 Finally I am able to build hdf-1.8.9 without error. But now netcdf
 installation gives an error looking for libhdf5_hl.so.7; I checked and this
 library is not present after hdf5 is built.
 How can I get this library too?


Since 1.8.9 works, the problem may have been due to some POSIX
incorrectness issues I fixed about that time.  In POSIX read and write
calls, the number of bytes to write is unsigned and the return value (# of
bytes read/written) is signed.  Writing a number of bytes greater than can
be returned via the signed value is undefined behavior.  Back around 1.8.9
I changed the library to split those reads and writes into two I/O
operations.

I'm not sure why the high-level library isn't being built for you since it
should be built by default.  What do you have in the target directory after
running make install?  When you run configure, does it say "High Level
Library: yes" under "Features"?

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] Fwd: Testing large dataset write *FAILED*

2012-12-11 Thread Dana Robinson
Hi Ahsan,


 I am trying to install hdf-1.8.6 on RHEL5.3, I configured with the
 following command

 ./configure --with-zlib=/usr/local/ --prefix=/usr/local/


Is your system 32 or 64-bit?  What file system are you using?  How much
free space do you have on the drive?  Do you have any disk quotas?  The
output of mount -l, df -H, and quota would be helpful.

Is the error repeatable?  You don't need to re-run make check - just run
test/big directly.



 make check install fails with the following error



 Test passed with the Family Driver.

 Checking if file system supports big files...
 Testing big file with the SEC2 Driver
 Testing large dataset write
 *FAILED*
 Command terminated by signal 11
 0.02user 0.03system 0:00.06elapsed 100%CPU (0avgtext+0avgdata
 0maxresident)k
 0inputs+0outputs (0major+7343minor)pagefaults 0swaps
 make[4]: *** [big.chkexe_] Error 1
 make[4]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[3]: *** [build-check-s] Error 2
 make[3]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[2]: *** [test] Error 2
 make[2]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[1]: *** [check-am] Error 2
 make[1]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make: *** [check-recursive] Error 1
 [root@pmd03 hdf5-1.8.6]#


If it passes on the family driver and fails on SEC2, then the first thing
I'd check is that you don't have quota or disk space issues.

Cheers,

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 string dataset compression

2012-12-11 Thread Dana Robinson
Hi Nathan,

On Tue, Dec 11, 2012 at 11:20 AM, Nathan Smith nathanjsm...@gmail.com wrote:

 Hi all,

 I'm very new to HDF5 and am trying to find an example of how to compress a
 string written to an HDF5 file.  I've been using the C++ API and writing
 scalar strings to my dataset, but I can't figure out how to use data
 chunking so that I can enable compression.


Variable-length data cannot be compressed due to the way we store it in the
HDF5 library.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] linker error under cygwin

2012-11-22 Thread Dana Robinson
Hi,

Also, is the HDF5 library in a location where it can be found at run time?
 The default directory for make install is a subdirectory named 'hdf5'
under the build directory, which is probably not searched at run time.

I'm not all that familiar with cygwin, but that's the first thing I'd look
into.

Dana

On Thu, Nov 22, 2012 at 12:05 PM, George N. White III gnw...@gmail.com wrote:

 Is there a reason you can't use the existing Cygwin HDF5 1.8.9 port?  You
 might want to contact the Cygwin maintainer whose name appears in the
 Cygwin library docs.


 On Thu, Nov 22, 2012 at 6:09 AM, Philipp Kraus philipp.kr...@flashpixx.de
  wrote:

 Hello,

 I have build the HDF without Fortran support under Cygwin (used
 configure, make, make install).
 If I compile my program and the linker command comes up, an error is
 occured:

 /usr/lib/gcc/i686-pc-cygwin/4.5.3/../../../../i686-pc-cygwin/bin/ld:
 library/build_release/hdf/1.8.10/lib/libhdf5_cpp.a(H5CommonFG.o):
 bad reloc address 0x4 in section `.rdata$_ZTVN2H58CommonFGE[vtable for
 H5::CommonFG]'

 I have tested HDF 1.8.9 and 1.8.10. How can I fix this problem?

 Thanks

 Phil



 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org




 --
 George N. White III aa...@chebucto.ns.ca
 Head of St. Margarets Bay, Nova Scotia

 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] memory leaks with a simple open/close ?!

2012-11-21 Thread Dana Robinson
Hi,

MSVCR80D.dll


Your executable is linked to MSVCR80D.dll.  I'd imagine that if you run
dumpbin /imports on the HDF5 library, you'll find that it's linked to
MSVCR71.dll and that this is the problem.  The solution is to
either build your application with the same version of Visual Studio that
the library used or to rebuild that version of HDF5 with Visual Studio 2005.

Actually, the best solution would be to move to a currently-supported
version of HDF5 and a recent version of Visual Studio. HDF5 1.6.4 and
VS2005 are really, really old and out of date :)


 But what I am more concerned about: How this stuff behaves in Linux? Is it
 a
 Microsoft only problem or can this stuff even happen on Linux?


This is only a Windows problem.  In Linux the C runtime is a part of the
core operating system so the problem of unshared CRT state does not exist.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] memory leaks with a simple open/close ?!

2012-11-20 Thread Dana Robinson
Hi,

On Tue, Nov 20, 2012 at 10:15 AM, saracaeus saraca...@gmx.de wrote:

 There are 2 HDF5 libraries: hdf5-1.6.4_d.lib (debug version) and
 hdf5-1.6.4.lib (release version). I use the debug version for my sample
 program which is compiled in debug mode. So this shouldn't be the problem.
 And it wouldn't explain, why the statically linked MFC library is working
 while the dll MFC version is not.

 But in the end the program will run under Linux, using Qt... The sample
 program is meant to make first steps in HDF5. So I don't think I will have
 the same probs in the final release :)


It's not just debug vs. release.  You also have to look at the underlying C
runtime, which should be the same for all libraries and executables.  You
can see what a particular library or program uses by executing "dumpbin
/imports <program>" from a Visual Studio command prompt.  Each one should
be linked to the same MSVCR#.dll.

The problem is that Windows implements the C library calls in separate
dlls, which do not share state.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF Newsletter #129

2012-11-16 Thread Dana Robinson
Hi all,

I just wanted to mention that dropping VS2008 support is a very
 serious issue for anyone working with Python.  All major Python
 versions currently available (2.6, 2.7 and 3.2) are built with VS2008.
  Python extension modules (and any of their dependencies which share C
 standard library bits like malloc and free) have to be built with the
 same compiler in order to avoid mixing C runtimes.  Right now h5py
 (and I would expect, PyTables) is built against a version of HDF5
 compiled with VS2008 for this reason.  In particular, if you allocate
 any memory on the Python side which is freed by HDF5 (or vice versa)
 you have to use the same compiler.

 Ah, I hadn't thought of that.  I'll mention that to the appropriate people.

Does anyone know when the Python folks plan to upgrade to VS2010?  VS2008
is four years old and is now the n-2 version.  That's getting a little long
in the tooth.


 In practical terms, what this means for us is that the version of HDF5
 shipping with h5py for Python 2.6, 2.7 and 3.2 will be frozen at
 1.8.10.  In particular, HDF5 1.10 will also be unsupported on all
 currently available Python platforms, unless we can figure out a way
 around the C runtime issue.

 There is no way around the C runtime issue that I know of.

Do people who use h5py, etc. use our binaries (which is what I assume) or do they
build HDF5 via CMake?



 Would the HDF Group accept bug reports/patches from the community to
 fix issues with VS2008 as they come up?


I would assume so, but I don't think we've had a lot of problems with
VS-specific issues in HDF5.  Usually Windows bugs apply to all versions of
Visual Studio and Windows.

Cheers,

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 and SWMR

2012-11-15 Thread Dana Robinson
Hi Shashi,

On Thu, Nov 15, 2012 at 2:30 PM, Shashi Kamath coolsha...@gmail.com wrote:

 Is there a version of HDF that is released that support SWMR( Single write
 multiple read) from different processes ?

 I read some Forum questions and it was told it will be released in Nov
 2011.


SWMR is still under development.  It should appear in HDF5 1.10.0.  There
is no official release date for that yet.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF Newsletter #129

2012-11-13 Thread Dana Robinson
Hi Mike,

On Tue, Nov 13, 2012 at 9:03 PM, Michael Jackson 
mike.jack...@bluequartz.net wrote:

 Hello,
   First congratulations on the new release. I would like some
 clarifications on the Supported platform changes. Specifically the
 VS2008 and Windows XP. I can read that in 2 different ways:

 1: The *combination* of VS2008 running on Windows XP is deprecated
 2: VS2008 on *any* version of Windows AND Windows XP


Unless I am misinformed,

- VS2008 will not be supported.
- Windows XP will not be supported.

From what I know of the code, I'd imagine this shouldn't be an issue for
most people.  It mainly affects what binaries we produce and our test
systems.  CMake will still produce VS2008 projects and we don't have very
much Windows-specific code.  I can't think of anything off the top of my
head that won't work on any NT-based Windows system.

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] free memory for variable length types

2012-11-08 Thread Dana Robinson
Hi all,

On Thu, Nov 8, 2012 at 11:17 AM, sahon ssa...@gmail.com wrote:

 I've finally found the workaround. I've used debug configuration before,
 with
 release everything works fine, as I understood allocation/deallocation
 memory MUST have same configurations (debug/release).


All software must use the same version of the C runtime (CRT) in order to
avoid errors when CRT objects such as allocated memory are passed between
libraries and/or executables.  Since Microsoft implements each CRT in a
separate library (dll), you can't allocate memory in one CRT and free it in
another without risking heap errors.  You should also never statically link
to the CRT since each library/exe will have its own private version of the
CRT.

You can check the linked versions with dumpbin /imports via a Visual Studio
command prompt, though I don't think that will show static linkage.

Same version means 32/64-bit, debug/release, and Visual Studio version.
 We currently don't distribute debug binaries though there are plans to do
so in the near future.
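
From a Visual Studio command prompt, the check looks something like this (the module names are illustrative):

```shell
dumpbin /imports myapp.exe | findstr /i msvcr
dumpbin /imports hdf5.dll  | findstr /i msvcr
```

Every module in the process should report the same MSVCR#.dll.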

I'm working on a document describing this in more detail.  It should appear
on the web sometime soon, hopefully this month.

Cheers,

Dana
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] Problem writing 3GB to a data set.

2012-10-29 Thread Dana Robinson
Hi Mike,

 This is with HDF version 1.8.9 compiled for 64 bit on OS X 10.6.8. Is there
 something special I need to be doing in order to write more than 2^32 bytes
 of data? Or maybe (most likely) there is a bug in how I am calling the HDF5
 functions to write this data?

The VFDs break down reads and writes that are too large into several
smaller operations, where "too large" means larger than the underlying
function calls can handle.  In particular, the return type of read/write
is signed, unlike the number-of-bytes parameter, which is not.  For all
systems other than Windows, this limit is SSIZE_MAX.  So the driver should
handle very large writes just fine, even on 32-bit filesystems.

I'll take a look at this tomorrow.

Dana

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] 1.8.8+ Unicode and Windows OS

2012-10-12 Thread Dana Robinson
Hi Felix,

 Before 1.8.8  it was quite easy to extend H5FDwindows.c (windows vfd) to
 cope with Unicode by using the wopen/w_char API for IO access on Windows
 OS’.

 With 1.8.8. the windows driver was reduced to be a wrapper for the H5FD_SEC2
 driver.

 My question is: what is the best way to get Unicode Windows support back
 for file access? H5FDstdio is not recommended for production use but could
 be extended similar to H5FDwindows.c in the early versions. Re-animating
 the full H5FDwindows driver seems to go against the direction of VFD
 development, right?

I would say you have two options, assuming you are ok with modifying
the HDF5 code and rebuilding the library (as it seems from your
email):

1) Just modify the SEC2 driver the way you used to modify the Windows
driver.  They were basically the same, which was the reason that we
tossed the Windows VFD.

2) Copy the SEC2 VFD code, rename it H5FDWindows.c, adjust the
internal names to use 'Windows' instead of 'sec2', make your fix, and
recompile.

A true Windows driver could be written pretty easily, and is at the
back of my queue, but is currently unfunded.  Also, the stdio code is,
in fact, intended to be a demo driver for writing your own VFD (it
only uses the public API, not our internal API).  It isn't
particularly well tested and there are a few open bugs related to
external links that have yet to be fixed.

Cheers,

Dana

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] Fwd: Unable to use 1.8 API macro mappings

2012-01-12 Thread Dana Robinson
Hi Gordon,

 Thank you for your reply. I was hoping to avoid the function mapping
 macro definitions. It was good of you to gather them together.

 My understanding is that most 1.8 libraries do not require any macro
 definitions and the 1.8 backing functions are used by default. I had
 overlooked the library mapping flag --with-default-api-version=v16
 which is what must have been used in this case.

Did you install HDF5 from the Ubuntu package manager?  That version of
HDF5 is maintained by a third party and is wildly out of date.  It is
much better to build HDF5 from source.

Cheers,

Dana Robinson

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] hdf5 and ImageMagick conflicts on Windows 32-bit

2011-12-02 Thread Dana Robinson
Hi all,

On Fri, Dec 2, 2011 at 8:25 AM, Michael Jackson
mike.jack...@bluequartz.net wrote:
 (VERY Red faced).
  That is on OS X that there is a difference:

 32 Bit:
 sizeof(int): 4
 sizeof(long): 4
 sizeof(long long): 8

 64 Bit
 sizeof(int): 4
 sizeof(long): 8
 sizeof(long long): 8

 Again, sorry for the confusion. Trying to keep it all straight is confusing. 
 Over the years I have just stayed away from long where absolutely possible.

Windows is LLP64, most other platforms are LP64.

http://en.wikipedia.org/wiki/64-bit  (scroll down to "64-bit data models")
http://www.unix.org/version2/whatsnew/lp64_wp.html

 (i.e. just inverting the order from INT -> LONG -> LONG LONG to LONG LONG ->
 LONG -> INT.) In this way ssize_t is of type long and there is no conflict
 with ImageMagick. And I think there is no real change in HDF5 too, since
 sizeof(int)==sizeof(long).

The problem is that ssize_t appears in our public API, yet is not
standard C (it's POSIX).  Typically, in a project that started on
Linux that is being ported to Windows, the Linux project will
implement a hard-coded work-around, as we have in HDF5.  The problem
comes when these work-arounds collide, as they have here.  Configure
and build systems like CMake wouldn't even help here since there are
essentially two choices, neither of which is standard and either of
which will make someone's upstream software unhappy.

Unfortunately, although changing the order of the #defines solves the
problem for ImageMagick, it will cause the same problem for anyone who
expected the former typedef.  Since so much software exists that
builds on HDF5 and uses the public headers it is very likely that we
would break something, possibly in an important customer, so we've
decided to leave the behavior as it is.  As a work-around, you'll
probably have to hack up the header file for one of the packages.

The real solution is to remove non-ANSI (or non-HDF group specific)
types from the public API to avoid this type of problem in the future.
 Since this would amount to an API-breaking change, it will have to
wait for a major version change (HDF5 1.10.0) to be considered.  I
will enter an issue in our bug-tracking system so that it'll be on our
radar in the future.

Thank you for bringing this to our attention!

Cheers,

Dana

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 check fail

2011-11-27 Thread Dana Robinson
Hi all,

Thank you for the detailed bug reports!

This has been entered as a high-priority bug (HDFFV-7829) in JIRA.
We'll be addressing it immediately and will post more information when
we know more and/or have a solution.

The current work-around is to either build with gcc <= 4.5 (or a
different compiler) or use pre-compiled binaries.

Sorry for the inconvenience,

Dana Robinson
The HDF Group

On Sat, Nov 26, 2011 at 4:34 PM, George N. White III gnw...@gmail.com wrote:
 On Fri, Nov 25, 2011 at 10:41 AM, marcialieec marciali...@gmail.com wrote:
 Hi all,

 I've been checking both the 1.8.7 and 1.8.8 HDF5 libraries and when I do the
 make check I get the attached error.

 - Any ideas on what might be causing the error and how to fix it?
 - Should we worry about it, or it's just the check that fails?

 I just built 1.8.7 on Ubuntu oneiric (as part of NASA SeaDAS 6.3).  I
 also get failures for dt_arith, but no signal 11:

 $ grep FAIL *.log
 Testing hard signed char -> long double conversions                  *FAILED*
 Testing hard unsigned char -> long double conversions                *FAILED*
 Testing hard short -> long double conversions                        *FAILED*
 Testing hard unsigned short -> long double conversions               *FAILED*
 Testing hard int -> long double conversions                          *FAILED*
 Testing hard unsigned int -> long double conversions                 *FAILED*
 Testing hard long -> long double conversions                         *FAILED*
 Testing hard unsigned long -> long double conversions                *FAILED*

 There are many warnings from the compile of dt_arith

 Linux dormarth 3.0.0-13-generic #22-Ubuntu SMP Wed Nov 2 13:27:26 UTC
 2011 x86_64 x86_64 x86_64 GNU/Linux

 $ gcc --version
 gcc-4.6 (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1

 Using gcc-4.5 there are no failures for any of the tests.


 Thanks,
 Marcial

 
 Testing  dt_arith
 
  dt_arith  Test Log
 
 [passed all previous tests]
 .
 Testing hard normalized float -> long conversions
 PASSED
 Testing hard normalized double -> long conversions
 PASSED
 Testing hard normalized float -> unsigned long conversions
 PASSED
 Testing hard normalized double -> unsigned long conversions
 PASSED
 Testing hard normalized long double -> signed char conversions
 Command terminated by signal 11
 0.15user 0.02system 0:00.24elapsed 71%CPU (0avgtext+0avgdata
 28512maxresident)k
 16inputs+280outputs (0major+16993minor)pagefaults 0swaps
 make[4]: *** [dt_arith.chkexe_] Error 1

 The machine details:

 Linux Obiwan 3.0.0-13-generic #22-Ubuntu SMP Wed Nov 2 13:27:26 UTC 2011
 x86_64 x86_64 x86_64 GNU/Linux

 Using built-in specs.
 COLLECT_GCC=gcc
 COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/4.6.1/lto-wrapper
 Target: x86_64-linux-gnu
 Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro
 4.6.1-9ubuntu3' --with-bugurl=file:///usr/share/doc/gcc-4.6/README.Bugs
 --enable-languages=c,c++,fortran,objc,obj-c++,go --prefix=/usr
 --program-suffix=-4.6 --enable-shared --enable-linker-build-id
 --with-system-zlib --libexecdir=/usr/lib --without-included-gettext
 --enable-threads=posix --with-gxx-include-dir=/usr/include/c++/4.6
 --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu
 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-plugin
 --enable-objc-gc --disable-werror --with-arch-32=i686 --with-tune=generic
 --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu
 --target=x86_64-linux-gnu
 Thread model: posix
 gcc version 4.6.1 (Ubuntu/Linaro 4.6.1-9ubuntu3)

 Using gcc-4.5 there are no failures for any of the tests.

 gcc-4.5 (Ubuntu/Linaro 4.5.3-9ubuntu1) 4.5.4



 --
 George N. White III aa...@chebucto.ns.ca
 Head of St. Margarets Bay, Nova Scotia

 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 1.8.8 release candidate is available for testing

2011-11-02 Thread Dana Robinson
 This is exactly the kind of situation where it is handy to have daily build 
 results reported to a dashboard.  It would be nice if the HDF5 dashboard had 
 more submissions.  The 'big' test has been failing on one of my machines for 
 days:
 http://cdash.hdfgroup.uiuc.edu/index.php?project=HDF518

Hi Sean,

I can't speak to the dashboard issue, but the stdio VFD test that is
failing is not that surprising.  The stdio VFD is intended to be a
demo driver (showing how to write a VFD using the VFD API) and is not as
well tested as the production VFDs.  Making the stdio VFD more robust,
especially for large files, is on my to-do list.

Cheers,

Dana

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5 1.8.8 release candidate is available for testing

2011-10-31 Thread Dana Robinson
Hi Bernd,

Thanks for finding that!

The reason I added -lbsd-compat is that it should be linked when
-D_BSD_SOURCE is defined, which has been true in the library for a
long time.  See the GNU libc recommendations here:

http://www.gnu.org/s/hello/manual/libc/Feature-Test-Macros.html

I do not know if we use any features from 4.3 BSD that conflict with
the POSIX 1 standard (the purpose of the symbol is to resolve the
conflicts in favor of BSD), though I doubt that we do.  If one goes,
the other should go as well.

If we are going to yank -D_BSD_SOURCE and -lbsd-compat for the
release, I can test this on the BSD VMs tonight.  Those would be the
most likely systems to have issues.

Cheers,

Dana

On Mon, Oct 31, 2011 at 4:58 AM, Bernd Rinn bernd.r...@bsse.ethz.ch wrote:
 Hi Mike,

 I'd like to report a regression of 1.8.8-pre1 (compared to 1.8.7) on
 Linux i386 with GCC 4.6.2. I get the following test failure (excerpt):

 
 
 Testing  big
 
  big  Test Log
 
 Testing big file with the Family Driver
 Checking if file system is adequate for this test...
 Testing Huge dataset write
 PASSED
     HDF5-DIAG: Error detected in HDF5 (1.8.8-snap1) thread 0:
  #000: H5Dio.c line 153 in H5Dread(): selection+offset not within extent
    major: Dataspace
    minor: Out of range
 *FAILED*
         at big.c:523 in reader()...
 HDF5-DIAG: Error detected in HDF5 (1.8.8-snap1) thread 0:
  #000: H5Dio.c line 153 in H5Dread(): selection+offset not within extent
    major: Dataspace
    minor: Out of range
 *** TEST FAILED ***
 

 I tracked down the test failure: it happens due to the addition of
 -lbsd-compat in r20898 and can be fixed by applying the attached patch
 (which reverts one line from r20898).

 I need to note that I'm using a special build setup:

 The build machine is CentOS 4.6 and I have bootstrapped GCC 4.6.2
 myself. The regression does not occur when using a bootstrapped GCC
 4.2.4 instead of GCC 4.6.2. Unfortunately, I have no distribution handy
 that ships with GCC 4.6 as the system compiler (as openSUSE 12.1 soon
 will). It is entirely possible that the issue I am reporting is specific
 to the RHEL4/GCC 4.6 combination.

 Best regards,

 Bernd

 On 11-10-27 17:46, Mike McGreevy wrote:
 Hi all,

 A pre-release candidate version of HDF5 1.8.8 is available for testing
 and can be downloaded at the following link:

 http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.8-pre1/

 If you have some time to test this, we would greatly appreciate it.
 While we test on a wide variety of platforms, there are many systems and
 setups that we are unable to test on, and feedback from the user
 community is quite welcome.

 Of particular interest in this maintenance release is the introduction
 of several additions to the Fortran interface:

   -- Fortran wrappers for the Dimension Scales APIs were added. See
 http://www.hdfgroup.org/HDF5/doc_test/H51.8docs/HL/RM_H5DS.html for
      the subroutine signatures.

   -- The Fortran interface was enhanced to support the Fortran 2003
 standard. The release provides a wider set of Fortran and
      HDF5 datatypes, such as
            - any kind of INTEGER or REAL
            - Fortran derived types
            - Fortran and HDF5 enumeration
            - HDF5 variable-length datatypes
            - HDF5 compound datatypes of any complexity

      It also contains new subroutines corresponding to the C functions
 that take callback functions as parameters. For a general overview and
 information on how to enable these new features, please see the paper
 "New Features in the HDF5 Fortran Library: Adding Support for the
 Fortran 2003 Standard" posted at
 http://www.hdfgroup.org/HDF5/doc_test/H51.8docs/fortran/index.html. You
 may download HDF5 Fortran 2003 examples (names have F03 suffix) from

 http://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/api18-fortran.html


 We would appreciate feedback from the community on these additions.

 Please note that while the release notes contained in the pre-release
 reflect the new features and bug fixes present in this release, the
 "platforms tested" and "tested configurations" sections still need to
 be updated with our current testing configurations and do not reflect
 this pre-release candidate at this time.

 We plan to release HDF5 1.8.8 mid-November barring the discovery of any
 critical issues.

 Thank you!

 The HDF Group


Re: [Hdf-forum] HDF5 linking error

2011-10-18 Thread Dana Robinson
Hi Ichikawa,

I am out of the office until tomorrow, but I will look at this tomorrow morning.

_timezone should be declared when time.h is #included, so I'll check
into that.

http://msdn.microsoft.com/en-us/library/htb3tdkc(v=VS.100).aspx

Cheers,

Dana

On Sat, Oct 15, 2011 at 9:02 PM,  ichikawa.dor...@jamss.co.jp wrote:
 Hello

 When I link the C source code h5ex_t_arrayatt.c, downloaded from the HDF5
 home page, against hdf5.lib, I get the following error messages.

 1Linking...
 1hdf5.lib(H5Omtime.obj) : error LNK2019: unresolved external symbol 
 _timezone referenced in function _H5O_mtime_decode
 1oldnames.lib(timezone.obj) : error LNK2001: unresolved external symbol 
 _timezone
 1oldnames.lib(timezone.obj) : error LNK2001: unresolved external symbol 
 __timezone
 1C:\Users\Anders\Documents\src32\vizir\Debug\vizir.exe : fatal error 
 LNK1120: 2 unresolved externals


 I use Visual Studio 2008 and pre-compiled hdf5.lib downloaded from HDF5 page.

 Could you please advise how to correct it?

 Regards,
 Ichikawa



Re: [Hdf-forum] Compression filters

2011-10-12 Thread Dana Robinson
Hi Nathanael,

One thing to keep in mind is that HDF5 is BSD licensed so you might
want to make your compression code BSD licensed as well, if that is
possible at your organization.

Cheers,

Dana



Re: [Hdf-forum] Vlen compression

2011-10-05 Thread Dana Robinson
Hi Marcial,

On Wed, Oct 5, 2011 at 10:22 AM, marcialieec marciali...@gmail.com wrote:
 Hi!

 I'm using HDF5 to create an intermediate storage stage for an ESA project.
 Currently our HDF5 file structure is the following:

 /Dset1 (with a compound datatype)
 /Dset2 (vlen datatype)
 /Dset3 (vlen datatype)

 Most of the data is in datasets 2 and 3.

 We are interested in using the HDF5 chunking and compression features.
 However, although we have activated GZIP compression, the file barely
 gets compressed at all. I've been searching and found some posts
 saying that compression on a vlen dataset only acts on the references,
 not the actual data.

 My questions are:

 Is this still the actual situation? Is there a plan to have this
 solved/modified in the near future?
 What are our other options if vlen datasets cannot be compressed? Assuming
 our data has a variable length nature, of course.

Yes, this is still the case.  I'm pretty sure this is not going to be
fixed anytime soon since it involves major library changes, though
it's something we'd like to do.

You have two other options:

1) If your data is only a little variable, you can create a regular
chunked and compressed dataset that is guaranteed to fit any
reasonable data, and store your data with some trailing empty space.
The compression will usually efficiently handle the trailing space,
even if it's quite large.  The downside is that guessing a good size
can be difficult and you'll have to come up with a scheme for handling
data that exceed the pre-guessed fixed size.  If you have a situation
where you can pre-scan the input, you can always compute the fixed
sizes to use.
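A rough sketch of this padding scheme in plain Python (independent of the
HDF5 API; FIXED_SIZE and the helper names are illustrative assumptions, not
library calls):

```python
# Sketch of option 1: pad variable-length records to a pre-guessed
# fixed size so they fit a regular chunked, compressed dataset.
# Runs of trailing fill values compress well under gzip.
FIXED_SIZE = 8  # pre-guessed maximum record length (assumption)

def pad_record(record, fixed_size=FIXED_SIZE, fill=0):
    """Pad a record with trailing fill values; reject oversized data."""
    if len(record) > fixed_size:
        raise ValueError("record exceeds the pre-guessed fixed size")
    return list(record) + [fill] * (fixed_size - len(record))

def unpad_record(row, length):
    """Recover the original record given its true length."""
    return row[:length]

row = pad_record([7, 1, 4])
assert row == [7, 1, 4, 0, 0, 0, 0, 0]
assert unpad_record(row, 3) == [7, 1, 4]
```

You would store the true lengths alongside the padded rows (for example in a
small companion dataset) so records can be recovered exactly.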

2) Store your data concatenated in a 1D dataset and use a second
dataset of start/end indexes to get the data for each element.  This
is basically how the HDF5 variable-length datatype works; you are just
implementing it at a higher level using the public API.  This probably
will result in a larger file than the first scheme that I mentioned,
and will have a slower access time but your mileage may vary so you'll
want to try both on realistic data.  It does have the advantage of
handling any size data.
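A plain-Python sketch of this start/end-index scheme (the pack/unpack
helpers are illustrative, not part of the HDF5 API; in practice "data" and
"index" would each be a chunked, compressed HDF5 dataset):

```python
# Sketch of option 2: concatenate all records into one 1D "data"
# array and keep a parallel array of (start, end) index pairs --
# essentially what the HDF5 vlen type does, but at a higher level.
def pack(records):
    """Flatten records into one list plus (start, end) index pairs."""
    data, index = [], []
    pos = 0
    for rec in records:
        data.extend(rec)
        index.append((pos, pos + len(rec)))
        pos += len(rec)
    return data, index

def unpack(data, index, i):
    """Recover record i via its stored index pair."""
    start, end = index[i]
    return data[start:end]

data, index = pack([[1, 2, 3], [], [9, 9]])
assert data == [1, 2, 3, 9, 9]
assert unpack(data, index, 1) == []
assert unpack(data, index, 2) == [9, 9]
```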

I've also kicked around the idea of a hybrid scheme that uses a
fixed-size dataset where the first bytes are interpreted as indexes
into a secondary concatenated dataset if a magic byte, bit in a bitmap
index, etc. is set, but this would be harder to implement in a nice
way using the public API.
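A plain-Python sketch of that hybrid idea (SLOT, MAGIC, and the helpers are
purely illustrative assumptions): small records live inline in a fixed-size
slot, while oversized ones overflow into a secondary concatenated array:

```python
# Sketch of the hybrid scheme: records that fit go in a fixed-size
# slot; oversized records set a magic marker and the slot instead
# stores (start, end) indexes into an overflow array.
SLOT = 6    # fixed slot width (assumption)
MAGIC = -1  # sentinel flagging an overflow slot (assumption)

def store(records):
    slots, overflow = [], []
    for rec in records:
        if len(rec) <= SLOT - 1:
            # inline layout: [length, data..., zero padding]
            slots.append([len(rec)] + list(rec) + [0] * (SLOT - 1 - len(rec)))
        else:
            start = len(overflow)
            overflow.extend(rec)
            # overflow layout: [MAGIC, start, end, zero padding]
            slots.append([MAGIC, start, len(overflow)] + [0] * (SLOT - 3))
    return slots, overflow

def fetch(slots, overflow, i):
    slot = slots[i]
    if slot[0] == MAGIC:
        return overflow[slot[1]:slot[2]]
    return slot[1:1 + slot[0]]

slots, overflow = store([[1, 2], list(range(10))])
assert fetch(slots, overflow, 0) == [1, 2]
assert fetch(slots, overflow, 1) == list(range(10))
```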

Cheers,

Dana

 Thanks!
 Marcial

 --
 View this message in context: 
 http://hdf-forum.184993.n3.nabble.com/Vlen-compression-tp3396836p3396836.html
 Sent from the hdf-forum mailing list archive at Nabble.com.



Re: [Hdf-forum] szip Linking error

2010-06-09 Thread Dana Robinson
Have you tried adding the path to the szip dll to your PATH environment
variable?

Dana

On Wed, Jun 9, 2010 at 7:28 PM, kunal rao kunalg...@gmail.com wrote:

 I tried by setting the environment variable HDF5_EXT_SZIP with value:

 1) C:\MyHDFStuff\szip\dll\szip.lib
 2) C:\MyHDFStuff\szip\dll\szip.dll

 Neither works. If I disable szip compression, then it builds
 successfully.

 What might be going wrong?

 Thanks & Regards,
 Kunal


 On Wed, Jun 9, 2010 at 8:19 PM, kunal rao kunalg...@gmail.com wrote:

 I am linking against the .lib, as mentioned in the document.


 On Wed, Jun 9, 2010 at 8:08 PM, Mike Jackson mike.jack...@bluequartz.net
  wrote:

 Are you linking against the .lib or the .dll file?

 -
 Mike Jackson   www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio


 On Jun 9, 2010, at 19:43, kunal rao kunalg...@gmail.com wrote:

 Hi,

 I am trying to build hdf5-1.8 with Visual Studio 2008 under Windows HPC
 Server 2008 (x64).

 I am doing the exact steps as mentioned in INSTALL_Windows.txt.

 I have set the environment variables:

 HDF5_EXT_ZLIB
 HDF5_EXT_SZIP

 and also included the header and library directories for them in Visual
 Studio. But when I build the solution, I get the following error:

 

 5Linking...
 13-- Build started: Project: h5repart_gentest, Configuration: Debug
 x64 --
 8C:\MyHDFStuff\szip\dll\szip.dll : fatal error LNK1107: invalid or
 corrupt file: cannot read at 0x2E8
 6C:\MyHDFStuff\szip\dll\szip.dll : fatal error LNK1107: invalid or
 corrupt file: cannot read at 0x2E8

 

 I have downloaded szip from
 ftp://ftp.hdfgroup.org/lib-external/szip/2.1/bin/windows
 and have selected szip-2.1-win64-vs2008-enc.zip.

 Can you please suggest how I should rectify this?

 Thanks & Regards,
 Kunal




Re: [Hdf-forum] Building with Debug but 1 project fails in Release mode

2010-06-09 Thread Dana Robinson
Do you have _HDF5USEDLL_ defined for the release as well as the debug
projects?

Dana


On Wed, Jun 9, 2010 at 8:11 PM, kunal rao kunalg...@gmail.com wrote:

 Hi,

 I am building HDF5-1.8 with Visual Studio 2008 under Windows HPC Server
 2008 (x64).

 I am able to build the solution successfully in Debug mode but in Release
 mode 1 project fails. The result of the build is:


 == Build: 160 succeeded, 1 failed, 1 up-to-date, 1 skipped
 ==

 The failed project info is:


 -

 61objcopy.obj : error LNK2019: unresolved external symbol __imp_h5_reset
 referenced in function main

 61..\..\..\test\objcopydll\Release\objcopydll.exe : fatal error LNK1120:
 111 unresolved externals

 74Linking...

 61Build log was saved at
 file://c:\MyHDFStuff\hdf5\test\objcopydll\Release\BuildLog.htm

 61objcopydll - 112 error(s), 0 warning(s)


 ---

 I am attaching the complete log of the particular failed project. Kindly
 have a look and please let me know what I am missing.

 Thanks & Regards,

 Kunal



Re: [Hdf-forum] Won't Let Go

2010-04-23 Thread Dana Robinson
The .NET garbage collector only manages memory, not system resources.  It
won't clean up window and file handles, database connections, etc.  Windows
will complain if you try to move a file that has an open file handle, so the
file needs to be fully closed.  You can check as per Peter's suggestion and
close anything that is still open.

Dana

On Fri, Apr 23, 2010 at 3:17 PM, Peter Cao x...@hdfgroup.org wrote:

 Garbage collection may not be able to close everything opened by HDF5
 calls.  You can use H5Fget_obj_count() to check whether there are any
 open objects.

 Mitchell, Scott - IS wrote:


 I’m running an application as a Windows Service. At the end of a test, the
 datasets & file are closed. And since this is .NET, I’ve forced garbage
 collection to make sure all that stuff is freed. The service continues to
 run.

 However when I try to modify, move, whatever the HDF5 files that I’ve just
 finished, Windows (Server 2008) won’t let me because it believes my service
 is still connected. I end up having to restart the service before they’re
 free.

 Is there something else I need to do to get HDF to let go of them?

 Scott


 






Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-22 Thread Dana Robinson
Does the SP1 patch replace the msvcr80.dll file?  If so, then everything
should be fine.  If not, and there are two versions of msvcr80 on the system
(perhaps using SxS deployment), then you may have problems since different
dlls have different state and won't share CRT resources.

Dana

On Mon, Mar 22, 2010 at 9:58 AM, Michael Jackson 
mike.jack...@bluequartz.net wrote:

 As I am still learning about this myself, does this mean that I can build
 something with an RTM version of Visual Studio 2005 and still link against
 the HDF5 libraries that were built with VS 2005 SP1 + Security Update? That
 would seem to indicate that my application would need/use the newer C/C++
 runtimes that the latest HDF5 needs? Every time I try something like
 this (usually through accident) I get random crashes and other
 irregularities with HDF5 operation.

 ___
 Mike Jackson  www.bluequartz.net

 On Mar 22, 2010, at 10:56 AM, Allen D Byrne wrote:

  I think I have a handle on this runtime library problem. I was able to get
 consistent results and come to the following conclusion:

 Users must have a runtime library version at least equal to the version
 used to build the product. Also, it is compiler dependent: msvcr80.dll for
 VS2005 and msvcr90.dll for VS2008.

 I keep the machines that I build on up to date with MS updates, and there
 was a recent security patch that advanced the version of the runtime library
 for VS2005. I found the same version in a redistributable download from MS
 called VS2008 C++ security patch. Yes, it has the msvcr80.dll version needed.

 I have been able to rebuild all versions of szip (removed static
 dependency), zlib and hdf5 1.8.4-patch1 with the BIND define that was
 recommended, and I will be making all these versions available as soon as I
 get the documentation updated.

 Comments?

 Allen

  As an example of extreme naming take a look at 
 http://www.boost.org/doc/libs/1_42_0/more/getting_started/windows.html

 in section 6.3 Library Naming. This gets dicey quick as not every

 option is supported on every OS.

  Also, MS may be keeping the same version of the C/C++ runtimes but
 I have a hard time believing that the C/C++ runtimes between VS2008
 and VS2010 are going to be interchangeable ( glass is half-empty
 attitude) but I would be happy to be proven wrong.

 ___
 Mike Jackson  www.bluequartz.net




Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Dana Robinson
I don't know if this solves people's linking problems but I thought I'd
weigh in with what I know about Windows C runtime (CRT) linking.

From what I understand, it boils down to this:

Every library and executable in your application should use the same
instance of the C runtime library.

The reason for this is that separate dlls have a separate internal state and
thus different ideas about things like valid file handles and allocated
memory.  For example, if you malloc() some memory in a library linked to
msvcr80.dll and then try to realloc() or free() it in a program linked to
msvcr90.dll, you will have problems since msvcr90 doesn't know about the
first allocation's internal memory handles (writing to memory allocated
using a different C runtime should work fine, though, since the underlying
internal memory handles are not being manipulated).  Debug libraries (e.g.
msvcr80d.dll) are also different in this respect since they are different
dlls so using a library compiled in release mode with an executable compiled
in debug mode can cause problems.  Keep in mind that many things will appear
to work just fine when you mix C runtime libraries - it's just when you pass
CRT resources around that you'll run into issues.  It's also important to
remember that statically linking to a C runtime gives you a private version
of that runtime that is NOT shared with any other components.  For this
reason, it is recommended that you only statically link against a library
when you are building an executable and NOT when you are building a library.

You can check your dependencies with dumpbin if you have Visual Studio
installed.  Fire up a Visual Studio command prompt and run "dumpbin /imports
<path to library or program>" to see which dynamic libraries a particular
program or dll requires.  The existing 32-bit HDF binaries show a dependency
on msvcr80.dll (Visual C++ 8.0 / Visual Studio 2005), the VS2005 version of
zlib is also linked to msvcr80.dll and szip does not require any C library
functions (it's either just linked to kernel32 and uses Windows functions or
it incorrectly statically links to the CRT - I'll have to check that out).
The tools are an odd case - they are statically linked to HDF5 and have a
private version of the C runtime that is separate from, say, msvcr80.dll but
are also dynamically linked to zlib which is, in turn, linked to
msvcr80.dll.  This means that zlib code used by the tools and the rest of
the tool code will have different CRT states.

So this is (pretty much) all well and good as long as you are using Visual
Studio 2005.  Problems can arise, however, when you try to use Visual Studio
2008 to build an application that links against VS2005 versions of HDF5.
VS2008 will link your application against msvcr90.dll and this can cause the
issues described above.  Unfortunately, we do not distribute VS2008 binaries
on the HDF5 downloads page but I'm sure this will be rectified soon.

We're looking into the library issues here and we'll have more information
in the future, but I think a short term roadmap for us could be:

1) We need to offer Visual Studio 2008 HDF5 binaries that are linked against
msvcr90.dll.

2) Our binaries (HDF5, szip, zlib) need to include debug libraries so that
users can properly link when creating debug builds.

3) We need to educate our Windows users about these issues, probably via a
README document in the distribution.

4) Our static tools should be 100% statically linked.


Also, in general, you should use the official Windows CRT installers to
install the required C runtime, instead of just copying dlls around.  You
can find the proper installers on Microsoft's website (links which will
surely go stale soon provided below).  I have no idea why the CRTs aren't
just considered a normal part of a Windows installation and maintained
accordingly.  You'd think that would make everyone's life easier.

Visual Studio 2005 SP1
http://www.microsoft.com/downloads/details.aspx?displaylang=enFamilyID=200b2fd9-ae1a-4a14-984d-389c36f85647

Visual Studio 2008 SP1
http://www.microsoft.com/downloads/details.aspx?displaylang=enFamilyID=fbee1648-7106-44a7-9649-6d9f6d58056e


I hope that helps.

Dana Robinson
Bioinformatics Software Engineer
The HDF Group


Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Dana Robinson
 to the Service Pack AND Security Patch Level.
  RULE #2: NEVER break rule #1.

 Following those rules will allow you to have a much easier time with
 debugging/running/compiling on windows. What do I mean by exact versions?
 Each library that you are going to compile into your application MUST be
 linked against the same version of the C/C++ runtime libraries with respect
 to the following:
  Single/Multi Threaded
  Static/Dynamic Linked to the Runtimes
  Debug/Release Versions of each runtime.

 Again, following the above will help the developer have a much easier
 time with Visual Studio Development.

 If you get link warnings about adding NODEFAULTLIB then you are NOT
 following the above rules and nice random crashes and bugs WILL pop up in
 your program.

  What can the HDF Group do to help? PLEASE PLEASE PLEASE list the exact
 version of Visual Studio used to compile the binaries. This should be
 something like Visual Studio 2008 SP1 (9.0.xxx.xxx SP1) or, if the original
 Visual Studio 2008 was used, (9.0.xxx.xxx RTM). These values can be found by
 going to Help -> About Visual Studio. Also explicitly list HOW they were compiled
 with respect to Single/MultiThreaded, Static/Dynamic Runtime linkage,
 Dynamic/Static libraries, Debug/Release compiles.

  Having this information on the download page will help alleviate help
 emails from confused users.

  Again, these are my experiences, Yours may vary. Following those rules
 helps keep things simple for Visual Studio Development.
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio


 On Mar 17, 2010, at 12:29 PM, Dana Robinson wrote:

  I don't know if this solves people's linking problems but I thought I'd
 weigh in with what I know about Windows C runtime (CRT) linking.

 From what I understand, it boils down to this:

 Every library and executable in your application should use the same
 instance of the C runtime library.

 The reason for this is that separate dlls have a separate internal state
 and thus different ideas about things like valid file handles and allocated
 memory.  For example, if you malloc() some memory in a library linked to
 msvcr80.dll and then try to realloc() or free() it in a program linked to
 msvcr90.dll, you will have problems since msvcr90 doesn't know about the
 first allocation's internal memory handles (writing to memory allocated
 using a different C runtime should work fine, though, since the underlying
 internal memory handles are not being manipulated).  Debug libraries (e.g.
 msvcr80d.dll) are also different in this respect since they are different
 dlls so using a library compiled in release mode with an executable 
 compiled
 in debug mode can cause problems.  Keep in mind that many things will 
 appear
 to work just fine when you mix C runtime libraries - it's just when you 
 pass
 CRT resources around that you'll run into issues.  It's also important to
 remember that statically linking to a C runtime gives you a private version
 of that runtime that is NOT shared with any other components.  For this
 reason, it is recommended that you only statically link against a library
 when you are building an executable and NOT when you are building a 
 library.

 You can check your dependencies with dumpbin if you have Visual Studio
 installed.  Fire up a Visual Studio command prompt and run "dumpbin /imports
 <path to library or program>" to see which dynamic libraries a particular
 program or dll requires.  The existing 32-bit HDF binaries show a 
 dependency
 on msvcr80.dll (Visual C++ 8.0 / Visual Studio 2005), the VS2005 version of
 zlib is also linked to msvcr80.dll and szip does not require any C library
 functions (it's either just linked to kernel32 and uses Windows functions 
 or
 it incorrectly statically links to the CRT - I'll have to check that out).
  The tools are an odd case - they are statically linked to HDF5 and have a
 private version of the C runtime that is separate from, say, msvcr80.dll 
 but
 are also dynamically linked to zlib which is, in turn, linked to
 msvcr80.dll.  This means that zlib code used by the tools and the rest of
 the tool code will have different CRT states.

 So this is (pretty much) all well and good as long as you are using
 Visual Studio 2005.  Problems can arise, however, when you try to use 
 Visual
 Studio 2008 to build an application that links against VS2005 versions of
 HDF5.  VS2008 will link your application against msvcr90.dll and this can
 cause the issues described above.  Unfortunately, we do not distribute
 VS2008 binaries on the HDF5 downloads page but I'm sure this will be
 rectified soon.

 We're looking into the library issues here and we'll have more
 information in the future, but I think a short term roadmap for us could 
 be:

 1) We need to offer Visual Studio 2008 HDF5 binaries

Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Dana Robinson
Good point.  I hadn't thought of that.  I'll leave them unchanged.

Dana

On Wed, Mar 17, 2010 at 3:08 PM, Michael Jackson 
mike.jack...@bluequartz.net wrote:

 Please don't do that...

  Boost does this and absolutely wreaks havoc on things like CMake and other
 build tools that are relying on a library naming scheme NOT changing. In
 particular if you are going to switch up to CMake in the near future you are
 going to have to sync with CMake for releases so that FindHDF5.cmake can
 stay constantly updated with all the new flavors of Visual Studio, GCC, ICC,
 Absoft, Borland that come out. Read back through some of the CMake archives
 and look for things like "CMake can't find Boost, but I have it installed."
 Messages like that hit about once a month because Boost seems to want to
 change their naming scheme at their leisure. Now look at Qt from Nokia.
 CMake has FindQt4.cmake which doesn't get that many updates and seems to
 always work because the library naming is consistent from release to release
 and across platforms.

  So in light of this I would make it painfully obvious on your download site
 how your binaries were built. Eclipse has a click through page for OS X
 downloads that has a short blurb regarding the needed things for Eclipse on
 64 Bit OS X. I suggest the HDFGroup does something like that also. Heck, you
 could have CMake configure a Compiler_Used.txt file for each build with a
 description of how the binaries were built.

  Thanks for listening.

 ___
 Mike Jackson  www.bluequartz.net

 On Mar 17, 2010, at 3:51 PM, Dana Robinson wrote:
 -8---

  In particular, I'd like for our dlls to be named zlib90.dll,
 hdf590d.dll and so forth, which I think might make incorrect linking more
 apparent.

 -8---

