Re: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing

2013-05-25 Thread Michael Jackson
Bernd,
   Run the CMake GUI (cmake-gui) and make sure the variable BUILD_SHARED_LIBS is
OFF. Then click the Configure button and then the Generate button.
The Visual Studio solution is then ready to open and compile.
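For anyone scripting this rather than using the GUI, the equivalent command line is roughly the following (the generator name is just an example; run from an empty build directory):

```shell
# Configure a static build: BUILD_SHARED_LIBS is the cache variable the
# CMake GUI toggles, then build the Release configuration.
cmake -G "Visual Studio 9 2008" -DBUILD_SHARED_LIBS:BOOL=OFF ..
cmake --build . --config Release
```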

--

Mike Jackson
_
Mike Jackson                 mike.jack...@bluequartz.net
BlueQuartz Software          www.bluequartz.net
Principal Software Engineer  Dayton, Ohio


On Sat, May 25, 2013 at 11:06 AM, Bernd Rinn br...@ethz.ch wrote:
 I just tried to compile HDF5 1.8.11 with Visual Studio 9 using CMake. I
 need to link the library statically, so I need to use the /MT flag for
 the C compiler; however, I can't find a way to do it. I tried this:

 1. Create build\ subdirectory and change into it

 2. Call

 cmake -G "Visual Studio 9 2008" -DBUILD_TESTING:BOOL=ON
 -DHDF5_BUILD_TOOLS:BOOL=ON ..

 3. Open the file CMakeCache.txt and replace all occurrences of /MD with /MT.

 For example, I replace the line

 CMAKE_C_FLAGS_RELEASE:STRING=/MD /O2 /Ob2 /D NDEBUG

 with

 CMAKE_C_FLAGS_RELEASE:STRING=/MT /O2 /Ob2 /D NDEBUG
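Step 3 can also be scripted rather than done by hand in an editor. A sketch, operating on a throwaway copy in /tmp (note that, as reported further down, editing the cache after the solution has been generated may still not take effect):

```shell
# Rewrite the dynamic runtime flags (/MD, /MDd) in a CMakeCache.txt-style
# file to their static equivalents (/MT, /MTd). The sample line mirrors
# the one quoted above.
cache=/tmp/CMakeCache.txt
printf 'CMAKE_C_FLAGS_RELEASE:STRING=/MD /O2 /Ob2 /D NDEBUG\n' > "$cache"
sed -i -e 's|/MDd|/MTd|g' -e 's|/MD|/MT|g' "$cache"
cat "$cache"   # CMAKE_C_FLAGS_RELEASE:STRING=/MT /O2 /Ob2 /D NDEBUG
```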

 4. Call

 cmake --build . --config Release

 The build still uses /MD, as I can see e.g. in
 /build/src/hdf5.dir/Release/BuildLog.htm, where these flags are listed:

 /O2 /Ob2 /I C:\JHDF5\hdf5\hdf5-1.8.11\build /I
 C:\JHDF5\hdf5\hdf5-1.8.11\src /I C:\JHDF5\hdf5\hdf5-1.8.11\build\bin
 /I C:\JHDF5\hdf5\hdf5-1.8.11 /D WIN32 /D _WINDOWS /D NDEBUG /D
 BIND_TO_CURRENT_VCLIBS_VERSION=1 /D _CRT_SECURE_NO_WARNINGS /D
 _CONSOLE /D CMAKE_INTDIR=\Release\ /D _MBCS /FD /MD
 /Fohdf5.dir\Release\\
 /FdC:\JHDF5\hdf5\hdf5-1.8.11\build\bin\Release/libhdf5.pdb /W3 /c /TC
   /Zm1000

 How do I make the compiler actually use the flag /MT?
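A note for archive readers: one approach that usually sticks is to seed the flags in the cache at the very first configure, from a clean build directory, instead of editing CMakeCache.txt after the Visual Studio projects have been generated. A sketch, reusing the options from step 2 above:

```shell
# From an empty build directory: set the Release C flags (with /MT) at
# configure time so the generated projects pick them up.
cmake -G "Visual Studio 9 2008" -DBUILD_TESTING:BOOL=ON ^
      -DHDF5_BUILD_TOOLS:BOOL=ON ^
      -DCMAKE_C_FLAGS_RELEASE:STRING="/MT /O2 /Ob2 /D NDEBUG" ..
```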

 Thanks,

 Bernd


 On 2013-04-16 00:21, Bernd Rinn wrote:
 Thanks for the info. I'd suggest that you adapt
 release_docs/INSTALL_Windows.txt accordingly. It still reads:

 
 The old solutions and projects found in the windows\ folder will be
 maintained for legacy users until HDF5 1.10.
 

 BTW: Are the compile errors I get due to using the legacy build scripts?

 Bernd

 On 2013-04-16 00:08, Dana Robinson wrote:
 The legacy Windows build scripts have been deprecated and removed.  We
 now only support using CMake to construct Visual Studio solutions.
  Instructions for building with CMake can be found in the release_docs
 directory.

 Dana


 On Mon, Apr 15, 2013 at 4:59 PM, Bernd Rinn br...@ethz.ch
 mailto:br...@ethz.ch wrote:

 Hello Albert,

 The legacy Windows build scripts (subdirectory windows\) seem to be
 missing from hdf5-1.8.11-pre1.tar.gz. When I copy this directory over from
 hdf5-1.8.10-patch1.tar.bz2 and compile with Visual Studio 2008 on Windows
 XP (32-bit), I get the following fatal errors:

 
 -
 Error   113 error C2054: expected '(' to follow 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76  hdf5

 Error   114 error C2085: 'H5PLget_plugin_type' : not in formal
 parameter list  c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76
hdf5

 Error   115 error C2061: syntax error : identifier 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 77  hdf5

 Error   123 error C2054: expected '(' to follow 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76  hdf5

 Error   124 error C2085: 'H5PLget_plugin_type' : not in formal
 parameter list  c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 76
hdf5

 Error   125 error C2061: syntax error : identifier 'H5PLUGIN_DLL'
 c:\jhdf5\hdf5\hdf5-1.8.11-pre1\src\H5PLextern.h 77  hdf5
 
 -

 Any idea what I need to do in order to fix those errors?

 Best regards,

 Bernd

 On 2013-04-12 22:59, Albert Cheng wrote:
  Hello everyone,
 
  A pre-release candidate version of HDF5 1.8.11 is available for
 testing
  and can be downloaded at the following link:
 
 
 
 http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz
 
 
  If you have some time to test this pre-release, we would greatly
  appreciate it. We try to test on a wide variety of platforms and
  environments but are unable to test everywhere so feedback from
 the user
  community is always welcome.
 
  Please note that while the release notes contained in the pre-release
  are reflective of the changes and additions present in this
 release, the
  'platforms tested' and 'tested configurations' sections have yet to be
  updated for this version of HDF5.
 
  We plan to release HDF5 1.8.11 in mid-May barring the discovery of any
  critical issues.
 
  Thank you!
 
  The HDF Group
 
 
   

Re: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing -- HighLevel library missing

2013-04-19 Thread Michael Jackson
Actually, for #2 below, the HDF5 devs _could_ do this, but it might be fairly
tedious. The very top-level CMakeLists.txt file starts with a line like:

project(HDF5)

It would have to be constantly updated to
project(HDF5-1.8.10.pre.foo.1.whatever), and I am not sure what that really
buys the developer.

Typically I have several versions of HDF5 being compiled, but I keep them all in
separate folders in my workspace. I grant that, from a Recent Files point of view
within Visual Studio, all of those show up as HDF5.sln, which may not be a big
help.

 There are also other IDEs and environments that depend on having a stable 
project name. Imagine the following:

Say I am using the latest Git version of HDF5. I check out the code and
generate my solutions. Then the HDF5 devs increment the version number in the
project() command. If I do NOT notice that this has happened, I am going
to end up using an out-of-date solution file, because the next time CMake runs
it will generate a whole NEW solution file with the new version number, but I'll
still be using the older solution, which would cause all sorts of problems.

 So it is best to just leave it the way it is.

Thanks
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Apr 19, 2013, at 2:59 PM, Albert Cheng wrote:

 Hi Pedro,
 
 Thank you for your feedback.
 I think you are on a Windows system, but could you give more details, like
 OS version, platform specifics (32 or 64 bits, ...), compilers, ...
 
 I will check with our CMake person about your suggestions.
 A question about your suggestion 2: wouldn't it be better
 to use folder names to distinguish different versions?  E.g.,
 
 ...\hdf5-1.8.11-pre1\HDF5.sln
 ...\hdf5-1.8.10\HDF5.sln
 
 On 4/18/13 9:47 PM, Pedro Vicente wrote:
 Hi Albert
 
 I tried the CMake build with
 
 T:\hdf5-1.8.11-pre1\build>cmake ..
 
 and unlike the previously reported errors with hdf5-1.8.10, there were no errors this time.
 
 However, the High-level library is not included in the build.
 
 The netCDF build requires the Dimension Scales library, so this is a must 
 for netCDF users.
 
 1) Would it be possible to have a CMake build for the High-level library?
 
 2) I have a suggestion: could you name the CMake-generated Visual Studio
 solution with the HDF5 release version name, like
 
 hdf5-1.8.11-pre1.sln
 
 instead of
 
 HDF5.sln
 
 This is helpful for anyone who uses several versions.
 
 Thanks
 
 Pedro
 
 
 
 --
 Pedro Vicente, Earth System Science
 University of California, Irvine
 http://www.ess.uci.edu/
 
 
 - Original Message - From: Albert Cheng ach...@hdfgroup.org
 To: HDF Users Discussion List hdf-forum@hdfgroup.org
 Sent: Friday, April 12, 2013 1:59 PM
 Subject: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing
 
 
 Hello everyone,
 
 A pre-release candidate version of HDF5 1.8.11 is available for testing and 
 can be downloaded at the following link:
 
 http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz
  
 
 If you have some time to test this pre-release, we would greatly appreciate 
 it. We try to test on a wide variety of platforms and environments but are 
 unable to test everywhere so feedback from the user community is always 
 welcome.
 
 Please note that while the release notes contained in the pre-release are 
 reflective of the changes and additions present in this release, the 
 'platforms tested' and 'tested configurations' sections have yet to be 
 updated for this version of HDF5.
 
 We plan to release HDF5 1.8.11 in mid-May barring the discovery of any 
 critical issues.
 
 Thank you!
 
 The HDF Group
 
 
 ___
 Hdf-forum is for HDF software users discussion.
 Hdf-forum@hdfgroup.org
 http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
 
 
 
 
 
 
 




[Hdf-forum] Reshaping an Array in HDFView

2013-04-08 Thread Michael Jackson
Is there a way to reshape an array with HDFView? I have an array that is 
stored as a 1D array but in reality it can be interpreted as a 2D Image. So I 
would like to be able to open the data array as an image in HDFView but 
somehow tell the dialog box to convert the array from a 1D to a 2D array.

Is this possible?

Thanks
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] Cmake Windows build with errors -- missing Fortran

2013-04-01 Thread Michael Jackson


On Mar 30, 2013, at 12:05 PM, Pedro Vicente wrote:

 Hi Allen
 
 when doing
 
 T:\hdf5-1.8.10-patch1\build>cmake -C ../config/cmake/cacheinit.cmake
 add " ." ({space}{dot}) at the end of the command
 
 
 ah, I missed that dot.
 I guess you actually mean two dots (the folder beneath), since when I used one
 dot I got an error
 
 my build path is
 
 T:\hdf5-1.8.10-patch1\build
 
 anyway, I did
 
 T:\hdf5-1.8.10-patch1\build>cmake -C ../config/cmake/cacheinit.cmake ..
 -DSZIP_INCLUDE_DIR:PATH=T:\szip\src
 -DSZIP_LIBRARY:FILEPATH=T:\szip\all\lib\Debug\szlib.lib
 -DZLIB_INCLUDE_DIR:PATH=T:\zlib-1.2.3
 -DZLIB_LIBRARY:FILEPATH=T:\zlib-1.2.3\projects\visualc6\Win32_LIB_Debug\zlibd.lib
 > out.txt
 
 where the zlib and szip are my own szip and zlib locations
 
 I get errors and warnings.
 It seems that:
 
 1) my zlib and szip locations are not detected
 2) it complains that I don't have a Fortran compiler (I don't) and errors out
 because of this
 
 Maybe this is a bug in the CMake build?
 
 
 the relevant messages are:
 
 
 
 loading initial cache file ../config/cmake/cacheinit.cmake
 CMake Warning at CMakeLists.txt:562 (FIND_PACKAGE):
 Could not find a package configuration file provided by ZLIB with any of
 the following names:
 
   zlibConfig.cmake
   zlib-config.cmake
 
 Add the installation prefix of ZLIB to CMAKE_PREFIX_PATH or set
 ZLIB_DIR to a directory containing one of the above files.  If ZLIB
 provides a separate development package or SDK, be sure it has been
 installed.
 
 
 CMake Warning at CMakeLists.txt:606 (FIND_PACKAGE):
 Could not find a package configuration file provided by SZIP with any of
 the following names:
 
   szipConfig.cmake
   szip-config.cmake
 
 Add the installation prefix of SZIP to CMAKE_PREFIX_PATH or set
 SZIP_DIR to a directory containing one of the above files.  If SZIP
 provides a separate development package or SDK, be sure it has been
 installed.
 
 
 CMake Error at C:/Program Files (x86)/CMake 
 2.8/share/cmake-2.8/Modules/CMakeTestFortranCompiler.cmake:54 (message):
 The Fortran compiler ifort is not able to compile a simple test program.
 
 It fails with the following output:
 
  Change Dir: T:/hdf5-1.8.10-patch1/build/CMakeFiles/CMakeTmp
 
 T:\hdf5-1.8.10-patch1\build\CMakeFiles\CMakeTmp\cmTryCompileExec4230441850.vfproj'
 cannot be opened because its project type (.vfproj) is not supported by
 this version of the application.
 
 
 CMake will not be able to correctly generate this project.
 Call Stack (most recent call first):
 config/cmake/HDF5UseFortran.cmake:5 (ENABLE_LANGUAGE)
 CMakeLists.txt:752 (INCLUDE)
 
 
 
 -- Checking IF alignment restrictions are strictly enforced... yes
 -- Found ZLIB: T:/zlib-1.2.3/projects/visualc6/Win32_LIB_Debug/zlibd.lib 
 (found version 1.2.3)
 -- Filter ZLIB is ON
 -- Looking for SZIP_BUILT_AS_DYNAMIC_LIB
 -- Looking for SZIP_BUILT_AS_DYNAMIC_LIB - not found
 -- Filter SZIP is ON
 -- Check for working Fortran compiler using: Visual Studio 10
 -- Check for working Fortran compiler using: Visual Studio 10  -- broken
 -- Configuring incomplete, errors occurred!
 
 
 
 
 --
 Pedro Vicente, Earth System Science
 University of California, Irvine
 http://www.ess.uci.edu/
 
 
 - Original Message - From: Allen D Byrne b...@hdfgroup.org
 To: hdf-forum@hdfgroup.org
 Cc: h...@hdfgroup.org
 Sent: Friday, March 29, 2013 11:40 AM
 Subject: Re: [Hdf-forum] Cmake build witk zlib errors
 
 
 On Friday, March 29, 2013 09:48:19 AM Pedro Vicente wrote:
 Hi HDF Group
 
 I am having some trouble using the Cmake build to build the latest  HDF5 in
 Windows
 
 What I did:
 ...
 
 I followed the instructions on the file /release_docs/Cmake.txt
 
 that says
 
 cmake -C <sourcepath>/config/cmake/cacheinit.cmake -G <generator>
 [-D<options>] <sourcepath>
   where <options> are:
   * SZIP_INCLUDE_DIR:PATH=<path to szip includes directory>
   * SZIP_LIBRARY:FILEPATH=<path to szip library file>
   * ZLIB_INCLUDE_DIR:PATH=<path to zlib includes directory>
   * ZLIB_LIBRARY:FILEPATH=<path to zlib library file>
   * HDF5OPTION:BOOL=[ON | OFF]
 
 
 when doing
 
 T:\hdf5-1.8.10-patch1\build>cmake -C ../config/cmake/cacheinit.cmake
 add " ." ({space}{dot}) at the end of the command
 
 loading initial cache file ../config/cmake/cacheinit.cmake
 
 there was this error message
 
 CMake Error: The source directory
 T:/hdf5-1.8.10-patch1/config/cmake/cacheinit.cmake is a file, not a
 directory.
 Specify --help for usage, or press the help button on the CMake GUI.
 
 
 
 Thank you in advance for any help on this
 
 
 
 --
 Pedro Vicente, Earth System Science
 University of California, Irvine
 http://www.ess.uci.edu/

Did you try configuring with a clean build directory?
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net



[Hdf-forum] Unknown KEYWORD using h5import

2013-02-28 Thread Michael Jackson
 I have the following script that I am writing, which in the end is going to
generate one of our DREAM3D-style HDF5 files. I have a self-compiled HDF5
1.8.10-patch1 on an OS X 10.6.8 system. Here is the script:

#!/bin/bash
# This script will attempt to convert ascii text files into a .dream3d file
# We assume that the h5import program is in the current path

# First we need the dimensions, resolution and origin of the voxel data
HDF5_INSTALL=/Users/Shared/Toolkits/hdf5-1.8.10
PATH=$PATH:$HDF5_INSTALL/bin

dimsFile=/tmp/dims.txt
echo 25 50 10 > $dimsFile
dimsConfig=/tmp/dims.config
echo PATH VoxelDataContainer DIMENSIONS > $dimsConfig
echo INPUT-CLASS TEXTIN >> $dimsConfig
echo RANK 1 >> $dimsConfig
echo DIMENSION-SIZES 3 >> $dimsConfig
echo OUTPUT-CLASS IN >> $dimsConfig
echo OUTPUT-SIZE 64 >> $dimsConfig
echo OUTPUT-ARCHITECTURE IEEE >> $dimsConfig
echo OUTPUT-BYTE-ORDER LE >> $dimsConfig

resFile=/tmp/res.txt
echo 1.0 0.75 0.25 > $resFile
resConfig=/tmp/res.config
echo PATH VoxelDataContainer SPACING > $resConfig
echo INPUT-CLASS TEXTFP >> $resConfig
echo RANK 1 >> $resConfig
echo DIMENSION-SIZES 3 >> $resConfig
echo OUTPUT-CLASS FP >> $resConfig
echo OUTPUT-SIZE 32 >> $resConfig
echo OUTPUT-ARCHITECTURE IEEE >> $resConfig
echo OUTPUT-BYTE-ORDER LE >> $resConfig

orgFile=/tmp/origin.txt
echo 0 0 0 > $orgFile
orgConfig=/tmp/origin.config
echo PATH VoxelDataContainer ORIGIN > $orgConfig
echo INPUT-CLASS TEXTFP >> $orgConfig
echo RANK 1 >> $orgConfig
echo DIMENSION-SIZES 3 >> $orgConfig
echo OUTPUT-CLASS FP >> $orgConfig
echo OUTPUT-SIZE 32 >> $orgConfig
echo OUTPUT-ARCHITECTURE IEEE >> $orgConfig
echo OUTPUT-BYTE-ORDER LE >> $orgConfig

h5import $dimsFile -config $dimsConfig $resFile -config $resConfig $orgFile \
-config $orgConfig -outfile /tmp/out.dream3d

When I run the script I get the following error:
[mjackson@Ferb:Desktop]$ ./AsciiToDream3D.sh 
Unknown keyword in configuration file: /tmp/dims.config
Error in processing the configuration file: /tmp/dims.config.
Program aborted.

I followed one of the examples given in the help for h5import. Any idea what 
h5import is complaining about? Also, as a usage note, it would be nice if the 
error message actually said which keyword h5import had an issue with. That 
would help debugging a lot.
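For later readers: writing the same configuration with a here-document makes the keywords easier to eyeball than a chain of echoes (the content is identical to the dims.config built above):

```shell
# Recreate dims.config with a here-doc; h5import reads one keyword per
# line, so this layout matches what the echo chain above produces.
cat > /tmp/dims.config <<'EOF'
PATH VoxelDataContainer DIMENSIONS
INPUT-CLASS TEXTIN
RANK 1
DIMENSION-SIZES 3
OUTPUT-CLASS IN
OUTPUT-SIZE 64
OUTPUT-ARCHITECTURE IEEE
OUTPUT-BYTE-ORDER LE
EOF
grep -c '^OUTPUT' /tmp/dims.config   # prints 4
```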

Thanks
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] Unknown KEYWORD using h5import

2013-02-28 Thread Michael Jackson
I guess. Just the header part then?

HDF5 /Users/Shared/Data/SyntheticStructures/2Phase/2PhaseTexture.dream3d {
DATASET /VoxelDataContainer/DIMENSIONS {
   DATATYPE  H5T_STD_I64LE
   DATASPACE  SIMPLE { ( 3 ) / ( 3 ) }
}
}

That would work?

--
Mike Jackson www.bluequartz.net

On Feb 28, 2013, at 3:05 PM, Allen D Byrne wrote:

 1.8.10 h5import can now use h5dump output files; would that work?
 
 Allen
 
 On Thursday, February 28, 2013 11:38:02 AM Michael Jackson wrote:
 I have the following script that I am writing which in the end is going to
 generate one of our DREAM3D style hdf5 files. I have a self compiled hdf5
 1.8.10-patch 1 on an OS X 10.6.8 system. here is the script
 
 #!/bin/bash
 # This script will attempt to convert ascii text files into a .dream3d file
 # We assume that the h5import program is in the current path
 
 # First we need the dimensions, resolution and origin of the voxel data
 HDF5_INSTALL=/Users/Shared/Toolkits/hdf5-1.8.10
 PATH=$PATH:$HDF5_INSTALL/bin
 
 dimsFile=/tmp/dims.txt
 echo 25 50 10 > $dimsFile
 dimsConfig=/tmp/dims.config
 echo PATH VoxelDataContainer DIMENSIONS > $dimsConfig
 echo INPUT-CLASS TEXTIN >> $dimsConfig
 echo RANK 1 >> $dimsConfig
 echo DIMENSION-SIZES 3 >> $dimsConfig
 echo OUTPUT-CLASS IN >> $dimsConfig
 echo OUTPUT-SIZE 64 >> $dimsConfig
 echo OUTPUT-ARCHITECTURE IEEE >> $dimsConfig
 echo OUTPUT-BYTE-ORDER LE >> $dimsConfig
 
 resFile=/tmp/res.txt
 echo 1.0 0.75 0.25 > $resFile
 resConfig=/tmp/res.config
 echo PATH VoxelDataContainer SPACING > $resConfig
 echo INPUT-CLASS TEXTFP >> $resConfig
 echo RANK 1 >> $resConfig
 echo DIMENSION-SIZES 3 >> $resConfig
 echo OUTPUT-CLASS FP >> $resConfig
 echo OUTPUT-SIZE 32 >> $resConfig
 echo OUTPUT-ARCHITECTURE IEEE >> $resConfig
 echo OUTPUT-BYTE-ORDER LE >> $resConfig
 
 orgFile=/tmp/origin.txt
 echo 0 0 0 > $orgFile
 orgConfig=/tmp/origin.config
 echo PATH VoxelDataContainer ORIGIN > $orgConfig
 echo INPUT-CLASS TEXTFP >> $orgConfig
 echo RANK 1 >> $orgConfig
 echo DIMENSION-SIZES 3 >> $orgConfig
 echo OUTPUT-CLASS FP >> $orgConfig
 echo OUTPUT-SIZE 32 >> $orgConfig
 echo OUTPUT-ARCHITECTURE IEEE >> $orgConfig
 echo OUTPUT-BYTE-ORDER LE >> $orgConfig
 
 h5import $dimsFile -config $dimsConfig $resFile -config $resConfig $orgFile \
 -config $orgConfig -outfile /tmp/out.dream3d
 
 When I run the script I get the following error:
 [mjackson@Ferb:Desktop]$ ./AsciiToDream3D.sh
 Unknown keyword in configuration file: /tmp/dims.config
 Error in processing the configuration file: /tmp/dims.config.
 Program aborted.
 
 I followed one of the examples given in the help for h5import. Any idea what
 h5import is complaining about? Also as a usage note, it would be nice if
 the error message actually output which keyword h5import had an issue with.
 Would help the debugging a lot.
 
 Thanks
 ___
 Mike Jackson           Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 




[Hdf-forum] Turn off the version decoration on the library names

2013-02-26 Thread Michael Jackson
Is there a way to turn off the library naming decoration and just go with 
hdf5.so on Linux? There are two symlinks and an actual file, which is wreaking 
havoc on my CMake-based build system on the deployment side of things. Or has 
someone figured this out already?
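For context, the "two symlinks and an actual file" layout looks roughly like this (version numbers are illustrative):

```shell
# Recreate the typical versioned shared-library layout in a scratch
# directory: one real file plus two symlinks chained to it, which is
# what a deployment step has to resolve.
mkdir -p /tmp/hdf5libs
cd /tmp/hdf5libs
touch libhdf5.so.8.0.1                  # the actual library file
ln -sf libhdf5.so.8.0.1 libhdf5.so.8    # SONAME symlink
ln -sf libhdf5.so.8 libhdf5.so          # bare "linker name" symlink
ls -l libhdf5.so*
```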

Thanks for any help
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] recreating a file structure.

2013-02-22 Thread Michael Jackson
h5dump may be what you are looking for. It is part of the standard HDF5 
distribution.
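The switch of interest is -H, which prints structure without data (the filename is hypothetical):

```shell
# Print only the structure of an HDF5 file -- groups, datasets,
# datatypes, attributes -- without the raw data values.
h5dump -H somefile.h5
```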
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Feb 22, 2013, at 10:42 AM, Bruce Bowler wrote:

 Hi, I'm a newb at writing HDF files but have a small bit (really small) of 
 experience reading HDF files in IDL and MATLAB.
 
 I have an HDF file that contains some data.  I need to create a new file 
 that has different data but has the same structure as the extant file.  
 By structure, I mean vgroups, vdata, etc.  Ideally I'd be creating this 
 new file using MATLAB.  It needs to be the same so a bit of 3rd party 
 software that I don't have the source for can process the file properly.
 
 Is there a tool that will describe, in a fair bit of detail, the contents 
 of the extant file such that I can figure out what calls I need to make to 
 write the new one?
 
 Thanks.
 Bruce
 
 
 




Re: [Hdf-forum] Trouble writing 2GB dataset from single task with HDF5 1.8.10

2013-01-25 Thread Michael Jackson
I know there is/was a bug that affected OS X when trying to write a dataset 
larger than 2^31 bytes. It is fixed in the current development branch. I am 
not sure whether it affects Linux x64, but it sure looks like the stack 
traces I was getting.

The development code can be checked out from:

http://svn.hdfgroup.uiuc.edu/hdf5/branches/hdf5_1_8_10

Check whether the problem still persists there.
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net


On Jan 24, 2013, at 4:01 PM, Robert McLay wrote:

 I am unable to write a 2GB dataset from a single task. I have the same 
 problem with 1.8.8, 1.8.9, and 1.8.10. I have attached a Fortran 90 program 
 that shows the problem. I also have a C program that shows the same problem, 
 so I do not think this is a problem of Fortran-to-C conversion. The program 
 is a parallel f90 program run as a single task.
 
 Here is the error report:
 
 $  ./example
 HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 0:
   #000: H5Dio.c line 266 in H5Dwrite(): can't write data
 major: Dataset
 minor: Write failed
   #001: H5Dio.c line 673 in H5D__write(): can't write data
 major: Dataset
 minor: Write failed
   #002: H5Dmpio.c line 544 in H5D__contig_collective_write(): couldn't finish 
 shared collective MPI-IO
 major: Low-level I/O
 minor: Write failed
   #003: H5Dmpio.c line 1523 in H5D__inter_collective_io(): couldn't finish 
 collective MPI-IO
 major: Low-level I/O
 minor: Can't get value
   #004: H5Dmpio.c line 1567 in H5D__final_collective_io(): optimized write 
 failed
 major: Dataset
 minor: Write failed
   #005: H5Dmpio.c line 312 in H5D__mpio_select_write(): can't finish 
 collective parallel write
 major: Low-level I/O
 minor: Write failed
   #006: H5Fio.c line 158 in H5F_block_write(): write through metadata 
 accumulator failed
 major: Low-level I/O
 minor: Write failed
   #007: H5Faccum.c line 816 in H5F_accum_write(): file write failed
 major: Low-level I/O
 minor: Write failed
   #008: H5FDint.c line 185 in H5FD_write(): driver write request failed
 major: Virtual File Layer
 minor: Write failed
   #009: H5FDmpio.c line 1842 in H5FD_mpio_write(): MPI_File_write_at_all 
 failed
 major: Internal error (too specific to document in detail)
 minor: Some MPI function failed
   #010: H5FDmpio.c line 1842 in H5FD_mpio_write(): Invalid argument, error 
 stack:
 MPI_FILE_WRITE_AT_ALL(84): Invalid count argument
 major: Internal error (too specific to document in detail)
 minor: MPI Error String
 forrtl: severe (174): SIGSEGV, segmentation fault occurred
 Image              PC             Routine   Line      Source
 libpthread.so.0    0031C4D0C4F0   Unknown   Unknown   Unknown
 libc.so.6          0031C3E721E3   Unknown   Unknown   Unknown
 example            0071C646       Unknown   Unknown   Unknown
 libmpich.so.3      2B0682CBF48C   Unknown   Unknown   Unknown
 libmpich.so.3      2B0682E6AE91   Unknown   Unknown   Unknown
 libmpich.so.3      2B0682E6BDB2   Unknown   Unknown   Unknown
 example            004AE9EC       Unknown   Unknown   Unknown
 example            004A9014       Unknown   Unknown   Unknown
 example            00497040       Unknown   Unknown   Unknown
 example            004990A2       Unknown   Unknown   Unknown
 example            00693991       Unknown   Unknown   Unknown
 example            004597A9       Unknown   Unknown   Unknown
 example            0045C144       Unknown   Unknown   Unknown
 example            0043B6B4       Unknown   Unknown   Unknown
 
 Attached is the configuration data
 
 Attached is a program which produces the error report.  I compiled this 
 fortran 90 program with:
 
 h5pfc -g -O0 -o example example.f90
 
 Internal to the program is a variable LocalSz which is 646.  8*(646^3) is 
 bigger than 2*1024^3.   The program works if LocalSz is 645.
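The arithmetic is easy to check; the 2^31-byte boundary is crossed exactly between LocalSz = 645 and 646:

```shell
# 8-byte reals on a LocalSz^3 grid versus the 2 GiB (2^31 byte) limit.
echo $(( 8 * 646 * 646 * 646 ))             # 2156689088 bytes -- over
echo $(( 2 * 1024 * 1024 * 1024 ))          # 2147483648 bytes (2 GiB)
echo $(( 8 * 645 * 645 * 645 ))             # 2146689000 bytes -- under
```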
 
 Thanks for looking at this.
 
 
 
 -- 
 Robert McLay, Ph.D.
 TACC
 Manager, HPC Software Tools
 (512) 232-8104
 
 h5.config.txt
 example.f90




Re: [Hdf-forum] Using CMake to build HDF5 on a MAC OS X

2013-01-02 Thread Michael Jackson
Is this a new variable? I see a CMake variable called 
HDF5_BUILD_WITH_INSTALL_NAME. Oh, wait, if I use the advanced variables view 
in CMake, then I also see CMAKE_SKIP_INSTALL_RPATH and CMAKE_SKIP_RPATH. Are 
those what you mean? Otherwise I am not seeing that variable anywhere in my 
CMakeCache.txt file. This is with CMake 2.8.10.1 on OS X 10.6.8.
--
Mike Jackson www.bluequartz.net

On Jan 2, 2013, at 12:08 PM, Allen D Byrne wrote:

 CORRECTION: name of option is:
   CMAKE_BUILD_WITH_INSTALL_RPATH
 
 Notice the R in RPATH.
 
 Allen
 
 On Wednesday, January 02, 2013 10:42:38 AM Michael Jackson wrote:
 I tried this with the 1.8.10 version located at
 http://www.hdfgroup.org/ftp/HDF5/current/src/
 
 --
 Mike Jackson www.bluequartz.net
 
 On Jan 2, 2013, at 10:26 AM, Allen D Byrne wrote:
 I had a successful cmake build/test/install on a MAC by adding:
 -DCMAKE_BUILD_WITH_INSTALL_PATH:BOOL=OFF
 
 to the build options. This is with cmake 2.8.10.
 
 I would appreciate any community members verifying this as a solution.
 
 Allen
 
 




Re: [Hdf-forum] Testing large dataset write *FAILED*

2012-12-10 Thread Michael Jackson
How much RAM do you have in the machine in question? That test is probably 
trying to write a 2 GB or 4 GB contiguous chunk of RAM to disk, so you have to 
have at *least* that amount installed in the system. Are you running a 32- or 
64-bit system? The HDF folks will probably have a better explanation.
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Dec 10, 2012, at 5:03 AM, Syed Ahsan Ali Bokhari ahsan@gmail.com wrote:

 
 
 
 Dear All
 
 I am trying to install hdf5-1.8.6 on RHEL 5.3. I configured with the following 
 command
 
 ./configure --with-zlib=/usr/local/ --prefix=/usr/local/
 
 make check install fails with the following error
 
 
 
 Test passed with the Family Driver.
 
 Checking if file system supports big files...
 Testing big file with the SEC2 Driver 
 Testing large dataset write   *FAILED*
 Command terminated by signal 11
 0.02user 0.03system 0:00.06elapsed 100%CPU (0avgtext+0avgdata 0maxresident)k
 0inputs+0outputs (0major+7343minor)pagefaults 0swaps
 make[4]: *** [big.chkexe_] Error 1
 make[4]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[3]: *** [build-check-s] Error 2
 make[3]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[2]: *** [test] Error 2
 make[2]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make[1]: *** [check-am] Error 2
 make[1]: Leaving directory `/home/pmdtest/hdf5-1.8.6/test'
 make: *** [check-recursive] Error 1
 [root@pmd03 hdf5-1.8.6]# 
 
 Please help!
 
 Thanks
 Ahsan
 




Re: [Hdf-forum] Advice on Storing data with efficient access

2012-11-28 Thread Michael Jackson
Thanks for all the feedback. I'm going to start experimenting with some 
variations over the next few weeks to see which ones are efficient from an IO 
standpoint and reasonably easy to get at from a coding perspective.

Thanks
___
Mike Jackson           Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Nov 28, 2012, at 2:12 PM, Quincey Koziol wrote:

 Hi Michael,
 
 On Nov 27, 2012, at 6:33 PM, Michael Jackson mike.jack...@bluequartz.net 
 wrote:
 
 I have a project coming up where we are going to store multiple modalities 
 of data acquired from Scanning Electron Microscopes (EBSD, EDS and ISE**).
 
 For each modality the data is acquired in a grid fashion. For the ISE data, 
 the data is actually a gray-scale image, so those are easy to store and think 
 about: if the acquisition grid is 2048 x 2048, then you have a gray-scale image 
 of that same size (unsigned char). This is where things start getting 
 interesting. For the EBSD data there are several signals collected at *each* 
 pixel. Some of the signals are simple scalar values (floats), and we have 
 been storing those also in a 2D array, just like the ISE image. But one of 
 the signals is actually itself a 2D image (60x80 pixels). So, for example, the 
 EBSD sampling grid dimensions are 100 x 75, and at each grid point there is a 
 60x80 array of data.
 
 The EDS data is much the same, except we have a 2048-element 1D array at each 
 pixel, and the dimensions of the EDS sampling grid are 512 x 384.
 
 I am trying to figure out a balance between efficient storage and easy 
 access. One thought was to store each grid point as its own group but that 
 would be hundreds of thousands of groups and I don't think HDF5 is going to 
 react well to that. So the other end of that would be to continue to think 
 of each modality of data as an Image and store all the data under a group 
 such as "EDS" as a large multi-dimensional array. So for example in the EBSD 
 data acquisition from above I would have a 4D array (100x75x80x60). What 
 type of attributes should I store with the data set so that later when we are 
 reading through the data we can efficiently grab hyperslabs of the data 
 without having to read the entire data set into memory?
 
   Actually, groups with hundreds of thousands of links should be fine.
 
   However, I would lean toward keeping the image structure and either 
 using an array datatype (80x60, in the case you gave) or a compound datatype 
 for the pixels.  Another useful option is to create a group for each 
 image and then store a separate dataset for each field in the array.
 
   Quincey
 
 I hope all of that was clear enough to elicit some advice on storage. Thanks 
 for any help. Just for clarification the sizes of the data sets are for our 
 experimental data sets where we are just trying to figure this out. The 
 real data sets will likely be multi-gigabytes in size for each slice of 
 data where we may have 250 slices.
 
 
 ** EBSD - Electron Backscatter Diffraction
  EDS - Energy dispersive Spectra
  ISE - Ion Induced Secondary Electron Image
 
 Thanks for any help or advice.
 ___
 Mike Jackson    Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 


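The layout question in the thread above invites a quick size sanity check. Here is a back-of-envelope sketch in Python; the ISE image size and grid dimensions come from the thread, but the EBSD patch dtype and EDS channel dtype are assumptions for illustration only, since the thread does not state them:

```python
# Back-of-envelope sizes for the modalities described in the thread.
# The EBSD patch dtype (uint8) and EDS channel dtype (uint16) are
# ASSUMPTIONS; the grid and patch dimensions are from the thread.

ise_bytes = 2048 * 2048 * 1        # gray scale image, unsigned char
ebsd_elems = 100 * 75 * 60 * 80    # 100x75 grid, 60x80 patch per point
ebsd_bytes = ebsd_elems * 1        # assuming 1-byte pattern pixels
eds_elems = 512 * 384 * 2048       # 512x384 grid, 2048-channel spectrum
eds_bytes = eds_elems * 2          # assuming 2-byte counts

for name, n in [("ISE", ise_bytes), ("EBSD", ebsd_bytes), ("EDS", eds_bytes)]:
    print(f"{name}: {n / 2**20:.1f} MiB")
```

Even with conservative dtypes, the EDS cube dominates, which supports the later comment that real data sets will run to multiple gigabytes per slice.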


[Hdf-forum] Advice on Storing data with efficient access

2012-11-27 Thread Michael Jackson
I have a project coming up where we are going to store multiple modalities of 
data acquired from Scanning Electron Microscopes (EBSD, EDS and ISE**).

For each modality the data is acquired in a grid fashion. For the ISE data the 
data is actually a gray scale image so those are easy to store and think about. 
If the acquisition grid is 2048 x 2048 then you have a gray scale image of that same 
size (unsigned char). This is where things start getting interesting. For the 
EBSD data there are several signals collected at *each* pixel. Some of the 
signals are simple scalar values (floats) and we have been storing those also 
in a 2D array just like the ISE image. But one of the signals is actually 
itself a 2D image (60x80 pixels). So for example the EBSD sampling grid 
dimensions are 100 x 75 and at each grid point there is a 60x80 array of data.

The EDS data is much the same, except we have a 2048-element 1D array at each 
pixel, and the dimensions of the EDS sampling grid are 512 x 384.

I am trying to figure out a balance between efficient storage and easy access. 
One thought was to store each grid point as its own group but that would be 
hundreds of thousands of groups and I don't think HDF5 is going to react well 
to that. So the other end of that would be to continue to think of each 
modality of data as an Image and store all the data under a group such as 
"EDS" as a large multi-dimensional array. So for example in the EBSD data 
acquisition from above I would have a 4D array (100x75x80x60). What type of 
attributes should I store with the data set so that later when we are reading 
through the data we can efficiently grab hyperslabs of the data without having 
to read the entire data set into memory?
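One property worth noting about the single-4D-array option: with the patch stored as the fastest-varying dimensions (HDF5 lays data out in C order), each grid point's patch is one contiguous run of elements, so a hyperslab selecting one grid point maps to a single extent on disk. A minimal sketch of that index arithmetic; the dimension ordering here is an assumed layout, not something fixed by the thread:

```python
# Sketch (assumption: C-order/row-major layout, as HDF5 uses): with the EBSD
# patch dimensions as the fastest-varying axes of a 4D dataset of shape
# (100, 75, 60, 80), the 60x80 patch for grid point (i, j) is a single
# contiguous run of elements, so a per-point hyperslab read touches one extent.

SHAPE = (100, 75, 60, 80)  # (grid_y, grid_x, patch_y, patch_x)

def linear_offset(i, j, k=0, l=0, shape=SHAPE):
    """Linear element offset of (i, j, k, l) in a C-order array."""
    _, gx, py, px = shape
    return ((i * gx + j) * py + k) * px + l

def patch_extent(i, j, shape=SHAPE):
    """Start offset and element count of grid point (i, j)'s whole patch."""
    start = linear_offset(i, j, 0, 0, shape)
    return start, shape[2] * shape[3]

start, count = patch_extent(2, 5)
print(start, count)   # patch (2, 5) begins at element (2*75+5)*4800
```

Storing the grid dimensions and patch dimensions as attributes on the dataset (plus a chunk shape of, say, one patch per chunk) would then let a reader compute these selections without loading the whole array.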

I hope all of that was clear enough to elicit some advice on storage. Thanks 
for any help. Just for clarification the sizes of the data sets are for our 
experimental data sets where we are just trying to figure this out. The real 
data sets will likely be multi-gigabytes in size for each slice of data where 
we may have 250 slices.


** EBSD - Electron Backscatter Diffraction
   EDS - Energy dispersive Spectra
   ISE - Ion Induced Secondary Electron Image

Thanks for any help or advice.
___
Mike Jackson    Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-21 Thread Michael Jackson
So for me to compile HDF5 1.8.10 as DLLs AND for me to use those libraries in 
another project my own project somehow needs to define H5_BUILT_AS_DYNAMIC_LIB 
in order for the proper #defines to be enabled. This seems a bit 
chicken-and-egg to me. How do I know if HDF5 was built as a dynamic or static 
library? Relying on the filename is NOT a 100% reliable way of determining this.

Looking through the HDF5 headers we get this chain.

H5public.h includes H5pubconf.h and also H5api_adpt.h, which is the file that 
mainly needs H5_BUILT_AS_DYNAMIC_LIB to be defined properly. 

We can generate a new header file that gets included in H5public.h BEFORE the 
H5api_adpt.h file. 

We could generate a new file that is included at the top of H5api_adpt.h

The H5_BUILT_AS_DYNAMIC_LIB define is basically set in hdf5-config.cmake (line 
24), which gets installed when HDF5 is built with CMake. That would work for my 
Windows and OS X builds, but some of my Linux distributions include their own 
hdf5 installations that may NOT be built with CMake, at which point I guess I 
can fall back on file naming again.

Just my thoughts.
--
Mike Jackson www.bluequartz.net
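A hedged sketch of the header-scanning fallback described above: rather than trusting library file names, read the installed H5pubconf.h text and look for the build-type define. The header contents below are illustrative samples, not real HDF5 output; a CMake-based consumer would instead use the installed hdf5-config.cmake package file:

```python
import re

def built_as_dynamic(header_text: str) -> bool:
    """Return True if the header defines H5_BUILT_AS_DYNAMIC_LIB (non-zero)."""
    m = re.search(r"^\s*#\s*define\s+H5_BUILT_AS_DYNAMIC_LIB\s+(\S+)",
                  header_text, re.MULTILINE)
    return bool(m) and m.group(1) != "0"

# Sample header contents for illustration only.
shared_hdr = "/* config */\n#define H5_BUILT_AS_DYNAMIC_LIB 1\n"
static_hdr = "/* config */\n#define H5_BUILT_AS_STATIC_LIB 1\n"

print(built_as_dynamic(shared_hdr), built_as_dynamic(static_hdr))
```

This is essentially what a custom FindHDF5.cmake can do at configure time: grep the installed header for the symbol and branch on the result.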

On Nov 20, 2012, at 3:41 PM, Allen D Byrne wrote:

 Mike,
 
 Sorry, didn't mean to confuse - debug/release - I was thinking globally about 
 the issue; obviously these defines have nothing to do with debug/release other 
 than creating another set of folders. The find_package would happen after hdf5 
 is installed, using cmake to read the hdf5 cmake configuration.
 
 I was hoping to move the build-specific defines in H5pubconf.h somewhere else, 
 because these defines are the only difference between static/dynamic installs. 
 Libraries are named uniquely enough. 
 
 Of course thinking about it now, there are other optional components in 
 H5pubconf.h! (szip/zlib/parallel...)
 
 Allen
 
 On Tuesday, November 20, 2012 03:27:16 PM Michael Jackson wrote:
 Which FIND_PACKAGE sets that variable?
 
 Inside of the Top level CMakeLists.txt file I have this now.
 
  OPTION (BUILD_SHARED_LIBS "Build Shared Libraries" OFF)
 set(H5_BUILT_AS_DYNAMIC_LIB 0)
 set(H5_BUILT_AS_STATIC_LIB 1)
 SET (LIB_TYPE STATIC)
 SET (H5_ENABLE_SHARED_LIB NO)
 SET (H5_ENABLE_STATIC_LIB NO)
 IF (BUILD_SHARED_LIBS)
  SET (LIB_TYPE SHARED)
  ADD_DEFINITIONS (-DH5_BUILT_AS_DYNAMIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 1)
  set(H5_BUILT_AS_STATIC_LIB 0)
  SET (H5_ENABLE_SHARED_LIB YES)
 ELSE (BUILD_SHARED_LIBS)
  ADD_DEFINITIONS (-DH5_BUILT_AS_STATIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 0)
  set(H5_BUILT_AS_STATIC_LIB 1)
  SET (H5_ENABLE_STATIC_LIB YES)
  IF (NOT WIN32)
# should this be a user setting : Everyone uses it anyway ?
ADD_DEFINITIONS (-DPIC)
  ENDIF (NOT WIN32)
 ENDIF (BUILD_SHARED_LIBS)
 
 
 Then in config/cmake/H5pubconf.h.in I added the following:
 
 /* Defined if HDF5 was built with CMake AND build as a shared library */
 #cmakedefine H5_BUILT_AS_DYNAMIC_LIB @H5_BUILT_AS_DYNAMIC_LIB@
 
 /* Defined if HDF5 was built with CMake AND build as a static library */
 #cmakedefine H5_BUILT_AS_STATIC_LIB @H5_BUILT_AS_STATIC_LIB@
 
 Which now allows everything to work. I am still confused why the change was
 needed for Debug/Release? Ironically enough I'm the original person to put
 the H5_BUILT_AS_DYNAMIC_LIB flag in the header file because when on windows
 and using CMake in another project that includes HDF5 I needed to be able
 to better figure out how the libraries were built so I could make decisions
 on what needs to be copied into my deployment package.
 
 ___
 Mike Jackson    Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Nov 20, 2012, at 2:36 PM, Allen D Byrne wrote:
 The problem was that I attempted to allow multiple installs
 (debug/release,
 static/dynamic) of hdf5. Everything works great with cmake, because the
 value is set by the FIND_PACKAGE command. However, I failed to account
 for the non-cmake process. I wanted to eliminate the duplicate include
 folders that would result from multiple installs.
 
 What to do with that definition is the issue!
 
 Allen
 
 On Tuesday, November 20, 2012 02:04:21 PM Michael Jackson wrote:
 Hmm
 
 Looking at the H5pubconf.h for the 1.8.9 and 1.8.10 builds it would seem
 
 that there were some merge issues maybe? Otherwise why was that definition
 removed? Not that I run the tests much (I assume if HDF5 is released then all
 the tests are passing), but how would any of the tests run if nothing can link
 correctly? Odd. Let me take a look. Can we get this into the first patch
 release if I figure out what went wrong?
 ___
 Mike Jackson    Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Nov

Re: [Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-21 Thread Michael Jackson
Ok, I see what you are saying about Debug and Release, but this is where one 
would NOT want to include anything in the header files that determines whether 
we are in DEBUG or RELEASE mode. Let the compiler do that, since they all do 
anyway. I 
have yet to see a project that did NOT define either NDEBUG or DEBUG for 
their compiles.

  Unless you are talking about combining not Debug/Release but Static/Dynamic 
into the same directory. I would agree that the H5_BUILT_AS_DYNAMIC_LIB define 
would definitely mess that up.
--
Mike Jackson www.bluequartz.net

On Nov 21, 2012, at 9:35 AM, Allen D Byrne wrote:

 Mike,
   I did not mean to imply that the filenames determine how they were built, 
 the filenames can be made different to allow installation in the same folder. 
 This define was (until now) the only issue keeping the include files from 
 being 
 installed in the same include folder. I had thought I solved it for CMake 
 use and now see otherwise.
 
 Also, I now see and understand that there are a mixture of configuration and 
 machine defines being stored in the file and I will put the define back into 
 the 
 H5pubconf.h for the next release (there might be a patch released soon 
 anyways).
 
 I was hoping for an elegant solution to the install multiple configurations 
 issue, but there might not be one.
 
 Allen
 
 
 On Wednesday, November 21, 2012 09:17:20 AM Michael Jackson wrote:
 So for me to compile HDF5 1.8.10 as DLLs AND for me to use those libraries
 in another project my own project somehow needs to define
 H5_BUILT_AS_DYNAMIC_LIB in order for the proper #define 's to be enabled.
 This seems a bit chicken-and-egg to me. How do I know if HDF5 was built
 as a dynamic or static library? Relying on the filename is NOT a 100% way
 of determining this.
 
 Looking through the HDF5 headers we get this chain.
 
 H5public.h includes H5pubconf.h and also includes H5api_adpt.h which is the
 file that mainly needs the H5_BUILT_AS_DYNAMIC_LIB to be defined properly.
 
 We can generate a new header file that gets included in H5public.h BEFORE
 the H5api_adpt.h file.
 
 We could generate a new file that is included at the top of H5api_adpt.h
 
 The H5_BUILT_AS_DYNAMIC_LIB is basically defined in hdf5-config.cmake (line
 24) that gets installed when HDF5 is built with CMake which would work for
 my Windows and OS X builds but some of my linux distributions include their
 own hdf5 installations that may NOT be built with CMake at which point I
 guess I can fall back on file naming again.
 
 Just my thoughts.
 --
 Mike Jackson www.bluequartz.net
 
 On Nov 20, 2012, at 3:41 PM, Allen D Byrne wrote:
 Mike,
 
 Sorry, didn't mean to confuse - debug/release - I was thinking globally
 about the issue; obviously these defines have nothing to do with
 debug/release other than creating another set of folders. The
 find_package would happen after hdf5 is installed, using cmake to read
 the hdf5 cmake configuration.
 
 I was hoping to move the build-specific defines in H5pubconf.h somewhere
 else, because these defines are the only difference between static/dynamic
 installs. Libraries are named uniquely enough.
 
 Of course thinking about it now, there are other optional components in
 H5pubconf.h! (szip/zlib/parallel...)
 
 Allen
 
 On Tuesday, November 20, 2012 03:27:16 PM Michael Jackson wrote:
 Which FIND_PACKAGE sets that variable?
 
 Inside of the Top level CMakeLists.txt file I have this now.
 
 OPTION (BUILD_SHARED_LIBS "Build Shared Libraries" OFF)
 set(H5_BUILT_AS_DYNAMIC_LIB 0)
 set(H5_BUILT_AS_STATIC_LIB 1)
 SET (LIB_TYPE STATIC)
 SET (H5_ENABLE_SHARED_LIB NO)
 SET (H5_ENABLE_STATIC_LIB NO)
 IF (BUILD_SHARED_LIBS)
 
 SET (LIB_TYPE SHARED)
 ADD_DEFINITIONS (-DH5_BUILT_AS_DYNAMIC_LIB)
 set(H5_BUILT_AS_DYNAMIC_LIB 1)
 set(H5_BUILT_AS_STATIC_LIB 0)
 SET (H5_ENABLE_SHARED_LIB YES)
 
 ELSE (BUILD_SHARED_LIBS)
 
 ADD_DEFINITIONS (-DH5_BUILT_AS_STATIC_LIB)
 set(H5_BUILT_AS_DYNAMIC_LIB 0)
 set(H5_BUILT_AS_STATIC_LIB 1)
 SET (H5_ENABLE_STATIC_LIB YES)
 IF (NOT WIN32)
 
   # should this be a user setting : Everyone uses it anyway ?
   ADD_DEFINITIONS (-DPIC)
 
 ENDIF (NOT WIN32)
 
 ENDIF (BUILD_SHARED_LIBS)
 
 
 Then in config/cmake/H5pubconf.h.in I added the following:
 
 /* Defined if HDF5 was built with CMake AND build as a shared library */
 #cmakedefine H5_BUILT_AS_DYNAMIC_LIB @H5_BUILT_AS_DYNAMIC_LIB@
 
 /* Defined if HDF5 was built with CMake AND build as a static library */
 #cmakedefine H5_BUILT_AS_STATIC_LIB @H5_BUILT_AS_STATIC_LIB@
 
 Which now allows everything to work. I am still confused why the change
 was
 needed for Debug/Release? Ironically enough I'm the original person to
 put
 the H5_BUILT_AS_DYNAMIC_LIB flag in the header file because when on
 windows
 and using CMake in another project that includes HDF5 I needed to be able
 to better figure out how the libraries were built so I could make
 decisions
 on what needs to be copied into my deployment package

[Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-20 Thread Michael Jackson
Just updated to 1.8.10 by building HDF5 myself with VS2010 as dynamic
libraries. When compiling my own application against this new build I
am getting the following errors:

  libDREAM3DLib.lib(StatsData.obj) : error LNK2001: unresolved external symbol
H5T_NATIVE_UINT16_g [C:\Users\mjackson\Workspace\DREAM3D\x64\Tools\H5VoxelToVtk
.vcxproj]

When I switch back to my 1.8.9 build I do not get these errors. Is
there something that needs to get updated in projects that link
against hdf5 1.8.10 versus 1.8.9?

Thanks for any help
_
Mike Jackson  mike.jack...@bluequartz.net
BlueQuartz Software    www.bluequartz.net
Principal Software Engineer  Dayton, Ohio



Re: [Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-20 Thread Michael Jackson
Hmm
  Looking at the H5pubconf.h for the 1.8.9 and 1.8.10 builds it would seem that 
there were some merge issues maybe? Otherwise why was that definition removed? 
Not that I run the tests much (I assume if HDF5 is released then all the tests 
are passing), but how would any of the tests run if nothing can link correctly? 
Odd. Let me take a look. Can we get this into the first patch release if I 
figure out what went wrong?
___
Mike Jackson    Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Nov 20, 2012, at 1:56 PM, Allen D Byrne wrote:

 Mike,
 
 Are you using CMake to create the hdf5 library? There is an issue with a 
 definition that needs to be solved for the next release (the pre-built 
 binaries 
 have this define in the H5pubconf.h).
 Add 
#define H5_BUILT_AS_DYNAMIC_LIB 1
 in your H5pubconf.h file after building HDF5.
 
 Allen
 
 On Tuesday, November 20, 2012 10:57:01 AM Michael Jackson wrote:
 Just updated to 1.8.10 by building HDF5 myself with VS2010 as dynamic
 libraries. When compiling my own application against this new build I
 am getting the following errors:
 
  libDREAM3DLib.lib(StatsData.obj) : error LNK2001: unresolved external
 symbol H5T_NATIVE_UINT16_g
 [C:\Users\mjackson\Workspace\DREAM3D\x64\Tools\H5VoxelToVtk .vcxproj]
 
 When I switch back to my 1.8.9 build I do not get these errors. Is
 there something that needs to get updated in projects that link
 against hdf5 1.8.10 versus 1.8.9?
 
 Thanks for any help
 _
 Mike Jackson  mike.jack...@bluequartz.net
 BlueQuartz Software    www.bluequartz.net
 Principal Software Engineer  Dayton, Ohio
 


Re: [Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-20 Thread Michael Jackson
Which FIND_PACKAGE sets that variable? 

Inside of the Top level CMakeLists.txt file I have this now.

OPTION (BUILD_SHARED_LIBS "Build Shared Libraries" OFF)
set(H5_BUILT_AS_DYNAMIC_LIB 0)
set(H5_BUILT_AS_STATIC_LIB 1)
SET (LIB_TYPE STATIC)
SET (H5_ENABLE_SHARED_LIB NO)
SET (H5_ENABLE_STATIC_LIB NO)
IF (BUILD_SHARED_LIBS)
  SET (LIB_TYPE SHARED)
  ADD_DEFINITIONS (-DH5_BUILT_AS_DYNAMIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 1)
  set(H5_BUILT_AS_STATIC_LIB 0)
  SET (H5_ENABLE_SHARED_LIB YES)
ELSE (BUILD_SHARED_LIBS)
  ADD_DEFINITIONS (-DH5_BUILT_AS_STATIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 0)
  set(H5_BUILT_AS_STATIC_LIB 1)
  SET (H5_ENABLE_STATIC_LIB YES)
  IF (NOT WIN32)
# should this be a user setting : Everyone uses it anyway ?
ADD_DEFINITIONS (-DPIC)
  ENDIF (NOT WIN32)
ENDIF (BUILD_SHARED_LIBS)


Then in config/cmake/H5pubconf.h.in I added the following:

/* Defined if HDF5 was built with CMake AND built as a shared library */
#cmakedefine H5_BUILT_AS_DYNAMIC_LIB @H5_BUILT_AS_DYNAMIC_LIB@

/* Defined if HDF5 was built with CMake AND built as a static library */
#cmakedefine H5_BUILT_AS_STATIC_LIB @H5_BUILT_AS_STATIC_LIB@
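For context on the #cmakedefine lines above: CMake's configure_file() expands `#cmakedefine VAR @VAR@` to `#define VAR <value>` when VAR is set to a true value, and to `/* #undef VAR */` otherwise, so only one of the two defines survives in the generated H5pubconf.h. A toy emulation of that substitution (this is an illustration, not CMake itself):

```python
def cmakedefine(template_line: str, variables: dict) -> str:
    """Toy version of configure_file()'s #cmakedefine handling."""
    parts = template_line.split()
    assert parts[0] == "#cmakedefine"
    name = parts[1]
    value = variables.get(name)
    if not value or value == "0":   # CMake treats 0/empty/unset as false
        return f"/* #undef {name} */"
    line = template_line.replace("#cmakedefine", "#define", 1)
    return line.replace(f"@{name}@", str(value))

tmpl = "#cmakedefine H5_BUILT_AS_DYNAMIC_LIB @H5_BUILT_AS_DYNAMIC_LIB@"
print(cmakedefine(tmpl, {"H5_BUILT_AS_DYNAMIC_LIB": "1"}))
print(cmakedefine(tmpl, {"H5_BUILT_AS_DYNAMIC_LIB": "0"}))
```

So a static build setting H5_BUILT_AS_DYNAMIC_LIB to 0 yields a commented-out #undef, and downstream code sees only the define that matches how the library was actually built.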

Which now allows everything to work. I am still confused about why the change 
was needed for Debug/Release. Ironically enough, I'm the original person who put 
the H5_BUILT_AS_DYNAMIC_LIB flag in the header file: on Windows, when using 
CMake in another project that includes HDF5, I needed to be able to figure out 
how the libraries were built so I could make decisions about what needs to be 
copied into my deployment package.

___
Mike Jackson    Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Nov 20, 2012, at 2:36 PM, Allen D Byrne wrote:

 The problem was that I attempted to allow multiple installs (debug/release, 
 static/dynamic) of hdf5. Everything works great with cmake, because the value 
 is set by the FIND_PACKAGE command. However, I failed to account for the 
 non-cmake process. I wanted to eliminate the duplicate include folders that would 
 result from multiple installs.
 
 What to do with that definition is the issue!
 
 Allen
 
 On Tuesday, November 20, 2012 02:04:21 PM Michael Jackson wrote:
 Hmm
  Looking at the H5pubconf.h for the 1.8.9 and 1.8.10 builds it would seem
 that there were some merge issues maybe? Otherwise why was that definition
 removed? Not that I run the tests much (I assume if HDF5 is released then all
 the tests are passing), but how would any of the tests run if nothing can link
 correctly? Odd. Let me take a look. Can we get this into the first
 patch release if I figure out what went wrong?
 ___
 Mike Jackson    Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Nov 20, 2012, at 1:56 PM, Allen D Byrne wrote:
 Mike,
 
 Are you using CMake to create the hdf5 library? There is an issue with a
 definition that needs to be solved for the next release (the pre-built
 binaries have this define in the H5pubconf.h).
 Add
 
   #define H5_BUILT_AS_DYNAMIC_LIB 1
 
 in your H5pubconf.h file after building HDF5.
 
 Allen
 
 On Tuesday, November 20, 2012 10:57:01 AM Michael Jackson wrote:
 Just updated to 1.8.10 by building HDF5 myself with VS2010 as dynamic
 libraries. When compiling my own application against this new build I
 
 am getting the following errors:
 libDREAM3DLib.lib(StatsData.obj) : error LNK2001: unresolved external
 
 symbol H5T_NATIVE_UINT16_g
 [C:\Users\mjackson\Workspace\DREAM3D\x64\Tools\H5VoxelToVtk .vcxproj]
 
 When I switch back to my 1.8.9 build I do not get these errors. Is
 there something that needs to get updated in projects that link
 against hdf5 1.8.10 versus 1.8.9?
 
 Thanks for any help
 _
 Mike Jackson  mike.jack...@bluequartz.net
 BlueQuartz Software    www.bluequartz.net
 Principal Software Engineer  Dayton, Ohio
 


Re: [Hdf-forum] Link Errors with 1.8.10 but not in 1.8.9

2012-11-20 Thread Michael Jackson
Relying on someone else NOT to rename the libraries because they monkey around 
with the CMake files probably isn't a good option either. If you ask how 
libraries are named on Windows you will get at least 2 different answers (most 
likely more). With MinGW the shared libraries are .dll with .dll.a import 
libraries; with MSVC they are .dll with .lib import libraries. But where are the 
runtime libraries installed versus the link-time libraries? Some install them 
into "lib", some install them into "bin", some install in both places. I got 
tired of trying to out-think the other person and just added those symbols to 
the header. I have my own FindHDF5.cmake that looks specifically for those 
symbols during my project's configure process and can then make smarter 
decisions based on what it finds.

  Still confused about the "... creating other folders" remark. What other 
folders are getting created?

I don't have a problem with moving the build-specific defines somewhere out of 
H5pubconf.h. Just let me know where you want them. I guess I'll add the 
necessary version-checking code to my own project: for HDF5 1.8.10 or earlier, 
look in one header; otherwise, look in another place. Having that define in 
there solves a lot of deployment problems for Windows and, to some extent, OS X.

Thanks
___
Mike Jackson    Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net
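To make the naming ambiguity above concrete, here is a small sketch that classifies library files by extension alone; the file names are hypothetical. The point is the `.lib` case: on Windows a `.lib` can be either a static library or an MSVC import library for a DLL, so the extension cannot answer the static-vs-dynamic question by itself:

```python
import os

def classify(filename: str) -> str:
    """Best-effort guess at library kind from the extension alone."""
    ext = os.path.splitext(filename)[1].lower()
    return {
        ".so": "shared (ELF)",
        ".dylib": "shared (Mach-O)",
        ".dll": "shared (Windows runtime)",
        ".a": "static (or MinGW import library)",
        ".lib": "ambiguous: static OR MSVC import library",
    }.get(ext, "unknown")

# Hypothetical file names for illustration.
for f in ["libhdf5.so", "libhdf5.a", "hdf5.lib", "hdf5.dll"]:
    print(f, "->", classify(f))
```

This ambiguity is why embedding H5_BUILT_AS_DYNAMIC_LIB / H5_BUILT_AS_STATIC_LIB in a header, as described in the message above, is a more reliable signal than file names.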

On Nov 20, 2012, at 3:41 PM, Allen D Byrne wrote:

 Mike,
 
 Sorry, didn't mean to confuse - debug/release - I was thinking globally about 
 the issue; obviously these defines have nothing to do with debug/release other 
 than creating another set of folders. The find_package would happen after hdf5 
 is installed, using cmake to read the hdf5 cmake configuration.
 
 I was hoping to move the build-specific defines in H5pubconf.h somewhere else, 
 because these defines are the only difference between static/dynamic installs. 
 Libraries are named uniquely enough. 
 
 Of course thinking about it now, there are other optional components in 
 H5pubconf.h! (szip/zlib/parallel...)
 
 Allen
 
 On Tuesday, November 20, 2012 03:27:16 PM Michael Jackson wrote:
 Which FIND_PACKAGE sets that variable?
 
 Inside of the Top level CMakeLists.txt file I have this now.
 
 OPTION (BUILD_SHARED_LIBS "Build Shared Libraries" OFF)
 set(H5_BUILT_AS_DYNAMIC_LIB 0)
 set(H5_BUILT_AS_STATIC_LIB 1)
 SET (LIB_TYPE STATIC)
 SET (H5_ENABLE_SHARED_LIB NO)
 SET (H5_ENABLE_STATIC_LIB NO)
 IF (BUILD_SHARED_LIBS)
  SET (LIB_TYPE SHARED)
  ADD_DEFINITIONS (-DH5_BUILT_AS_DYNAMIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 1)
  set(H5_BUILT_AS_STATIC_LIB 0)
  SET (H5_ENABLE_SHARED_LIB YES)
 ELSE (BUILD_SHARED_LIBS)
  ADD_DEFINITIONS (-DH5_BUILT_AS_STATIC_LIB)
  set(H5_BUILT_AS_DYNAMIC_LIB 0)
  set(H5_BUILT_AS_STATIC_LIB 1)
  SET (H5_ENABLE_STATIC_LIB YES)
  IF (NOT WIN32)
# should this be a user setting : Everyone uses it anyway ?
ADD_DEFINITIONS (-DPIC)
  ENDIF (NOT WIN32)
 ENDIF (BUILD_SHARED_LIBS)
 
 
 Then in config/cmake/H5pubconf.h.in I added the following:
 
 /* Defined if HDF5 was built with CMake AND build as a shared library */
 #cmakedefine H5_BUILT_AS_DYNAMIC_LIB @H5_BUILT_AS_DYNAMIC_LIB@
 
 /* Defined if HDF5 was built with CMake AND build as a static library */
 #cmakedefine H5_BUILT_AS_STATIC_LIB @H5_BUILT_AS_STATIC_LIB@
 
 Which now allows everything to work. I am still confused why the change was
 needed for Debug/Release? Ironically enough I'm the original person to put
 the H5_BUILT_AS_DYNAMIC_LIB flag in the header file because when on windows
 and using CMake in another project that includes HDF5 I needed to be able
 to better figure out how the libraries were built so I could make decisions
 on what needs to be copied into my deployment package.
 
 ___
 Mike Jackson    Principal Software Engineer
 BlueQuartz Software    Dayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Nov 20, 2012, at 2:36 PM, Allen D Byrne wrote:
 The problem was that I attempted to allow multiple installs
 (debug/release,
 static/dynamic) of hdf5. Everything works great with cmake, because the
 value is set by the FIND_PACKAGE command. However, I failed to account
 for the non-cmake process. I wanted to eliminate the duplicate include
 folders that would result from multiple installs.
 
 What to do with that definition is the issue!
 
 Allen
 
 On Tuesday, November 20, 2012 02:04:21 PM Michael Jackson wrote:
 Hmm
 
 Looking at the H5pubconf.h for the 1.8.9 and 1.8.10 builds it would seem
 
 that there were some merge issues maybe? Otherwise why was that definition
 removed? Not that I run the tests much (I assume if HDF5 is released then all
 the tests are passing), but how would any of the tests run if nothing can link
 correctly? Odd. Let me take a look. Can we

Re: [Hdf-forum] HDF Newsletter #129

2012-11-13 Thread Michael Jackson
Hello,
  First, congratulations on the new release. I would like some clarification 
on the "Supported Platforms" changes, specifically "VS 2008 and Windows XP". I 
can read that in 2 different ways:

1: The *combination* of VS2008 running on Windows XP is deprecated
2: VS2008 on *any* version of Windows AND Windows XP

Thanks
___
Mike Jackson    Principal Software Engineer
BlueQuartz Software    Dayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Nov 13, 2012, at 3:18 PM, Barbara Jones wrote:

 
 To view from a browser, see:
  http://www.hdfgroup.org/newsletters/newsletter129.html
 
 --
 The HDF Group Home Page: http://hdfgroup.org/
 Helpdesk  Mailing Lists Info:   http://hdfgroup.org/services/support.html
 --
 
Newsletter #129
   November 13, 2012
 
 CONTENTS:
 
 - Release of HDF5-1.8.10
   . New Features and Changes
   . Future Changes to Supported Platforms
 
 - Release of HDF4 File Content Map Writer 1.0.4
 
 - Retirement of HDF-EOS5 to NetCDF4 Converter (eos5nc4)
 
 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
 
 Release of HDF5-1.8.10
 ==
 
 The HDF5-1.8.10 release is now available from The HDF Group Downloads page:
 
http://www.hdfgroup.org/downloads/
 
 It can also be obtained directly from:
 
http://www.hdfgroup.org/HDF5/release/obtain5.html
 
 The HDF5-1.8.10 documentation is located in:
 
http://www.hdfgroup.org/HDF5/doc/
 
 For information on HDF5, see the HDF5 home page:
 
http://www.hdfgroup.org/products/hdf5/index.html
 
 
 New Features and Changes
 
 
 HDF5-1.8.10 is a minor release, but contains several important new
 features and bug fixes. Changes in this release include the following:
 
 o Two new query APIs were added to the Parallel library:
 
H5Pget_mpio_no_collective_cause - retrieves reasons why the
 collective I/O was broken during read/write I/O access.
 
H5Pget_mpio_actual_io_mode_f - Fortran counterpart to
 H5Pget_mpio_actual_io_mode (already in library for retrieving
 the parallel I/O type performed).
 
 o The h5import utility now allows the use of h5dump output as input
   into h5import.
 
 o The conversion tests no longer fail in dt_arith.c on Mac OS X 10.7
   or 10.8.
 
 o The windows pre-built binaries are now provided with debug information.
   This enables users to debug their HDF5 applications using the pre-built
   binaries.
 
 This release contains many other changes and bug fixes not listed here,
 including fixes to various tools. Please be sure to read the Release
 Notes for a comprehensive list of new features and bug fixes:
 
   http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.10-RELEASE.txt
 
 
 
 Future Changes to Supported Platforms
 -
 
 We will be dropping support for the following platforms after this
 release:
 
 o VS 2008 and Windows XP
 o Mac OS X Snow Leopard 10.6.8 32-bit
 
 Also, we will no longer provide the VS project files for Windows after
 this release, and will be moving to CMake 2.8.10.
 
 
 
 Release of HDF4 File Content Map Writer 1.0.4
 =
 
 We are pleased to announce the release of HDF4 File Content Map Writer
 version 1.0.4.
 
 This release provides a fix for handling some NASA OCTS Level 3
 products (e.g., O19963061996335.L3m_MO_CHLO) that involve palettes.
 
 Please see the h4mapwriter web page for complete details regarding this
 release:
 
  http://www.hdfgroup.org/projects/h4map/h4map_writer.html
 
 
 
 
 Retirement of HDF-EOS5 to NetCDF4 Converter (eos5nc4)
 =
 
 The HDF Group plans to retire the HDF-EOS5 to NetCDF-4 Converter (eos5nc4)
 tool effective Dec 31, 2012.
 
 If you are using the eos5nc4 tool and would like for it to continue to be
 maintained, please either send a message to eosh...@hdfgroup.org or post
 a message to the HDF-EOS Forum by no later than December 19, 2012.
 
 Please see the HDF-EOS Forum announcement for more information:
 
  http://www.hdfeos.org/forums/showthread.php?p=1254#post1254
 
 
 
 -
 For questions regarding these or other HDF issues, contact:
 
   h...@hdfgroup.org
 


[Hdf-forum] Problem writing 3GB to a data set.

2012-10-29 Thread Michael Jackson
I am trying to write out a single array of size 3,101,590,800 bytes and I am 
getting the following HDF error.

 Writing Cell Data 'EulerAngles' to HDF5 File
HDF5-DIAG: Error detected in HDF5 (1.8.9) thread 0:
  #000: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dio.c line 266 in 
H5Dwrite(): can't write data
major: Dataset
minor: Write failed
  #001: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dio.c line 673 in 
H5D_write(): can't write data
major: Dataset
minor: Write failed
  #002: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 597 in 
H5D_contig_write(): contiguous write failed
major: Dataset
minor: Write failed
  #003: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dselect.c line 306 in 
H5D_select_write(): write error
major: Dataspace
minor: Write failed
  #004: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dselect.c line 217 in 
H5D_select_io(): write error
major: Dataspace
minor: Write failed
  #005: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 1210 in 
H5D_contig_writevv(): can't perform vectorized sieve buffer write
major: Dataset
minor: Can't operate on object
  #006: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5V.c line 1457 in H5V_opvv(): 
can't perform operation
major: Internal error (too specific to document in detail)
minor: Can't operate on object
  #007: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 962 in 
H5D_contig_writevv_sieve_cb(): block write failed
major: Dataset
minor: Write failed
  #008: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Fio.c line 158 in 
H5F_block_write(): write through metadata accumulator failed
major: Low-level I/O
minor: Write failed
  #009: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Faccum.c line 808 in 
H5F_accum_write(): file write failed
major: Low-level I/O
minor: Write failed
  #010: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5FDint.c line 185 in 
H5FD_write(): driver write request failed
major: Virtual File Layer
minor: Write failed
  #011: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5FDsec2.c line 852 in 
H5FD_sec2_write(): file write failed: time = Mon Oct 29 15:34:22 2012
, filename = '/tmp/out.dream3d', file descriptor = 14, errno = 22, error 
message = 'Invalid argument', buf = 0x26dc5a000, size = 3101590800, offset = 
6624
major: Low-level I/O
minor: Write failed

Here are some details about the data array
Error Writing Data 'EulerAngles'
rank=2
dim[0]=258465900
dim[1]=3
TotalElements=775397700
Size of Type (Bytes):4
TotalBytes to Write: 3101590800

This is with HDF version 1.8.9 compiled for 64-bit on OS X 10.6.8. Is there 
something special I need to be doing in order to write more than 2^32 bytes of 
data? Or maybe (most likely) there is a bug in how I am calling the HDF5 
functions to write this data?

Thanks.
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] Problem writing 3GB to a data set.

2012-10-29 Thread Michael Jackson
On Oct 29, 2012, at 3:42 PM, Michael Jackson wrote:

 I am trying to write out a single array of size 3,101,590,800 bytes and I am 
 getting the following HDF error.
 
 Writing Cell Data 'EulerAngles' to HDF5 File
 HDF5-DIAG: Error detected in HDF5 (1.8.9) thread 0:
  #000: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dio.c line 266 in 
 H5Dwrite(): can't write data
major: Dataset
minor: Write failed
  #001: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dio.c line 673 in 
 H5D_write(): can't write data
major: Dataset
minor: Write failed
  #002: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 597 in 
 H5D_contig_write(): contiguous write failed
major: Dataset
minor: Write failed
  #003: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dselect.c line 306 in 
 H5D_select_write(): write error
major: Dataspace
minor: Write failed
  #004: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dselect.c line 217 in 
 H5D_select_io(): write error
major: Dataspace
minor: Write failed
  #005: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 1210 in 
 H5D_contig_writevv(): can't perform vectorized sieve buffer write
major: Dataset
minor: Can't operate on object
  #006: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5V.c line 1457 in 
 H5V_opvv(): can't perform operation
major: Internal error (too specific to document in detail)
minor: Can't operate on object
  #007: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Dcontig.c line 962 in 
 H5D_contig_writevv_sieve_cb(): block write failed
major: Dataset
minor: Write failed
  #008: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Fio.c line 158 in 
 H5F_block_write(): write through metadata accumulator failed
major: Low-level I/O
minor: Write failed
  #009: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5Faccum.c line 808 in 
 H5F_accum_write(): file write failed
major: Low-level I/O
minor: Write failed
  #010: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5FDint.c line 185 in 
 H5FD_write(): driver write request failed
major: Virtual File Layer
minor: Write failed
  #011: /Users/mjackson/Workspace/hdf5-1.8.9/src/H5FDsec2.c line 852 in 
 H5FD_sec2_write(): file write failed: time = Mon Oct 29 15:34:22 2012
 , filename = '/tmp/out.dream3d', file descriptor = 14, errno = 22, error 
 message = 'Invalid argument', buf = 0x26dc5a000, size = 3101590800, offset = 
 6624
major: Low-level I/O
minor: Write failed
 
 Here are some details about the data array
 Error Writing Data 'EulerAngles'
rank=2
dim[0]=258465900
dim[1]=3
TotalElements=775397700
Size of Type (Bytes):4
TotalBytes to Write: 3101590800
 
 This is with HDF version 1.8.9 compiled for 64-bit on OS X 10.6.8. Is there 
 something special I need to be doing in order to write more than 2^32 bytes 
 of data? Or maybe (most likely) there is a bug in how I am calling the HDF5 
 functions to write this data?
 
 Thanks.

I have been trying to track this down and have verified that if I try to write 
data that is 2^31 + 1 bytes then I get the error. If I try to write 2^31 then I 
am fine and the write succeeds. I am going to try this same code on my Visual 
Studio 2008 build and see what happens.

---
Mike Jackson





Re: [Hdf-forum] hdf5 1.8.9 on mingw made easy.

2012-10-27 Thread Michael Jackson
I did the stupid thing and wrote first before researching. My apologies. What 
follows is my opinion: the whole MinGW landscape seems to be a mess when it 
comes to 64-bit support (64-bit targets). There is not one main download but a 
few different ones from individuals, each with its own set of problems. After 
searching through numerous forums and reading as much of the history of MinGW64 
as possible, it would seem that, for the immediate future, actually supporting 
HDF5 on MinGW is not in the cards for anyone. I'm going to advise my users to 
figure out a way to use Visual Studio for their Windows builds.

Sorry for the noise. 
--
Mike Jackson www.bluequartz.net

On Oct 26, 2012, at 10:52 PM, Elena Pourmal wrote:

 Mike,
 
 I am sorry to disappoint you, but we don't have any resources to support 
 MinGW. That said, we will be more than happy to work with members of the 
 HDF5 community who can help us maintain HDF5 on MinGW: provide patches, 
 test the releases, answer questions, etc.  
 
 All,
 
 Please contact me if you are interested in HDF5 on MinGW and can help with 
 maintaining HDF5 on the platform. 
 
 Thank you!
 
 Elena
 ~~~
 Elena Pourmal
 Director of Technical Services and Operations
 The HDF Group
 1800 So. Oak St., Suite 203,
 Champaign, IL 61820
 www.hdfgroup.org
 (217)531-6112 (office)
 ~~~
 
 
 
 On Oct 26, 2012, at 2:31 PM, Michael Jackson wrote:
 
 Are all of these issues resolved for the next HDF5 release? Our project uses 
 HDF5 and our developers have asked us to support MinGW on Windows. HDF5 
 building under MinGW was holding us back in the past so I just thought I 
 would check to see if the issues have been solved?
 
 Thanks
 ___
 Mike JacksonPrincipal Software Engineer
 BlueQuartz SoftwareDayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Jul 19, 2012, at 9:28 AM, Allen D Byrne wrote:
 
 I should specify that the problem with generating the h5pubconf.h file is 
 with the autotools configure process. CMake works just fine.
 
 Allen
 
 On Thursday, July 19, 2012 08:21:07 AM Allen D Byrne wrote:
 The next release of HDF5 is this fall with the code freeze at the end of 
 September. 
 
 The current code base in svn (including the cmake branch) already works on 
 Windows 7 and mingw using cmake, with the exception of fortran. 
 I will work with the autotools experts to review these patches with the 
 changes I already made (added check for the wsock lib). 
 
 My version of mingw on Windows 7 will not generate the h5pubconf.h file 
 properly (verbatim copy of the config.in file), 
 and that has prevented me from finishing the support for mingw.
 
 Allen
 
 
 On Wednesday, July 18, 2012 10:51:52 PM Michael Jackson wrote:
 If we added these features to the CMake scripts would that help also? I 
 am 
 more of a CMake Guy than an autotools guy but I would think adding these 
 flags to the CMake files for MinGW would allow one to configure HDF5 with 
 CMake 
 on MinGW (or MSYS).
 
 Thoughts? When is the next release of HDF5 scheduled for? Not sure I am 
 going to have time to patch before mid August.
 ___
 Mike JacksonPrincipal Software Engineer
 BlueQuartz SoftwareDayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Jul 18, 2012, at 6:05 PM, edscott wilson garcia wrote:
 
 Hi forum,
 
  In order to get hdf5 1.8.9 to compile cleanly with no fuss on mingw, 
 just patch two files, configure.in and src/Makefile.am. Then regenerate 
 scripts with 
 aclocal && autoheader && libtoolize && automake && autoconf.
 Finally run configure with the option --with-mingw, compile, and enjoy.
 
 The patches are very simple:
 
 
  Patch for configure.in
 ***
 --- /tmp/hdf5-1.8.9/configure.in.old2012-05-09 10:07:27.0 
 -0500
 +++ /tmp/hdf5-1.8.9/configure.in2012-07-18 16:51:35.0 -0500
 @@ -49,6 +49,16 @@
 dnl rebuild rules.
 AM_MAINTAINER_MODE
 
 +AC_ARG_WITH(mingw, [--with-mingw  enable compilation with gnu gcc under 
 mingw])
 +AM_CONDITIONAL([WITH_MINGW], [test "$with_mingw" = yes])
 +if test "$with_mingw" = yes; then
 +  AC_DEFINE([HAVE_WINDOWS],[1],[Define if the Windows virtual file driver 
 should be compiled])
 +  AC_DEFINE([HAVE_MINGW],[1],[Define if on mingw])
 +  AC_DEFINE([HAVE_WIN32_API],[1],[Define if on the Windows platform])
 +dnl We also need to add a -lwsock32 to avert _WSAStartup@8 errors
 +  LIBS="$LIBS -lwsock32"
 +fi
 +
 dnl Run post processing on files created by configure.
 dnl src/H5pubconf.h:
 dnl Generate src/H5pubconf.h from src/H5config.h by prepending H5_ to all
 
 
  Patch for src

Re: [Hdf-forum] hdf5 1.8.9 on mingw made easy.

2012-10-26 Thread Michael Jackson
Are all of these issues resolved for the next HDF5 release? Our project uses 
HDF5 and our developers have asked us to support MinGW on Windows. HDF5 
building under MinGW was holding us back in the past so I just thought I would 
check to see if the issues have been solved?

Thanks
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Jul 19, 2012, at 9:28 AM, Allen D Byrne wrote:

 I should specify that the problem with generating the h5pubconf.h file is 
 with the autotools configure process. CMake works just fine.
 
 Allen
 
 On Thursday, July 19, 2012 08:21:07 AM Allen D Byrne wrote:
 The next release of HDF5 is this fall with the code freeze at the end of 
 September. 
 
 The current code base in svn (including the cmake branch) already works on 
 Windows 7 and mingw using cmake, with the exception of fortran. 
 I will work with the autotools experts to review these patches with the 
 changes I already made (added check for the wsock lib). 
 
 My version of mingw on Windows 7 will not generate the h5pubconf.h file 
 properly (verbatim copy of the config.in file), 
 and that has prevented me from finishing the support for mingw.
 
 Allen
 
 
 On Wednesday, July 18, 2012 10:51:52 PM Michael Jackson wrote:
 If we added these features to the CMake scripts would that help also? I am 
 more of a CMake Guy than an autotools guy but I would think adding these 
 flags to the CMake files for MinGW would allow one to configure HDF5 with 
 CMake 
 on MinGW (or MSYS).
 
 Thoughts? When is the next release of HDF5 scheduled for? Not sure I am 
 going to have time to patch before mid August.
 ___
 Mike JacksonPrincipal Software Engineer
 BlueQuartz SoftwareDayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Jul 18, 2012, at 6:05 PM, edscott wilson garcia wrote:
 
 Hi forum,
 
   In order to get hdf5 1.8.9 to compile cleanly with no fuss on mingw, 
 just patch two files, configure.in and src/Makefile.am. Then regenerate scripts 
 with 
 aclocal && autoheader && libtoolize && automake && autoconf.
 Finally run configure with the option --with-mingw, compile, and enjoy.
 
 The patches are very simple:
 
 
  Patch for configure.in
 ***
 --- /tmp/hdf5-1.8.9/configure.in.old2012-05-09 10:07:27.0 -0500
 +++ /tmp/hdf5-1.8.9/configure.in2012-07-18 16:51:35.0 -0500
 @@ -49,6 +49,16 @@
 dnl rebuild rules.
 AM_MAINTAINER_MODE
 
 +AC_ARG_WITH(mingw, [--with-mingw  enable compilation with gnu gcc under 
 mingw])
 +AM_CONDITIONAL([WITH_MINGW], [test "$with_mingw" = yes])
 +if test "$with_mingw" = yes; then
 +  AC_DEFINE([HAVE_WINDOWS],[1],[Define if the Windows virtual file driver 
 should be compiled])
 +  AC_DEFINE([HAVE_MINGW],[1],[Define if on mingw])
 +  AC_DEFINE([HAVE_WIN32_API],[1],[Define if on the Windows platform])
 +dnl We also need to add a -lwsock32 to avert _WSAStartup@8 errors
 +  LIBS="$LIBS -lwsock32"
 +fi
 +
 dnl Run post processing on files created by configure.
 dnl src/H5pubconf.h:
 dnl Generate src/H5pubconf.h from src/H5config.h by prepending H5_ to all
 
 
  Patch for src/Makefile.am
 ***
 --- /tmp/hdf5-1.8.9/src/Makefile.am.old2012-05-09 10:05:58.0 
 -0500
 +++ /tmp/hdf5-1.8.9/src/Makefile.am2012-07-18 16:54:27.0 -0500
 @@ -103,6 +103,9 @@
 H5Zdeflate.c H5Zfletcher32.c H5Znbit.c H5Zshuffle.c H5Zszip.c  \
 H5Zscaleoffset.c H5Ztrans.c
 
 +if WITH_MINGW
 + libhdf5_la_SOURCES+=H5FDwindows.c
 +endif
 
 # Public headers
 include_HEADERS = hdf5.h H5api_adpt.h H5overflow.h H5pubconf.h H5public.h 
 H5version.h \
 @@ -115,6 +118,10 @@
 H5MMpublic.h H5Opublic.h H5Ppublic.h H5Rpublic.h H5Spublic.h \
 H5Tpublic.h H5Zpublic.h
 
 +if WITH_MINGW
 + libhdf5_la_SOURCES+=H5FDwindows.h
 +endif
 +
 # install libhdf5.settings in lib directory
 settingsdir=$(libdir)
 settings_DATA=libhdf5.settings

Re: [Hdf-forum] build only release libraries with visual studio

2012-09-19 Thread Michael Jackson
When you use CMake with a Visual Studio generator, CMake will basically ignore 
the CMAKE_CONFIGURATION_TYPES variable. Just generate the Visual Studio 
solution with CMake as normal. Then, from the Visual Studio command prompt, add 
the following to the msbuild command:

msbuild /p:Configuration=Release [other arguments] [HDFProject.sln]

That is what I use.
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Sep 18, 2012, at 5:53 PM, Kraus Philipp wrote:

 Hello,
 
 I try to build the HDF libs with Visual Studio, but I would like to create 
 only the release libs. I have set CMAKE_CONFIGURATION_TYPES to Release 
 and then call msbuild ALL_BUILD.vcxproj. When I call the msbuild command,
 I get the error that the Release directory exists, and so no libs are built. 
 I also tested the /property:Configuration=Release flag after the msbuild 
 command; the error is always the same.
 If I run the msbuild command without any modification of the build types, 
 everything is fine, but the libs always contain debug symbols. 
 
 How can I create only the release libraries, and how can I build the debug 
 library only? 
 
 Thanks
 
 Phil
 
 
 




Re: [Hdf-forum] build only release libraries with visual studio

2012-09-19 Thread Michael Jackson
msbuild /p:Configuration=Release HDF5.sln

You need to use the SOLUTION file and NOT the vcxproj file.
--
Mike Jackson www.bluequartz.net
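For reference, the whole release-only sequence from an empty build directory can be sketched as a dry run (the generator name and options are assumptions based on this thread; `run=echo` prints each command instead of executing it, so drop the wrapper to build for real):

```shell
# Dry run: each step is echoed, not executed.
run=echo
$run cmake -G "Visual Studio 10" -DBUILD_SHARED_LIBS=ON ..
$run msbuild /p:Configuration=Release HDF5.sln         # build the whole solution
$run msbuild /p:Configuration=Release INSTALL.vcxproj  # then install
```

Passing the solution file makes MSBuild resolve the inter-project build order itself, which the lone vcxproj files cannot do.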

On Sep 19, 2012, at 11:27 AM, Kraus Philipp wrote:

 
 Am 19.09.2012 um 16:08 schrieb Michael Jackson:
 
 When you use CMake with a Visual Studio Generator CMake will basically 
 ignore the CMAKE_CONFIGURATION_TYPES variable. Just generate the Visual 
 Studio Solution as normal cmake. Then from the Visual Studio Command prompt 
 add the following to the msbuild command:
 
 msbuild /p:Configuration=Release [other arguments] [HDFProject.sln]
 
 That is what i use.
 
 This does not work. I ran these commands under VS 2010:
 cmake -DHDF5_USE_FOLDERS=OFF -DHDF5_BUILD_CPP_LIB=ON -DHDF5_BUILD_HL_LIB=ON 
 -DBUILD_SHARED_LIBS=ON -DCMAKE_INSTALL_PREFIX=
 msbuild /p:Configuration=Release ALL_BUILD.vcxproj
 msbuild INSTALL.vcxproj
 
 The first msbuild produces the error that Release cannot be created because it 
 already exists (a directory with the same name exists). If I run ALL_BUILD 
 from the UI, it runs without error, but
 on running the INSTALL target no HDF libs are created: the lib directory 
 within the target directory is empty, and the bin directory contains only 
 msvcp100.dll and msvcr100.dll.
 I have also built the hdf5, hdf5_cpp, hdf5_hl, and hdf5_hl_cpp projects; 
 nothing changed.
 
 I would like to create dynamic libs under VS (with & without debug symbols), 
 like the Linux commands:
 configure --enable-cxx --prefix=
 make
 make install
 
 How can I do this? IMHO I need CMake to create the VS project files, and then 
 I need to know how to run the VS projects in the correct build sequence.
 
 Thanks
 
 Phil
 
 
 
 
 
 ___
 Mike JacksonPrincipal Software Engineer
 BlueQuartz SoftwareDayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Sep 18, 2012, at 5:53 PM, Kraus Philipp wrote:
 
 Hello,
 
 I try to build the HDF libs with Visual Studio, but I would like to create 
 only the release libs. I have set CMAKE_CONFIGURATION_TYPES to Release 
 and then call msbuild ALL_BUILD.vcxproj. When I call the msbuild command,
 I get the error that the Release directory exists, and so no libs are built. 
 I also tested the /property:Configuration=Release flag after the msbuild 
 command; the error is always the same.
 If I run the msbuild command without any modification of the build types, 
 everything is fine, but the libs always contain debug symbols. 
 
 How can I create only the release libraries, and how can I build the debug 
 library only? 
 
 Thanks
 
 Phil
 
 
 




Re: [Hdf-forum] hdf5 1.8.9 on mingw made easy.

2012-07-18 Thread Michael Jackson
If we added these features to the CMake scripts, would that help also? I am 
more of a CMake guy than an autotools guy, but I would think adding these 
flags to the CMake files for MinGW would allow one to configure HDF5 with 
CMake on MinGW (or MSYS).

Thoughts? When is the next release of HDF5 scheduled? Not sure I am going 
to have time to patch before mid-August.
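For what it's worth, the two autoconf hunks quoted below translate roughly into a few lines on the CMake side. This is only a sketch under that assumption; the real HDF5 CMakeLists.txt uses its own variable and target names:

```
# Sketch only -- not the actual HDF5 CMakeLists.txt.
if (MINGW)
  # Compile the Windows VFD sources, as the Makefile.am hunk does
  list (APPEND hdf5_srcs H5FDwindows.c)
  # Same feature macros the configure.in hunk defines
  add_definitions (-DH5_HAVE_WINDOWS -DH5_HAVE_MINGW -DH5_HAVE_WIN32_API)
  # Avert _WSAStartup@8 link errors, mirroring LIBS="$LIBS -lwsock32"
  set (LINK_LIBS ${LINK_LIBS} wsock32)
endif ()
```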
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Jul 18, 2012, at 6:05 PM, edscott wilson garcia wrote:

 Hi forum,
 
In order to get hdf5 1.8.9 to compile cleanly with no fuss on mingw, just 
 patch two files, configure.in and src/Makefile.am. Then regenerate scripts with 
 aclocal && autoheader && libtoolize && automake && autoconf.
 Finally run configure with the option --with-mingw, compile, and enjoy.
 
 The patches are very simple:
 
 
  Patch for configure.in
 ***
 --- /tmp/hdf5-1.8.9/configure.in.old2012-05-09 10:07:27.0 -0500
 +++ /tmp/hdf5-1.8.9/configure.in2012-07-18 16:51:35.0 -0500
 @@ -49,6 +49,16 @@
  dnl rebuild rules.
  AM_MAINTAINER_MODE
  
 +AC_ARG_WITH(mingw, [--with-mingw  enable compilation with gnu gcc under 
 mingw])
 +AM_CONDITIONAL([WITH_MINGW], [test "$with_mingw" = yes])
 +if test "$with_mingw" = yes; then
 +  AC_DEFINE([HAVE_WINDOWS],[1],[Define if the Windows virtual file driver 
 should be compiled])
 +  AC_DEFINE([HAVE_MINGW],[1],[Define if on mingw])
 +  AC_DEFINE([HAVE_WIN32_API],[1],[Define if on the Windows platform])
 +dnl We also need to add a -lwsock32 to avert _WSAStartup@8 errors
 +  LIBS="$LIBS -lwsock32"
 +fi
 +
  dnl Run post processing on files created by configure.
  dnl src/H5pubconf.h:
  dnl Generate src/H5pubconf.h from src/H5config.h by prepending H5_ to all
 
 
  Patch for src/Makefile.am
 ***
 --- /tmp/hdf5-1.8.9/src/Makefile.am.old2012-05-09 10:05:58.0 -0500
 +++ /tmp/hdf5-1.8.9/src/Makefile.am2012-07-18 16:54:27.0 -0500
 @@ -103,6 +103,9 @@
  H5Zdeflate.c H5Zfletcher32.c H5Znbit.c H5Zshuffle.c H5Zszip.c  \
  H5Zscaleoffset.c H5Ztrans.c
  
 +if WITH_MINGW
 + libhdf5_la_SOURCES+=H5FDwindows.c
 +endif
  
  # Public headers
  include_HEADERS = hdf5.h H5api_adpt.h H5overflow.h H5pubconf.h H5public.h 
 H5version.h \
 @@ -115,6 +118,10 @@
  H5MMpublic.h H5Opublic.h H5Ppublic.h H5Rpublic.h H5Spublic.h \
  H5Tpublic.h H5Zpublic.h
  
 +if WITH_MINGW
 + libhdf5_la_SOURCES+=H5FDwindows.h
 +endif
 +
  # install libhdf5.settings in lib directory
  settingsdir=$(libdir)
  settings_DATA=libhdf5.settings




[Hdf-forum] Symlinks and install_name on OS X Libraries

2012-07-10 Thread Michael Jackson
I had a question about the creation of the extra symlinks with version numbers 
and the embedded information in the dynamic libraries. Here is my issue:

First a listing of the lib directory (partial)
libhdf5.1.8.9.dylib*
libhdf5.7.3.0.dylib@ -> libhdf5.1.8.9.dylib
libhdf5.dylib@ -> libhdf5.7.3.0.dylib
libhdf5_debug.1.8.9.dylib*


I see that libhdf5.1.8.9.dylib is the real library, so let's take a look at 
its embedded install_name:

libhdf5.1.8.9.dylib:
/Users/Shared/Toolkits/hdf5-1.8.9/lib/libhdf5.7.3.0.dylib (compatibility 
version 7.3.0, current version 1.8.9)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 
125.2.0)

So the install_name has this library pointing to another file, 
libhdf5.7.3.0.dylib. But if we look at that file, it is really a symlink back 
to the original file, so it appears I have a circular reference. Was this 
intended? What brings this up is that I was building my app against the 1.8.9 
release and Xcode choked on the hdf5 library. It seems the check for HDF5 in 
my CMake file found libhdf5.1.8.9.dylib, whose embedded install_name is 
libhdf5.7.3.0.dylib, which in turn is a symlink back to libhdf5.1.8.9.dylib. 
So Xcode choked.

Anyone have any thoughts on this issue? Is this something fixable?
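The reported layout is easy to reproduce and inspect outside of HDF5; here is a sketch in a scratch directory (uses GNU readlink; the install_name_tool line is the usual macOS repair and is left as a comment since that tool only exists on OS X):

```shell
set -e
d=$(mktemp -d)
cd "$d"
touch libhdf5.1.8.9.dylib                      # the real library file
ln -s libhdf5.1.8.9.dylib libhdf5.7.3.0.dylib  # so-version alias
ln -s libhdf5.7.3.0.dylib libhdf5.dylib        # unversioned alias
# Both aliases resolve to the same real file -- not a true cycle on disk;
# the confusion comes from the embedded install_name naming an alias:
readlink -f libhdf5.dylib
# macOS-only repair (not executed here): point the id at the real file
#   install_name_tool -id "$PWD/libhdf5.1.8.9.dylib" libhdf5.1.8.9.dylib
```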
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




[Hdf-forum] Bug in Source Release for hdf5 1.8.9

2012-06-18 Thread Michael Jackson
I was trying to build and install HDF5 on an OpenSUSE box today when the 
installation failed. Turns out on line 887 of the top-level CMakeLists.txt the 
capitalization of a file name does NOT match what is on the file system: it 
should be USING_CMake.txt instead of Using_CMake.txt.
 
 Would it be possible to update the source archive on the web site? I am 
writing a script that downloads the .tar.gz and tries to build, but this error 
is stopping the script from executing properly.

Thanks
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net




Re: [Hdf-forum] CMake vs configure/make: why different results?

2012-04-17 Thread Michael Jackson
Doh. I got my email threads mixed up. Sorry for the noise.
___
Mike Jackson

On Apr 17, 2012, at 3:38 PM, Allen D Byrne wrote:

 Mike,
  I am under the assumption that this is a different issue and Stephane is not 
 using MinGW. 
 Allen
 
 The original email was using these parameters:
 
 I have to use CMake now for the same results and here are the options used, 
 with the same sources (hdf5-1.8.8):
 cmake -DCMAKE_INSTALL_PREFIX:STRING=${INSTALL} \
 -DCMAKE_C_COMPILER:FILEPATH=/usr/local/openmpi/1.5.3-intel/bin/mpicc \
 -DCMAKE_CXX_COMPILER:FILEPATH=/usr/local/openmpi/1.5.3-intel/bin/mpiCC \
 -DCMAKE_Fortran_COMPILER:FILEPATH=/usr/local/openmpi/1.5.3-intel/bin/mpif90 \
 -DBUILD_TESTING:BOOL=ON \
 -DMPI_LIBRARY:FILEPATH=/usr/local/openmpi/1.5.3-intel/lib/libmpi.so \
 -DMPI_INCLUDE_PATH:STRING=/usr/local/openmpi/1.5.3-intel/include/ \
 -DHDF5_BUILD_FORTRAN:BOOL=ON   \
 -DHDF5_BUILD_HL_LIB:BOOL=ON  \
 -DHDF5_BUILD_TOOLS:BOOL=ON  \
 -DHDF5_ENABLE_PARALLEL:BOOL=ON  \
 -DHDF5_ENABLE_Z_LIB_SUPPORT:BOOL=ON  \
 -DMPIEXEC_NUMPROC_FLAG:STRING=-np  \
 -DMPIEXEC_MAX_NUMPROCS:STRING=4096   \
 -DMPIEXEC_PREFLAGS:STRING="-N 1 -d 1" \
 -DMPIEXEC:FILEPATH=/usr/local/openmpi/1.5.3-intel/bin/mpirun ..
 
 
  After you compiled HDF5 with MinGW and CMake did you run the HDF5 tests? If 
  all the tests pass then this would indicate an error in your Makefiles. If 
  lots of the tests fail then this would indicate an error in the CMake side 
  of things.
  --
  Mike Jackson www.bluequartz.net
  
  On Apr 17, 2012, at 3:06 PM, Stéphane Backaert wrote:
  
   Allen,
   
   I just checked: H5_HAVE_PARALLEL is set to 1 in the file H5pubconf.h. And 
   I put libhdf5_f90cstub in my LDFLAGS...
   I must admit that I am a bit lost this time!
   Is there a way to test the libraries? I mean, the compilation process 
   does not complain and my CMake parameters, sent before, are right, so I 
   guess that my link process is wrong... 
   By the way, why libhdf5_f90cstub is present with CMake and not with 
   autotools?
   
   All of this is due to the CMake...grrr
   
   Best,
   
   Stephane
   
   On Apr 17, 2012, at 8:41 PM, Allen D Byrne wrote:
   
   Stéphane,
   The undefined references seem to indicate it is looking for parallel 
   references? Maybe the fortran library failed to build properly, or 
   didn't include a parallel library? Is H5_HAVE_PARALLEL defined in the 
   H5pubconf.h file? h5pset_fapl_mpio_c is in the H5FDmpiof.c file which 
   should be in the libhdf5_f90cstub library.
   
   Allen
   
Allen,

I will look into this HDF5-CMake issue a bit further: I think I can 
learn a lot from this investigation! 
But now I would like to make some progress. 
Basically, my only problem is the linking of my code with the hdf5-vfd 
libraries. I can link my code with h5 libraries compiled with 
configure/make, with h5pfc (easy) but also without it. Actually, I 
already took care of those libraries explicitly/manually in my 
makefile. So I can link without h5pfc.

With the CMake libraries (and thus without h5pfc), here is the kind of 
messages I get when I link my code with hdf5:

/home/ucl/tfl/sbackaer/local/stow/hdf5-vfd-1.8.8_cmake/lib/libhdf5_fortran.a(H5_ff.f90.o):
 In function `h5lib_mp_h5open_f_':
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5_ff.f90:(.text+0x2e):
 undefined reference to `h5init_types_c_'
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5_ff.f90:(.text+0xba):
 undefined reference to `h5init_flags_c_'
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5_ff.f90:(.text+0xc8):
 undefined reference to `h5init1_flags_c_'
...
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5FDmpioff.f90:(.text+0x17):
 undefined reference to `h5pset_fapl_mpio_c_'
/home/ucl/tfl/sbackaer/local/stow/hdf5-vfd-1.8.8_cmake/lib/libhdf5_fortran.a(H5FDmpioff.f90.o):
 In function `h5fdmpio_mp_h5pget_fapl_mpio_f_':
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5FDmpioff.f90:(.text+0x27):
 undefined reference to `h5pget_fapl_mpio_c_'
/home/ucl/tfl/sbackaer/local/stow/hdf5-vfd-1.8.8_cmake/lib/libhdf5_fortran.a(H5FDmpioff.f90.o):
 In function `h5fdmpio_mp_h5pset_dxpl_mpio_f_':
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5FDmpioff.f90:(.text+0x37):
 undefined reference to `h5pset_dxpl_mpio_c_'
/home/ucl/tfl/sbackaer/local/stow/hdf5-vfd-1.8.8_cmake/lib/libhdf5_fortran.a(H5FDmpioff.f90.o):
 In function `h5fdmpio_mp_h5pget_dxpl_mpio_f_':
/home/ucl/tfl/sbackaer/local/build/hdf5-vfd-1.8.8/fortran/src/H5FDmpioff.f90:(.text+0x47):
 undefined reference to `h5pget_dxpl_mpio_c_'
/home/ucl/tfl/sbackaer/local/stow/hdf5-vfd-1.8.8_cmake/lib/libhdf5_fortran.a(H5FDmpioff.f90.o):
 In function `h5fdmpio_mp_h5pset_fapl_mpiposix_f_':

Re: [Hdf-forum] SOS: gfortran and HDF5 on MAC

2012-02-14 Thread Michael Jackson
According to the error message, your system cannot create the file. Is there a 
file of the same name still open in another application, perhaps? Do you have 
write permissions to whatever directory you are trying to create the file 
in?
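Note that the deepest frame actually reports errno 17, 'File exists', which suggests a previous run left dsetf.h5 behind and the file is being created exclusively; deleting the stale file (or opening with H5F_ACC_TRUNC_F in the Fortran code, which overwrites an existing file) usually clears this. A trivial sketch of the quick fix, using the file name from the trace:

```shell
touch dsetf.h5   # simulate the stale output left by an earlier run
rm -f dsetf.h5   # remove it so H5Fcreate can create it afresh
test ! -e dsetf.h5 && echo removed
```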
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Feb 13, 2012, at 5:17 PM, Silke Trömel wrote:

 Dear all,
 I already used HDF5 with the Fortran ifort compiler on a Linux system. 
 However, I have now installed hdf5.7-gfortran successfully on Mac OS X 
 10.7.2 using Fink. I am also able to compile the example code 
 dsetexample.f90 via
 gfortran -o dsetexample dsetexample.f90 -lhdf5 -lhdf5_fortran -I/sw/include 
 -L/sw/lib
 (is something missing?)
 but I am not able to run it. I get the message below. I would be very very 
 happy for any help. I have no idea how to proceed.
 Thanks in advance,
 best wishes,
 Silke
 
 
 
 
 HDF5-DIAG: Error detected in HDF5 (1.8.8) thread 0:
 #000: H5F.c line 1440 in H5Fcreate(): unable to create file
   major: File accessability
   minor: Unable to open file
 #001: H5F.c line 1267 in H5F_open(): unable to open file
   major: File accessability
   minor: Unable to open file
 #002: H5FD.c line 1086 in H5FD_open(): open failed
   major: Virtual File Layer
   minor: Unable to initialize object
 #003: H5FDsec2.c line 362 in H5FD_sec2_open(): unable to open file: name = 
 'dsetf.h5', errno = 17, error message = 'File exists', flags = 15, o_flags = 
 a02
   major: File accessability
   minor: Unable to open file
 HDF5-DIAG: Error detected in HDF5 (1.8.8) thread 0:
 #000: H5D.c line 141 in H5Dcreate2(): not a location ID
   major: Invalid arguments to routine
   minor: Inappropriate type
 #001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
   major: Invalid arguments to routine
   minor: Bad value
 HDF5-DIAG: Error detected in HDF5 (1.8.8) thread 0:
 #000: H5D.c line 391 in H5Dclose(): not a dataset
   major: Invalid arguments to routine
   minor: Inappropriate type
 HDF5-DIAG: Error detected in HDF5 (1.8.8) thread 0:
 #000: H5F.c line 1970 in H5Fclose(): not a file ID
   major: Invalid arguments to routine
   minor: Inappropriate type
 
 




Re: [Hdf-forum] Access to source that is not autotool-ized?

2012-02-06 Thread Michael Jackson
If you use CMake to build HDF5 instead of AutoTools then the source directories 
are NOT touched at all, assuming you use a separate build directory.

wget http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.8.tar.bz2
tar xjvf hdf5-1.8.8.tar.bz2
mkdir hdf5-build
cd hdf5-build
cmake ../

Batter's choice: CMake or AutoTools.
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Feb 6, 2012, at 10:54 AM, Noah Watkins wrote:

 Rhys,
 
 This works beautifully, thanks.
 
 -Noah
 
 On Feb 6, 2012, at 7:04 AM, Rhys Ulerich wrote:
 
 Is it possible to get access to HDF5 source that has not had autoconf 
 (etc.) run, or alternatively, read-only access to version control?
 
 We would like to hack on some of the IO drivers, but the enormous amount of 
 intermediate files touched during the autotool process makes it difficult 
 to keep a clean view of our changes in version control.
 
 Use VPATH 
 (http://www.gnu.org/savannah-checkouts/gnu/automake/manual/html_node/VPATH-Builds.html).
 Like most automake-based software, you can build HDF5 into a
 directory which is not the source directory.
 
 wget http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.8.tar.bz2
 tar xjvf hdf5-1.8.8.tar.bz2
 mkdir hdf5-build
 cd hdf5-build
 ../hdf5-1.8.8/configure
 # All build artifacts are cleanly in hdf5-build
 # All source artifacts are cleanly in hdf5-1.8.8
 
 Hope that helps,
 Rhys
 


Re: [Hdf-forum] hdf5 and ImageMagick conflicts on Windows 32-bit

2011-12-02 Thread Michael Jackson
(VERY red-faced.) 
  It is on OS X that there is a difference:

32 Bit: 
sizeof(int): 4 
sizeof(long): 4 
sizeof(long long): 8 

64 Bit
sizeof(int): 4 
sizeof(long): 8 
sizeof(long long): 8 

Again, sorry for the confusion; keeping it all straight is tricky. 
Over the years I have simply stayed away from long wherever possible.

___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Dec 2, 2011, at 8:39 AM, Andrea Parenti wrote:

 
 Dear Mike,
 
 I am sorry to contradict what you say, but what I get is
   sizeof(int) == sizeof(long) == 4
 compiling on Windows 7 (64-bit) with Visual Studio 2008. The data type sizes 
 are the same whether I compile a 64-bit executable or a 32-bit executable.
 In order to have an 8-byte variable I must use long long, in fact I get
   sizeof(long long) == 8
 
 But maybe this is compiler dependent (I'm not an expert of Visual Studio, I 
 usually work on Linux).
 
   Andrea
 
 
 Dr. Andrea Parenti  DESY (FS-EC Group)
 
 mail: andrea.pare...@desy.deNotkestrasse 85
 web: http://www.desy.de/~parenti/   D-22607 Hamburg
 phone: +49 (0)40 8998 3154  Germany
 
 
 
 On Thu, 1 Dec 2011, Michael Jackson wrote:
 
 That is ONLY valid on 32-bit Windows. On 64-bit Windows long is 8 bytes in 
 length. The use of long should just be banned, IMHO.
 ___
 Mike JacksonPrincipal Software Engineer
 BlueQuartz SoftwareDayton, Ohio
 mike.jack...@bluequartz.net  www.bluequartz.net
 
 On Dec 1, 2011, at 9:53 AM, Andrea Parenti wrote:
 
 (i.e. just inverting the order from INT -> LONG -> LONG LONG to LONG LONG -> 
 LONG -> INT.) In this way ssize_t is of type long and there is no conflict 
 with ImageMagick. And I think there is no real change in HDF5 either, since 
 sizeof(int)==sizeof(long).
 


Re: [Hdf-forum] hdf5 and ImageMagick conflicts on Windows 32-bit

2011-12-01 Thread Michael Jackson
That is ONLY valid on 32-bit Windows. On 64-bit Windows long is 8 bytes in 
length. The use of long should just be banned, IMHO.
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Dec 1, 2011, at 9:53 AM, Andrea Parenti wrote:

 (i.e. just inverting the order from INT -> LONG -> LONG LONG to LONG LONG -> 
 LONG -> INT.)
 In this way ssize_t is of type long and there is no conflict with 
 ImageMagick. And I think there is no real change in HDF5 either, since 
 sizeof(int)==sizeof(long).




Re: [Hdf-forum] HDF5 and MinGW

2011-10-04 Thread Michael Jackson
I was just looking at that and here is the block of code in question:

/* Define the ssize_t type if it is not defined */
#if H5_SIZEOF_SSIZE_T==0
/* Undefine this size, we will re-define it in one of the sections below */
#undef H5_SIZEOF_SSIZE_T
#if H5_SIZEOF_SIZE_T==H5_SIZEOF_INT
typedef int ssize_t;
#   define H5_SIZEOF_SSIZE_T H5_SIZEOF_INT
#elif H5_SIZEOF_SIZE_T==H5_SIZEOF_LONG
typedef long ssize_t;
#   define H5_SIZEOF_SSIZE_T H5_SIZEOF_LONG
#elif H5_SIZEOF_SIZE_T==H5_SIZEOF_LONG_LONG
typedef long long ssize_t;
#   define H5_SIZEOF_SSIZE_T H5_SIZEOF_LONG_LONG
#else /* Can't find matching type for ssize_t */
#   error nothing appropriate for ssize_t
#endif
#endif

The question is why is H5_SIZEOF_SSIZE_T getting set to 0 on MinGW? Does the 
test fail for some reason on MinGW?

rogez: How did you get the HDF5 library that you are using? Did you compile it 
yourself? If so, how did you configure the HDF5 source: with CMake or with 
./configure?

Thanks
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net

On Oct 4, 2011, at 11:57 AM, Peter Cao wrote:

 Try the following to see if it works:
 
 Set #define H5_SIZEOF_SSIZE_T to 1 in H5pubconf.h
 
 --pc
 
 On 10/4/2011 10:43 AM, rogez wrote:
 Hi,
 
 I was using the HDF5 API in Visual Studio C++ and it worked fine.
 I am now trying to compile a new tutorial project under Eclipse/MinGW and I get
 the following issue:
 
 HDF5/1.8.7/include/H5public.h:133:13: error: conflicting declaration
 'typedef int ssize_t'
 lib/gcc/mingw32/4.5.2/../../../../include/sys/types.h:118:18: error:
 'ssize_t' has a previous declaration as 'typedef _ssize_t ssize_t'
 
 Do you know if there is a way to configure HDF5 or Eclipse/MinGW to work
 together?
 
 Thanks,
 
 Yves
 
 --
 View this message in context: 
 http://hdf-forum.184993.n3.nabble.com/HDF5-and-MinGW-tp3393676p3393676.html
 Sent from the hdf-forum mailing list archive at Nabble.com.
 


Re: [Hdf-forum] HDF5 and MinGW

2011-10-04 Thread Michael Jackson
Correct. For MinGW you will need to compile your own version. Those are for 
Visual Studio only, by my estimation.
___
Mike JacksonPrincipal Software Engineer
BlueQuartz SoftwareDayton, Ohio
mike.jack...@bluequartz.net  www.bluequartz.net



On Oct 4, 2011, at 1:58 PM, rogez wrote:

 Thank you for your quick answers. I'm out of the office at the moment but I
 will test these things tomorrow.
 
 Just to reply to Mike: I didn't compile HDF5 from source. I'm on Windows
 XP 32-bit and I retrieved the HDF5-1.8.7_CMake_x86_shared.zip file for
 win32. Maybe it is not the right one for use with MinGW?
 
 Yves
 
 --
 View this message in context: 
 http://hdf-forum.184993.n3.nabble.com/HDF5-and-MinGW-tp3393676p3394068.html
 Sent from the hdf-forum mailing list archive at Nabble.com.
 


Re: [Hdf-forum] My code crashes upon upgrade to OSX 10.7 Lion

2011-09-28 Thread Michael Jackson
If you are running on OS X 10.6, where is conf[j] created/initialized? Maybe 
the compiler is getting more picky in OS X 10.7?
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio

On Sep 28, 2011, at 1:28 PM, Amos Anderson wrote:

 Hi Bojan --
 
 The crash is not happening in DataSet::p_read_fixed_len. I tried a
 bunch of things with Binh-Minh Ribler, but we were unable to resolve
 it, but here's a summary. The crash is in DataSet::p_read_variable_len
 on the line with:
 
 strg = strg_C;
 
 The error is a double free on strg:
 
 Python(10981) malloc: *** error for object 0x104010820: pointer being
 freed was not allocated
 *** set a breakpoint in malloc_error_break to debug
 [amosa:10981] *** Process received signal ***
 [amosa:10981] Signal: Abort trap: 6 (6)
 [amosa:10981] Signal code:  (0)
 [amosa:10981] [ 0] 2   libsystem_c.dylib
 0x7fff8ce3bcfa _sigtramp + 26
 [amosa:10981] [ 1] 3   libhdf5.7.dylib
 0x0001042ec3f1 H5S_select_hyperslab + 1393
 [amosa:10981] [ 2] 4   libsystem_c.dylib
 0x7fff8ce3984c free + 389
 [amosa:10981] [ 3] 5   libstdc++.6.dylib
 0x7fff8c8f3702 _ZNSs4_Rep10_M_disposeERKSaIcE + 60
 [amosa:10981] [ 4] 6   libstdc++.6.dylib
 0x7fff8c8f4aab _ZNSs9_M_mutateEmmm + 281
 [amosa:10981] [ 5] 7   libstdc++.6.dylib
 0x7fff8c8f4b2d _ZNSs15_M_replace_safeEmmPKcm + 37
 [amosa:10981] [ 6] 8   libhdf5_cpp.7.dylib
 0x00010414af90 _ZNK2H57DataSet19p_read_variable_lenERSs + 1316
 [amosa:10981] [ 7] 9   libhdf5_cpp.7.dylib
 0x00010414d5ea
 _ZNK2H57DataSet4readERSsRKNS_8DataTypeERKNS_9DataSpaceES7_RKNS_19DSetMemXferPropListE
 + 1286
 
 
 The problem does not occur in a standalone unit test I made (and there
 is no problem on linux or osx 10.6). This implies the problem could be
 somewhere else in the rest of my code, but unfortunately valgrind
 doesn't work on osx 10.7 yet (and no problems have been detected on
 other machines). However, I can make the problem go away if I
 initialize the std::string in my code before calling DataSet::read:
 
   H5::StrType st(H5::PredType::C_S1, H5T_VARIABLE);
   H5::DataSet dataset;
   Vector_string conf(dims[0]);
 #if __ENVIRONMENT_MAC_OS_X_VERSION_MIN_REQUIRED__ >= 1070
   conf[j] = "hack to avoid crash";
 #endif
   dataset.read(conf[j], st, mem_space, file_space);
 
 
 Doing a google search on the error message itself along with a few
 keywords revealed this:
 http://stackoverflow.com/questions/1962685/xcode-stl-c-debug-compile-error
 and other similar posts, problems which I am able to replicate, and
 sets a precedent for the problem being with the compiler itself. OSX
 10.7 in general has seemed somewhat buggy too...
 
 
 Amos.
 
 
 
 On Wed, Sep 28, 2011 at 3:48 AM, Bojan Nikolic bo...@bnikolic.co.uk wrote:
 
 I haven't had the time to test this hypothesis but looking at the HDF5
 C++ wrapping code it looks like it may not be terminating the string
 before converting to C++ std::string if the H5Dread doesn't do that by
 itself for some reason. Maybe worth trying this patch?
 
 
 *** hdf5-1.8.7/c++/src/H5DataSet-fix.cpp2011-09-28 
 11:44:09.0 +0100
 --- hdf5-1.8.7/c++/src/H5DataSet.cpp2011-04-20 
 22:23:08.0 +0100
 ***
 *** 721,729 
   throw DataSetIException("DataSet::read",
  "H5Dread failed for fixed length string");
   }
 
 - // Terminate the string
 - strg_C[attr_size]=0;
 -
// Get string from the C char* and release resource allocated locally
strg = strg_C;
delete []strg_C;
 --- 721,726 
 
 
 --
 Bojan Nikolic  ||  http://www.bnikolic.co.uk
 
 
 
 
 
 
 -- 
 ~
 Amos G. Anderson
 +1-626-399-8958 (cell)
 


Re: [Hdf-forum] libhdf5 dll build problems with MinGW

2011-04-01 Thread Michael Jackson
What version of MinGW are you using? Older versions of MinGW are missing 
standard headers related to time/date functions.
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio

On Apr 1, 2011, at 9:26 AM, Rodolfo Bonnin wrote:

 Hello all,
 
 I'm trying to build libhdf5 dll using msys and MinGW.
 
 Using the configure/make method doesn't generate the dll, and after
 searching for answers, it seems probable that there were no provisions
 to have the dll built on that platform, although adding this task
 seems possible. How can I modify the configure script in order to add
 the dll building step?
 
 On the other hand, building with CMake, I've got the error
 
 H5Omtime.c:226: error: 'timezone' undeclared (first use in this function)
 
 which, from what I have seen, can be related to the use of an ancient struct 
 that is no longer available. Is there any flag I should use to avoid this 
 error?
 
 Many thanks in advance.
 
 --
 Ing. Rodolfo Bonnin
 SUR Emprendimientos Tecnológicos
 
 Perú 345  Piso 5to Oficina B (C1067AAG)
 Ciudad de Buenos Aires, Argentina
 Tel. +54 (11) 4342-2976/84
 rodolfobon...@suremptec.com.ar
 www.suremptec.com
 


Re: [Hdf-forum] libhdf5 dll build problems with MinGW

2011-04-01 Thread Michael Jackson
So I did some digging into HDF5 1.8 CMake and MinGW and here is what I
came up with. There are some basic errors in the CMake file at
config/cmake/ConfigureChecks.cmake. Namely there is a section that
says:

SET (WINDOWS)
IF (WIN32)
  IF (NOT UNIX AND NOT CYGWIN)
SET (WINDOWS 1)
  ENDIF (NOT UNIX AND NOT CYGWIN)
ENDIF (WIN32)

The problem is that on MinGW the WINDOWS variable will be set to 1,
which has bad ramifications for what the rest of the file assumes is
available on the system. So we change that to this:

SET (WINDOWS)
IF (WIN32)
  IF (NOT UNIX AND NOT CYGWIN AND NOT MINGW)
SET (WINDOWS 1)
  ENDIF (NOT UNIX AND NOT CYGWIN AND NOT MINGW)
ENDIF (WIN32)

Then later on in that file there is an IF (NOT WINDOWS) block
starting at line 551. That needs to be changed to the following:

IF (NOT WINDOWS)
  FOREACH (test
  TIME_WITH_SYS_TIME
  STDC_HEADERS
  HAVE_TM_ZONE
  HAVE_STRUCT_TM_TM_ZONE
  HAVE_ATTRIBUTE
  HAVE_FUNCTION
  HAVE_TM_GMTOFF
  HAVE_STRUCT_TIMEZONE
  HAVE_STAT_ST_BLOCKS
  HAVE_FUNCTION
  SYSTEM_SCOPE_THREADS
  HAVE_SOCKLEN_T
  DEV_T_IS_SCALAR
  HAVE_OFF64_T
  GETTIMEOFDAY_GIVES_TZ
  VSNPRINTF_WORKS
  HAVE_C99_FUNC
  HAVE_C99_DESIGNATED_INITIALIZER
  CXX_HAVE_OFFSETOF
  LONE_COLON
)
HDF5_FUNCTION_TEST (${test})
  ENDFOREACH (test)
  IF (NOT CYGWIN AND NOT MINGW)
  HDF5_FUNCTION_TEST (HAVE_TIMEZONE)
  ENDIF (NOT CYGWIN AND NOT MINGW)
ENDIF (NOT WINDOWS)

 Note that for MinGW we are NOT checking for 'timezone' as a previous
test for HAVE_STRUCT_TIMEZONE passes.

Note that the FOREACH block at line 509 has an error in it; it should be this:
FOREACH (def
HAVE_SYS_TIME_H
HAVE_UNISTD_H
HAVE_SYS_TYPES_H
HAVE_SYS_SOCKET_H
)
  IF (${H5_${def}})
SET (MACRO_CHECK_FUNCTION_DEFINITIONS
${MACRO_CHECK_FUNCTION_DEFINITIONS} -D${def})
  ENDIF (${H5_${def}})
ENDFOREACH (def)
so it matches HDF5Tests.c which also has some tweaks that need to be fixed.

SO after all of that we get down to what is going on with MinGW.
MinGW does in fact have a sys/time.h file which has this declaration:
/* Provided for compatibility with code that assumes that
   the presence of gettimeofday function implies a definition
   of struct timezone. */
struct timezone
{
  int tz_minuteswest; /* of Greenwich */
  int tz_dsttime; /* type of dst correction to apply */
};

Note that this is a Declaration and NOT a definition so no memory is
actually reserved for the timezone struct. But with the above
changes to the CMake files and if you reconfigure with CMake on a
CLEAN build directory you should be able to now get MinGW to compile
HDF5 1.8.

Now here is the part I don't understand. On MinGW the following will compile:

#include <sys/time.h>
int main()
{
  timezone = 0;
  return 0;
}
If we actually use that as a test, we get a false positive for
H5_HAVE_TIMEZONE, which makes it get defined in the H5pubconf.h that CMake
generates. So in the file H5Omtime.c we end up taking the wrong code
path at line 226 and somehow get a compile error; although the two uses
look the same, something is different. Either way it is probably
just wrong under MinGW. With the CMake corrections all the proper
#defines get set and we take another code path in H5Omtime.c, which
allows us to compile both static and dynamic libraries.
  I have just started trying to get the tests to compile.

Since I don't know squat about svn or remember my credentials to push
up to the source code repository I'll have to figure out how to get
all the changes placed somewhere.

_
Mike Jackson                  mike.jack...@bluequartz.net
BlueQuartz Software                    www.bluequartz.net
Principal Software Engineer                  Dayton, Ohio



On Fri, Apr 1, 2011 at 11:23 AM, Michael Jackson
mike.jack...@bluequartz.net wrote:
 For the configure side of things, I'll ask the stupid question: you did 
 tell configure you wanted shared libraries?
 
 For the CMake side of things, was time.h included in the file that can't find 
 the definition? And if it _is_ included, and timezone is defined in that file, 
 what is happening during the compile process that makes timezone NOT 
 defined? In other words, is there some preprocessor define that is NOT 
 getting set properly, allowing the code path in time.h to define timezone? You 
 may have to start manually adding #error to some of the system header files 
 to deduce what is happening. Meanwhile I'll try to pull the Qt SDK that has 
 MinGW in it to see if I can eventually set something up. This would be a nice 
 CDash submission.
 --
 Mike Jackson www.bluequartz.net

 On Apr 1, 2011, at 11:00 AM, Rodolfo Bonnin wrote:

 What is strange is that the library compiles using configure, but
 it doesn't generate the dll.

 2011/4/1 Michael Jackson mike.jack...@bluequartz.net:
 I don't have a setup where I can

Re: [Hdf-forum] libhdf5 dll build problems with MinGW

2011-04-01 Thread Michael Jackson
And further for MinGW inside of test/CMakeLists.txt at about line 27
you will need to insert the following block to get the tests to
successfully compile against DLLs:
IF (MINGW)
target_link_libraries(${HDF5_TEST_LIB_TARGET} wsock32)
ENDIF(MINGW)

I have been able to get the tests to compile. Started the run and some
are passing and some are failing. At this point it may be debuggable
by the HDF engineers.

 I used the Qt SDK from Trolltech (Nokia), set up a custom batch file
to initialize all the paths, invoked cmake-gui.exe from a command
prompt, and configured for MinGW Makefiles.
_
Mike Jackson                  mike.jack...@bluequartz.net
BlueQuartz Software                    www.bluequartz.net
Principal Software Engineer                  Dayton, Ohio



On Fri, Apr 1, 2011 at 3:29 PM, Michael Jackson
mike.jack...@bluequartz.net wrote:
 [full text of the earlier message in this thread quoted here; trimmed]

Re: [Hdf-forum] libhdf5 dll build problems with MinGW

2011-04-01 Thread Michael Jackson
I have the changes so I'll talk to Allen B. about how best to get them 
submitted.
___
Mike Jackson  www.bluequartz.net


On Apr 1, 2011, at 5:51 PM, Rodolfo Bonnin wrote:

 I was able to build the dll, many thanks Michael! I hope you can
 integrate your changes with the current code.
 
 Regards,
 Ing. Rodolfo Bonnin.
 
 2011/4/1 Rodolfo Bonnin rodolfobon...@suremptec.com.ar:
 Great Michael!
 
 I'll research about these changes, I agree that now the hdf developers
 are aware of the problem and a solution for them.
 
 Many thanks and regards.
 
 2011/4/1 Michael Jackson mike.jack...@bluequartz.net:
 [message quoted above in this thread; trimmed]
 
 
 
 On Fri, Apr 1, 2011 at 3:29 PM, Michael Jackson
 mike.jack...@bluequartz.net wrote:
 [full text of the earlier message in this thread quoted here; trimmed]

[Hdf-forum] How to store array of mixed integer and float

2011-03-24 Thread Michael Jackson
I have the following data structure that I would like to store into a single 
HDF5 data set:

struct {
  int   v1;
  int   v2;
  int   v3;
  int   v4;
  float v5;
  int   v6;
}

What would be the best way to do this? I know there is a way to store a 
custom data type, but I am coming up empty on how to do it. Also, what are 
the downsides to using a custom data type with respect to portability 
across architectures? 

Thanks for any help
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio




Re: [Hdf-forum] How to store array of mixed integer and float

2011-03-24 Thread Michael Jackson
Will that work with the HDF5 1.6 library? 
___
Mike Jackson  www.bluequartz.net


On Mar 24, 2011, at 10:34 AM, Elena Pourmal wrote:

 Mike,
 
 You can use an HDF5 compound datatype. It is portable across architectures. 
 Please see the C example h5ex_t_cpxcmpd.c at 
 http://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/api18-c.html (under 
 the Datatypes section) for how to use it. The User's Guide also has a section 
 on compound datatypes (under the Datatypes link).
 
 Elena
 On Mar 24, 2011, at 9:25 AM, Michael Jackson wrote:
 
 I have the following data structure that I would like to store into a single 
 HDF5 data set:
 
 struct {
   int   v1;
   int   v2;
   int   v3;
   int   v4;
   float v5;
   int   v6;
 }
 
 What would be the best way to do this? I know there is a way to store a 
 custom data type but I am coming up empty finding out how to do this. Also 
 what are the downsides to using a custom data type with respect to 
 portability across architectures? 
 
 Thanks for any help
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net 
 BlueQuartz Software   Dayton, Ohio
 
 


[Hdf-forum] HDFView 2.7 on OS X 10.6.7 gives error on file load.

2011-03-24 Thread Michael Jackson
I get an "Unsupported file format" error when I try to load any HDF5 file 
using HDFView version 2.7 (64-bit) for OS X. I have tried my own files 
along with files from the HDF5 code base. None seem to work. What am I doing 
wrong? Is there something I need to do first? Library paths? It seems like it 
should just work.

Thanks for any help
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio




Re: [Hdf-forum] HDFView 2.7 on OS X 10.6.7 gives error on file load.

2011-03-24 Thread Michael Jackson
Nope, that didn't work either. If I go to the Help menu, neither the HDF4 nor 
the HDF5 libraries are enabled. Is it even loading the HDF5 libraries, which I 
assume are included with this product? 
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio

On Mar 24, 2011, at 12:43 PM, Peter Cao wrote:

 You have the correct JVM.
 
 Very strange. The only problem (two instances) reported for 64-bit Mac
 is the mismatched JVM.
 
 Could you try the batch file ($HDFVIEW_HOME/hdfview.command)?
 
 Thanks
 --pc
 
 On 3/24/2011 11:30 AM, Michael Jackson wrote:
 java version 1.6.0_24
 Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
 Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)
 
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio
 
 On Mar 24, 2011, at 12:09 PM, Peter Cao wrote:
 
 try /usr/bin/java -version to see what version of JVM you have.
 If you are using 64-bit hdfview, you have to use 64-bit JVM.
 
 Thanks
 --pc
 
 
 On 3/24/2011 10:53 AM, Michael Jackson wrote:
 I have created an Unsupported File format when I try to load any hdf5 
 file using the HDFView version 2.7 (64 bit) for OS X. I have tried my own 
 files along with files from the HDF5 code base. None seem to work. What am 
 I doing wrong? Is there something I need to do first? Library paths? It 
 seems like it should just work.
 
 Thanks for any help
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio
 
 
 
 
 
 




Re: [Hdf-forum] HDFView 2.7 on OS X 10.6.7 gives error on file load.

2011-03-24 Thread Michael Jackson
Both of the icons are grayed out which means there is an issue loading the 
libraries. And I am not seeing anything in the logs either. 
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio

On Mar 24, 2011, at 2:23 PM, Peter Cao wrote:

 If you click on the 5 or 4 icon, will you be able to see the library 
 version? thanks. --pc
 
 On 3/24/2011 1:03 PM, Michael Jackson wrote:
 Nope that didn't work either. If I go to the Help menu neither the HDF4 of 
 HDF5 libraries are enabled. Is it even loading the HDF5 libraries? Which are 
 included with this product I am assuming.
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio
 
 On Mar 24, 2011, at 12:43 PM, Peter Cao wrote:
 
 You have the correct JVM.
 
 Very strange. The only problem (two instances) reported for 64-bit Mac
 is the mismatched JVM.
 
 Could you try the batch file ($HDFVIEW_HOME/hdfview.command)?
 
 Thanks
 --pc
 
 On 3/24/2011 11:30 AM, Michael Jackson wrote:
 java version 1.6.0_24
 Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
 Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)
 
 ___
 Mike Jackson  www.bluequartz.net
 Principal Software Engineer   mike.jack...@bluequartz.net
 BlueQuartz Software   Dayton, Ohio
 
 On Mar 24, 2011, at 12:09 PM, Peter Cao wrote:
 
 




Re: [Hdf-forum] HDFView 2.7 on OS X 10.6.7 gives error on file load.

2011-03-24 Thread Michael Jackson
Turns out, on 2 different machines no less, I had older HDFView
libraries still in /Library/Java/Extensions, which was messing
everything up. Cleaned those out and HDFView seems to work now.

Sorry for the noise today. Maybe that should go in a Troubleshooting
FAQ somewhere.

Thanks
_
Mike Jackson                  mike.jack...@bluequartz.net
BlueQuartz Software                    www.bluequartz.net
Principal Software Engineer                  Dayton, Ohio



On Thu, Mar 24, 2011 at 3:17 PM, Peter Cao x...@hdfgroup.org wrote:
 Mike,

 The gray icons mean that the libraries do not work on your machine. I just
 upgraded
 my machine to 10.7.0. HDFView works fine. Roy tested HDFView on 10.6.7. It
 seems that
 the problem is not in the version of the OS. Could you download and
 reinstall HDFView?

 Thanks
 --pc


 On 3/24/2011 1:45 PM, Michael Jackson wrote:

 Both of the icons are grayed out which means there is an issue loading the
 libraries. And I am not seeing anything in the logs either.
 ___
 Mike Jackson                      www.bluequartz.net
 Principal Software Engineer       mike.jack...@bluequartz.net
 BlueQuartz Software               Dayton, Ohio

 On Mar 24, 2011, at 2:23 PM, Peter Cao wrote:






Re: [Hdf-forum] Trying to build HDF5 1.8.6 with CMake, getting undefined externals?

2011-03-15 Thread Michael Jackson
On the note of the FindHDF5.cmake file that comes with CMake: The HDF Group 
may want to seriously consider taking ownership of that module so that you can 
synchronize FindHDF5.cmake with the HDF5/CMake integration and make sure 
they work correctly together. I also have my own version of FindHDF5.cmake that 
I use with my CMake build of HDF5 v1.6.9. I think there are more 
than a few versions of FindHDF5.cmake floating around.

___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio

On Mar 15, 2011, at 4:24 PM, Allen D Byrne wrote:

 Yes, I found that out as well; I have overridden that module in the 1.8.7 
 config/cmake folder. You might try getting the source from the 1.8 branch:
 http://svn.hdfgroup.uiuc.edu/hdf5/branches/hdf5_1_8
 Allen
  I'd be glad to use a snapshot, if it fixed the problems I'm seeing.
  
  The bigger problem is that the FindHDF5.cmake that's distributed with
  CMake is brain-damaged, or its interaction with the hdf5-config.cmake file
  is dysfunctional.
  
  FindHDF5.cmake is looking for h5cc or h5pcc which aren't actually getting
  built or installed.
  Then HDF5_INCLUDE_DIRS and HDF5_LIBRARIES are supposed to be defined in
  hdf5-config.cmake and they aren't.
  
  
  --
  Kent Williams norman-k-willi...@uiowa.edu
  
  
  
  
  
  
  From:  Allen D Byrne b...@hdfgroup.org
  Organization:  HDF Group
  Date:  Tue, 15 Mar 2011 14:57:06 -0500
  To:  Mushly McMushmaster norman-k-willi...@uiowa.edu
  Cc:  hdf-forum@hdfgroup.org hdf-forum@hdfgroup.org
  Subject:  Re: [Hdf-forum] Trying to build HDF5 1.8.6 with CMake, getting
  undefined externals?
  
  
  Interesting. I wouldn't think you need to pass in the COMPILER args since
  it should use the same settings as the parent project. Post-1.8.6 code has
  been tested as an external project, but with compression libs explicitly
  enabled/disabled. Have you inspected the generated files to determine if
  they make sense? How about the log files for any clues?
  Can you use a 1.8.7 snapshot? There were significant CMake code changes
  after 1.8.6 concerning use as an External Project.
  Allen
   I satisfy both of those expectations. If you're curious, what I'm doing is:
  
   # External_HDF5.cmake
   #
   # gets C/CXX stuff from parent project
   # BRAINS3_INSTALL_PREFIX = local install dir for prerequisites for
  BRAINS3
   # BUILD_SHARED_LIBS = whether or not to build shared libs. Normally On.
   include(ExternalProject)
  
   ExternalProject_add(HDF5
 SOURCE_DIR HDF5
 BINARY_DIR HDF5-build
 URL "http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.6.tar.gz"
 URL_MD5 bd2d369dfcf7aa0437dde6aeb8460a31
 UPDATE_COMMAND ""
 CMAKE_ARGS
 -DCMAKE_CXX_COMPILER:STRING=${CMAKE_CXX_COMPILER}
 -DCMAKE_CXX_COMPILER_ARG1:STRING=${CMAKE_CXX_COMPILER_ARG1}
 -DCMAKE_C_COMPILER:STRING=${CMAKE_C_COMPILER}
 -DCMAKE_C_COMPILER_ARG1:STRING=${CMAKE_C_COMPILER_ARG1}
 -DCMAKE_CXX_FLAGS:STRING=${CMAKE_CXX_FLAGS}
 -DCMAKE_C_FLAGS:STRING=${CMAKE_C_FLAGS}
 -DBUILD_SHARED_LIBS:BOOL=${BUILD_SHARED_LIBS}
 -DCMAKE_INSTALL_PREFIX:PATH=${BRAINS3_INSTALL_PREFIX}
 INSTALL_DIR ${BRAINS3_INSTALL_PREFIX}
   )
  
  
  
  
  
   From:  Allen D Byrne b...@hdfgroup.org
   Organization:  HDF Group
   Date:  Tue, 15 Mar 2011 14:24:06 -0500
   To:  hdf-forum@hdfgroup.org
   Cc:  Mushly McMushmaster norman-k-willi...@uiowa.edu
   Subject:  Re: [Hdf-forum] Trying to build HDF5 1.8.6 with CMake, getting
   undefined externals?
  
  
   When building with cmake there are two expectations: one is that
   everything will be built out of source (usually create a sub-folder called
   build), and the other is that the source folders are clean (you did not run
   configure and/or build in source).
   That has to do with the cmake part; I will need to find some help for why
   you get the following errors if the above conditions are met.
   Allen
I sure don't understand this problem.  If I use CMake (on OS X 10.6
  with
CMake 2.8.4)  I get all sorts of mysterious undefined externals.
   
When I look through the source, these appear to be used as function pointers
when they've never been defined. E.g.
   
./src/H5Osdspace.c:#define H5O_SHARED_ENCODE  H5O_sdspace_shared_encode
./src/H5Osdspace.c:H5O_sdspace_shared_encode, /* encode message */
   
   
What gives?
   
Linking C shared library ../bin/libhdf5.dylib
Undefined symbols:
  _H5O_sdspace_shared_encode, referenced from:
  _H5O_MSG_SDSPACE in H5Osdspace.c.o
  _H5O_attr_shared_size, referenced from:
  _H5O_MSG_ATTR in H5Oattr.c.o
  _H5O_fill_shared_encode, referenced from:
  _H5O_MSG_FILL in H5Ofill.c.o
  _H5O_pline_shared_encode, referenced from:
  _H5O_MSG_PLINE in H5Opline.c.o
  _H5O_dtype_shared_copy_file, referenced from:

[Hdf-forum] Registering a new file format in the HDFView Java Application

2011-02-22 Thread Michael Jackson
I have hdf5 files with extensions like .h5ang, .h5ti, .h5p and so on. How do I 
register those extensions to be opened by the HDFView java application? 

Thanks
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio




[Hdf-forum] Visual Studio 2010 integration with C++ issues?

2010-12-29 Thread Michael Jackson
I have some codes that I am trying to compile with Visual Studio 2010. My codes 
are in C++ but I use the C API of HDF5 (in this case version 1.6.9). when I 
try to compile my program I am getting lots of errors about undefined, 
redefined, missing types. I have tracked it down to the following lines in the 
H5public.h file:

#ifndef __cplusplus
#ifdef H5_HAVE_STDINT_H
#   include <stdint.h>  /* for C9x types */
#endif
#endif

 I think the issue is that VS2010 finally has a stdint.h file, whereas the 
previous versions do not. This is somehow messing things up. Has anyone else 
come across these issues?

___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net 
BlueQuartz Software   Dayton, Ohio




Re: [Hdf-forum] VC++ project Build Errors with HDF5 1.8.5

2010-11-12 Thread Michael Jackson
  Let me try to explain some more because I was the one who added the  
H5_BUILT_AS_DYNAMIC_LIB variable. The reason this variable was needed  
was because projects that _use_ HDF5 and want to deploy their  
application to another machine need some sort of mechanism to  
correctly determine if the HDF5 project was built as a static or  
dynamic library. Using file extensions and other filenaming schemes  
are NOT good enough as someone could change the file names. This is a  
problem on Windows as there is NO set naming convention for libraries  
and no direction from Microsoft so really anything goes. When HDF5 is  
built using CMake this variable is written into another header file  
(H5pubconf.h) which is then pulled in by another header (H5public.h).


So start with the H5public.h file. This FIRST includes the H5pubconf.h  
file which is generated by CMake or autotools. Within H5pubconf.h the  
H5_BUILT_AS_DYNAMIC_LIB is either defined to 1 or NOT defined at all  
based on how HDF5 is being built. Further down in H5public.h is the  
#include "H5api_adpt.h" which brings in all the Windows DLL defines  
that are needed in order to correctly export the functions in the HDF5  
library when compiling on Windows (and to some extent GCC). If you  
follow this flow then you will see that there is NO need to define  
H5_BUILT_AS_DYNAMIC_LIB yourself in ANY project. And in fact doing so  
may create problems for your project.


  Actually, the use of _HDF5USEDLL_ could be completely removed  
from the Visual Studio projects with some fixes to the current  
logic that is contained in the preprocessor blocks. I think at one  
point I had a patch that got rid of it but it never got picked up.


  http://www.cmake.org/Wiki/BuildingWinDLL may provide some more  
background information, but it is CMake specific.


Hope that helps explain things.
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Nov 12, 2010, at 10:21 AM, Allen D Byrne wrote:

As Elena mentioned, that was an error in the comments. It should  
have been something like the following:
The Visual Studio project files will not be supported in the next  
major release, 1.10.

As far as :
 So what, if any, are the functional consequences to VS2005  
projects client
 code defining H5_BUILT_AS_DYNAMIC_LIB and building against the h,  
lib binary
 deliverable - bearing in mind that my application is built,  
tested, and

 seems to be working fine?
Building applications using the dynamic libs/dlls means you will be  
importing the libraries and defining just H5_BUILT_AS_DYNAMIC_LIB  
will bypass all the exports blocks arriving at lines 87,88:

#define H5_DLL __declspec(dllimport)
#define H5_DLLVAR __declspec(dllimport)
Similarly, all the other defines will be imports.
Not defining H5_BUILT_AS_DYNAMIC_LIB will skip to the else block at  
line 290:

#if defined(_WIN32)
which then will check the existence/value of the original defines to  
set each of the defines mentioned previously.
--- So that if _HDF5DLL_ is not defined and _HDF5USEDLL_ is defined  
- lines 297,298:

#define H5_DLL __declspec(dllimport)
#define H5_DLLVAR __declspec(dllimport)
and similarly the other defines will check for 'USE' defines to set  
the other defines.
NOTE: if neither _HDF5DLL_ or _HDF5USEDLL_ is defined, then the  
defines will be set at lines 300,301:

#define H5_DLL
#define H5_DLLVAR extern
Which may cause problems! Therefore, because you have defined  
H5_BUILT_AS_DYNAMIC_LIB, you have set all defines to  
__declspec(dllimport) and can stop worrying about all the 'USE'  
defines.

Allen

 Hi All.
 My reason for setting H5_BUILT_AS_DYNAMIC_LIB is based on the fact  
that

 1) h5api_adpt.h uses it as a #ifdef switch to define H5_DLLVAR,
 2) AND, the #else branch of this (which references the  
_HDF5USEDLL_ ifdef)

 is very clearly marked thus in the comments:

 /* This is the original HDFGroup defined preprocessor code which  
should

 still work
 * with the VS projects that are maintained by The HDF Group
 * This will be removed after the next release.
 */


 The statement "this will be removed after the next release" led me to  
assume
 that this section of the header is deprecated and should not be  
used in new

 code, and hence that the best define to use would be
 H5_BUILT_AS_DYNAMIC_LIB.

 Is this interpretation incorrect - will this section of the header  
continue

 to exist in future?

 I don't use CMake, and I'm not building HDF5, just using it from  
my client

 code.
 So what, if any, are the functional consequences to VS2005  
projects client
 code defining H5_BUILT_AS_DYNAMIC_LIB and building against the h,  
lib binary
 deliverable - bearing in mind that my application is built,  
tested, and

 seems to be working fine?
 Steve


[Hdf-forum] HDF5 1.8.5 Upgrade Errors

2010-08-24 Thread Michael Jackson
I have some HDF5 version 1.6.9 code that I am in the process of  
updating. I have gotten everything to finally compile and I am now  
investigating some runtime issues.


With the following line of code:
hid_t  file_id = H5Fcreate( MXAUnitTest::H5LiteTest::FileName.c_str(),  
H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT );
  where MXAUnitTest::H5LiteTest::FileName is a const std::string equal to  
"/tmp/out.h5".


I get the following error stack:
HDF5-DIAG: Error detected in HDF5 (1.8.5-snap5) thread 0:
  #000: /Users/mjackson/Workspace/hdf5-v18/src/H5F.c line 1371 in  
H5Fcreate(): library initialization failed

major: Function entry/exit
minor: Unable to initialize object
  #001: /Users/mjackson/Workspace/hdf5-v18/src/H5.c line 174 in  
H5_init_library(): unable to initialize datatype interface

major: Function entry/exit
minor: Unable to initialize object
  #002: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 524 in  
H5T_init(): interface initialization failed

major: Function entry/exit
minor: Unable to initialize object
  #003: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 1312 in  
H5T_init_interface(): unable to register conversion function(s)

major: Datatype
minor: Unable to initialize object
  #004: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #005: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #006: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9480 in  
H5T_conv_ldouble_ulong(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #007: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #008: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #009: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9413 in  
H5T_conv_double_ulong(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #010: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #011: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #012: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9349 in  
H5T_conv_float_ulong(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #013: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #014: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #015: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9446 in  
H5T_conv_ldouble_long(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #016: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #017: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #018: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9381 in  
H5T_conv_double_long(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #019: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #020: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #021: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9317 in  
H5T_conv_float_long(): disagreement about datatype size

major: Datatype
minor: Unable to initialize object
  #022: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

major: Datatype
minor: Unable to initialize object
  #023: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

major: Datatype
minor: Unable to initialize object
  #024: 

[Hdf-forum] Re: HDF5 1.8.5 Upgrade Errors

2010-08-24 Thread Michael Jackson
So this seems to have something to do with building Universal Binaries  
under OS X. If I build HDF5 with a single arch such as i386 or x86_64  
then I can get my test to pass. I thought this was solved?


___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio

On Aug 24, 2010, at 2:09 PM, Michael Jackson wrote:

I have some HDF5 version 1.6.9 code that I am in the process of  
updating. I have gotten everything to finally compile and I am now  
investigating some runtime issues.


With the following line of code:
hid_t  file_id =  
H5Fcreate( MXAUnitTest::H5LiteTest::FileName.c_str(), H5F_ACC_TRUNC,  
H5P_DEFAULT, H5P_DEFAULT );
 where MXAUnitTest::H5LiteTest::FileName is a const std::string = (/ 
tmp/out.h5).


I get the following error stack:
HDF5-DIAG: Error detected in HDF5 (1.8.5-snap5) thread 0:
 #000: /Users/mjackson/Workspace/hdf5-v18/src/H5F.c line 1371 in  
H5Fcreate(): library initialization failed

   major: Function entry/exit
   minor: Unable to initialize object
 #001: /Users/mjackson/Workspace/hdf5-v18/src/H5.c line 174 in  
H5_init_library(): unable to initialize datatype interface

   major: Function entry/exit
   minor: Unable to initialize object
 #002: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 524 in  
H5T_init(): interface initialization failed

   major: Function entry/exit
   minor: Unable to initialize object
 #003: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 1312 in  
H5T_init_interface(): unable to register conversion function(s)

   major: Datatype
   minor: Unable to initialize object
 #004: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #005: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #006: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9480 in  
H5T_conv_ldouble_ulong(): disagreement about datatype size

   major: Datatype
   minor: Unable to initialize object
 #007: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #008: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #009: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9413 in  
H5T_conv_double_ulong(): disagreement about datatype size

   major: Datatype
   minor: Unable to initialize object
 #010: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #011: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #012: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9349 in  
H5T_conv_float_ulong(): disagreement about datatype size

   major: Datatype
   minor: Unable to initialize object
 #013: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #014: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #015: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9446 in  
H5T_conv_ldouble_long(): disagreement about datatype size

   major: Datatype
   minor: Unable to initialize object
 #016: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #017: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #018: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9381 in  
H5T_conv_double_long(): disagreement about datatype size

   major: Datatype
   minor: Unable to initialize object
 #019: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 2297 in  
H5T_register(): unable to locate/allocate conversion path

   major: Datatype
   minor: Unable to initialize object
 #020: /Users/mjackson/Workspace/hdf5-v18/src/H5T.c line 4410 in  
H5T_path_find(): unable to initialize conversion function

   major: Datatype
   minor: Unable to initialize object
 #021: /Users/mjackson/Workspace/hdf5-v18/src/H5Tconv.c line 9317 in  
H5T_conv_float_long(): disagreement about datatype size

   major: Datatype

Re: [Hdf-forum] Which version of Mac OS X for HDFview?

2010-04-13 Thread Michael Jackson

try the following:

file /Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib

and report the output back to the list.
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Apr 13, 2010, at 9:44 AM, Peter Cao wrote:

It seems that you have 32-bit java on your system. You should use  
the 32-bit hdfview.


A 64-bit java will look like this:
$ java -version
java version 1.6.0_17
Java(TM) SE Runtime Environment (build 1.6.0_17-b04-248-10M3025)
Java HotSpot(TM) 64-Bit Server VM (build 14.3-b01-101, mixed mode)


George N. White III wrote:

Had an older version running, but I can't get hdf-java 2.6.1 release
(March 5, 2010) to run on
Mac OS 10.5.8.

$ uname -a
Darwin hostname 9.8.0 Darwin Kernel Version 9.8.0: Wed Jul 15
16:55:01 PDT 2009; root:xnu-1228.15.4~1/RELEASE_I386 i386

$ /usr/bin/java -fullversion
java full version 1.6.0_17-b04-248

The 64-bit version gave:

java.lang.UnsatisfiedLinkError:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib:  no suitable
image found.  Did find:
/Applications/thg/hdfview64/lib/macosx/libjhdf5.jnilib: unknown
required load command 0x8022
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: unknown  
required

load command 0x8022

I think this indicates that the jnilib was built for Snow Leopard


The 32-bit version gave:

hdfview
java.lang.UnsatisfiedLinkError:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib:  no suitable
image found.  Did find:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: mach-o, but
wrong architecture
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: mach-o, but
wrong architecture









Re: [Hdf-forum] Which version of Mac OS X for HDFview?

2010-04-13 Thread Michael Jackson
Ok, So you are using a 64 bit JVM with a 64 bit library so it _should_  
work. At least the archs match. Maybe the hdfview libraries were built  
on 10.6 and did not have the -mmacosx-version-min=10.5 set correctly?


Dunno.
--
Mike Jackson www.bluequartz.net

On Apr 13, 2010, at 10:21 AM, George N. White III wrote:


On Tue, Apr 13, 2010 at 11:18 AM, Michael Jackson
mike.jack...@bluequartz.net wrote:

try the following:

file /Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib


$ file /Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: Mach-O 64-bit
bundle x86_64



and report the output back to the list.
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Apr 13, 2010, at 9:44 AM, Peter Cao wrote:

It seems that you have 32-bit java on your system. You should use  
the

32-bit hdfview.

A 64-bit java will look like this:
$ java -version
java version 1.6.0_17
Java(TM) SE Runtime Environment (build 1.6.0_17-b04-248-10M3025)
Java HotSpot(TM) 64-Bit Server VM (build 14.3-b01-101, mixed mode)


George N. White III wrote:


Had an older version running, but I can't get hdf-java 2.6.1  
release

(March 5, 2010) to run on
Mac OS 10.5.8.

$ uname -a
Darwin hostname 9.8.0 Darwin Kernel Version 9.8.0: Wed Jul 15
16:55:01 PDT 2009; root:xnu-1228.15.4~1/RELEASE_I386 i386

$ /usr/bin/java -fullversion
java full version 1.6.0_17-b04-248

The 64-bit version gave:

java.lang.UnsatisfiedLinkError:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib:  no suitable
image found.  Did find:
/Applications/thg/hdfview64/lib/macosx/libjhdf5.jnilib: unknown
required load command 0x8022
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: unknown  
required

load command 0x8022

I think this indicates that the jnilib was build for Snow Leopard


The 32-bit version gave:

hdfview
java.lang.UnsatisfiedLinkError:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib:  no suitable
image found.  Did find:
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: mach-o, but
wrong architecture
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: mach-o, but
wrong architecture












--
George N. White III aa...@chebucto.ns.ca
Head of St. Margarets Bay, Nova Scotia






Re: [Hdf-forum] Which version of Mac OS X for HDFview?

2010-04-13 Thread Michael Jackson
Um. OS X 10.6 _IS_ 64-bit by default. Without supplying any arguments
to GCC, the default arch will be x86_64. So if you want this to be able
to run on OS X 10.5 you _may_ need to also add -arch i386 to the GCC
invocation.


--
Mike Jackson www.bluequartz.net

On Apr 13, 2010, at 11:27 AM, Peter Cao wrote:

We don't have 64-bit Mac OS 10.6 (Snow Leopard). We rebuilt hdf-java  
with the flag you gave.

Could you try it and let me know if it works for you?
ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/hdf-java/v2.6.1/hdfview/hdfview_install_macosx_intel64.zip

Thanks
--pc

Michael Jackson wrote:
Ok, so you are using a 64-bit JVM with a 64-bit library, so it
_should_ work. At least the archs match. Maybe the hdfview libraries
were built on 10.6 and did not have -mmacosx-version-min=10.5 set
correctly?


Dunno.
--
Mike Jackson www.bluequartz.net

On Apr 13, 2010, at 10:21 AM, George N. White III wrote:


On Tue, Apr 13, 2010 at 11:18 AM, Michael Jackson
mike.jack...@bluequartz.net wrote:

try the following:

file /Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib


$ file /Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib
/Applications/thg/hdfview/lib/macosx/libjhdf5.jnilib: Mach-O 64-bit
bundle x86_64



and report the output back to the list.
___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio




Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Michael Jackson
Did that URL and subsequent build work around the issues that you were  
having?

___
Mike Jackson  www.bluequartz.net

On Mar 16, 2010, at 5:03 PM, Dr. X wrote:


Hi Michael,
Would you please post the latest Git URL? Thanks a lot.
x

On 3/16/2010 4:23 PM, Michael Jackson wrote:
No they don't. I release Applications all the time with the C/C++  
Runtimes next to the executable. There is some build setting in  
visual studio that allows this.


http://www.nuonsoft.com/blog/2008/10/29/binding-to-the-most-recent-visual-studio-libraries/

If you read nothing but the LAST line of the blog the blogger  
explains why binding to multiple C/C++ runtimes is a Bad idea and  
simply will not work. Sounds like something like this is going on  
here.


The original poster may want to try building HDF5 with the CMake
version on Gitorious, although that code base is only HDF5 v1.8.4 and
does NOT have the patch-1 additions/corrections to the code.


___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Mar 16, 2010, at 3:54 PM, Lewandowski, George wrote:

This sounds like it might be a problem with side-by-side  
assemblies.  The proper C runtimes must be installed on the client  
machine, the loader will not look for them in the same directory  
as the executable. See this MSDN article:


http://msdn.microsoft.com/en-us/library/aa376307(VS.85).aspx

If this is the problem, it can be fixed by running  
vcredist_x86.exe to install the side-by-side assemblies on the  
target machine.  You have to get the correct redistributable for  
the version of VC used to build the program.  The redistributable  
for VS2005 SP1 is here:


http://www.microsoft.com/downloads/details.aspx?FamilyID=200b2fd9-ae1a-4a14-984d-389c36f85647displaylang=en

George Lewandowski
(314)777-7890
Mail Code S270-2204
Building 270-E Level 2E Room 20E
P-8A

-Original Message-
From: hdf-forum-boun...@hdfgroup.org [mailto:hdf-forum-boun...@hdfgroup.org 
] On Behalf Of Allen D Byrne

Sent: Tuesday, March 16, 2010 2:46 PM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

xunlei,

This is the second report of this. I've just not been able to locate
why this is happening. I haven't altered the project files other than
adding or deleting from the list of files. Any help in finding the
source of the problem would be greatly appreciated.


Allen


Hi,

I have been trying to use HDF5 1.8.4 patch1 on my Win7 64-bit machine.
I first tried the pre-compiled version from
http://www.hdfgroup.org/ftp/HDF5/current/bin/windows/hdf5-1.8.4-patch1-64bit-vs2005-ivf91.zip
When I tried to run any utility in bin\, I got a System Error message
box saying:
the program can't start because MSVCR80.dll is missing from your
computer. Try reinstalling the program to fix this problem.
I then copied msvcr80.dll, msvcp80.dll, msvcm80.dll into bin\ from a
working application, say MATLAB or an NVIDIA driver folder. I presume
these DLLs are compatible with Win7 64-bit since all these
applications work fine.
When I run, say, h5dump.exe, I got a Microsoft Visual C++ Runtime
Library box saying:
Runtime Error!
Program: C:\HDF5\bin\h5dump.exe
R6034
An application has made an attempt to load the C runtime library
incorrectly.
Please contact the application's support team for more information.
After clicking OK, I encountered yet another box, h5dump.exe -
Application Error, saying:
The application was unable to start correctly (0xc142). Click OK
to close the application.

I tried to recompile HDF5 from the source code using
http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.4-patch1.tar.gz
I used MSVC2008 to create the default x64 configuration files from the
included Win32 ones. Compilation went smoothly. The newly created
binary files behave exactly the same as your binary release.

Would you please help? Thanks a lot.

Best,
xunlei

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org




Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Michael Jackson
Great write-up. I'll just add my own experiences and some suggestions
to make life easier for us (the developers USING HDF5).
 I am mainly an OS X developer and do Windows development due to
customer constraints, so that is the perspective this comes from.
 RULE #1: Always build ALL your libraries with the SAME EXACT version  
of Visual Studio, down to the Service Pack AND Security Patch Level.

 RULE #2: NEVER break rule #1.

Following those rules will allow you to have a much easier time with  
debugging/running/compiling on windows. What do I mean by exact  
versions? Each library that you are going to compile into your  
application MUST be linked against the same version of the C/C++  
runtime libraries with respect to the following:

 Single/Multi Threaded
 Static/Dynamic Linked to the Runtimes
 Debug/Release Versions of each runtime.

Again, following the above will help the developer have a much easier  
time with Visual Studio Development.


If you get link warnings about adding NODEFAULTLIB then you are NOT  
following the above rules and nice random crashes and bugs WILL pop up  
in your program.


What can HDFGroup do to help? PLEASE PLEASE PLEASE list the exact
version of Visual Studio used to compile the binaries. This should be
something like Visual Studio 2008 SP1 (9.0.xxx.xxx SP1) or, if the
original Visual Studio 2008 was used, 9.0.xxx.xxx RTM. These values
can be found by going to Help > About Visual Studio. Also explicitly
list HOW they were compiled with respect to Single/Multi-Threaded,
Static/Dynamic runtime linkage, Dynamic/Static libraries, and
Debug/Release compiles.


 Having this information on the download page will help cut down on
help emails from confused users.


  Again, these are my experiences; yours may vary. Following those
rules helps keep things simple for Visual Studio development.

___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Mar 17, 2010, at 12:29 PM, Dana Robinson wrote:

I don't know if this solves people's linking problems but I thought  
I'd weigh in with what I know about Windows C runtime (CRT) linking.


From what I understand, it boils down to this:

Every library and executable in your application should use the same  
instance of the C runtime library.


The reason for this is that separate dlls have a separate internal  
state and thus different ideas about things like valid file handles  
and allocated memory.  For example, if you malloc() some memory in a  
library linked to msvcr80.dll and then try to realloc() or free() it  
in a program linked to msvcr90.dll, you will have problems since  
msvcr90 doesn't know about the first allocation's internal memory  
handles (writing to memory allocated using a different C runtime  
should work fine, though, since the underlying internal memory  
handles are not being manipulated).  Debug libraries (e.g.  
msvcr80d.dll) are also different in this respect since they are  
different dlls so using a library compiled in release mode with an  
executable compiled in debug mode can cause problems.  Keep in mind  
that many things will appear to work just fine when you mix C  
runtime libraries - it's just when you pass CRT resources around  
that you'll run into issues.  It's also important to remember that  
statically linking to a C runtime gives you a private version of  
that runtime that is NOT shared with any other components.  For this  
reason, it is recommended that you only statically link against a  
library when you are building an executable and NOT when you are  
building a library.


You can check your dependencies with dumpbin if you have Visual Studio
installed. Fire up a Visual Studio command prompt and use
"dumpbin /imports <path to library or program>" to see which dynamic
libraries a particular program or dll requires. The existing 32-bit
HDF binaries show a dependency on msvcr80.dll (Visual C++ 8.0 /  
Visual Studio 2005), the VS2005 version of zlib is also linked to  
msvcr80.dll and szip does not require any C library functions (it's  
either just linked to kernel32 and uses Windows functions or it  
incorrectly statically links to the CRT - I'll have to check that  
out).  The tools are an odd case - they are statically linked to  
HDF5 and have a private version of the C runtime that is separate  
from, say, msvcr80.dll but are also dynamically linked to zlib which  
is, in turn, linked to msvcr80.dll.  This means that zlib code used  
by the tools and the rest of the tool code will have different CRT  
states.


So this is (pretty much) all well and good as long as you are using  
Visual Studio 2005.  Problems can arise, however, when you try to  
use Visual Studio 2008 to build an application that links against  
VS2005 versions of HDF5.  VS2008 will link your application against  

Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Michael Jackson
One final note. All the git stuff _should_ be now or very soon in the  
official SVN repo of The HDFGroup. A few of us setup the gitorious  
site to collaborate easily on the Cmakeification of HDF5. Those  
changes will be pushed back to the official SVN repo real soon now.  
What I am getting at is that the gitorious site will go away when the  
SVN gets updated. I do NOT want to fracture the HDF5 source/community  
by having multiple places to download HDF5 source codes.


 Dr. X, I am glad everything worked for you. Alternatively you can
download an MSys+Git installer from Google Code
(http://code.google.com/p/msysgit/downloads/list), which is probably
an easier install for some due to NOT needing Cygwin.


I actually have a ZLib that builds with CMake on my own machine. Not  
sure if anyone is interested in that. Adding that would make HDF5  
completely configurable with CMake. Just a thought. Then again, that  
leads to maintenance nightmares ...

___
Mike Jackson  www.bluequartz.net

On Mar 17, 2010, at 3:06 PM, Dr. X wrote:


Hi All,
First of all, I appreciate Michael's prompt follow up and all the  
git info.

I have downloaded HDF5 git repository from
git clone git://gitorious.org/hdf5/hdf5-v18.git
szip from
git clone git://gitorious.org/hdf5/szip.git
and zlib from
http://zlib.net/zlib-1.2.4.tar.gz

If you don't know how to use git, please install Cygwin on your
Windows machine and start a terminal before you copy and paste the
above statements. Creating hosting directories for HDF5, SZIP, and
ZLIB is of course mandatory. Following Michael and Dana's notes, I
wanted to take no chances and recompiled szip and zlib from source as
well.


Build szip and zlib first:
Use cmake to build szip, and zlib-1.2.4\projects\visualc6\zlib.dsw to
build zlib. Visual Studio 2008 will automatically convert zlib.dsw to
*.sln. And do remember to create new x64 configurations if your
machine is 64-bit. Both packages (DLL Release and LIB Release) were
built fine (without any L4096 warning on /NODEFAULTLIB... :)).


Then use cmake to deal with hdf5. After locating all the right places
for the zlib and szip libraries and include directories, you are
golden. I built hdf5 with HDF5_ENABLE_SZIP_SUPPORT and
HDF5_ENABLE_Z_LIB_SUPPORT enabled for my own testing. I guess these
two flags allow the created hdf5 library to be useful without needing
additional szip and zlib libraries alongside it.


Note, zip_perf project will fail on Windows due to gettimeofday()  
and S_IRWXU dependencies.


At least all the hdf5 .exe are running fine. Best wishes to all  
Windows 64bit users. And thanks Michael and Dana.


Best,
xunlei



Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-17 Thread Michael Jackson

Please don't do that.

 Boost does this and absolutely wreaks havoc on things like CMake and  
other build tools that are relying on a library naming scheme NOT  
changing. In particular if you are going to switch up to CMake in the  
near future you are going to have to sync with CMake for releases so  
that FindHDF5.cmake can stay constantly updated with all the new  
flavors of Visual Studio, GCC, ICC, Absoft, Borland that come out.  
Read back through some of the  CMake archives and look for things like  
CMake can't find boost, but I have it installed. Messages like that  
hit about once a month because Boost seems to want to change their  
naming scheme at their leisure. Now look at Qt from Nokia. CMake has  
FindQt4.cmake which doesn't get that many updates and seems to always  
work because the library naming is consistent from release to release  
and across platforms.


  So in light of this I would make it painfully obvious on your
download site how your binaries were built. Eclipse has a
click-through page for OS X downloads that has a short blurb
regarding the needed things for Eclipse on 64-bit OS X. I suggest the
HDFGroup do something like that also. Heck, you could have CMake
configure a Compiler_Used.txt file for each build with a description
of how the binaries were built.


 Thanks for listening.
___
Mike Jackson  www.bluequartz.net

On Mar 17, 2010, at 3:51 PM, Dana Robinson wrote:
-8---
In particular, I'd like for our dlls to be named zlib90.dll,  
hdf590d.dll and so forth, which I think might make incorrect  
linking more apparent.

-8---

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] HDF5-1.8.4 patch1 on Windows7 64bit

2010-03-16 Thread Michael Jackson
No they don't. I release Applications all the time with the C/C++  
Runtimes next to the executable. There is some build setting in visual  
studio that allows this.


http://www.nuonsoft.com/blog/2008/10/29/binding-to-the-most-recent-visual-studio-libraries/

If you read nothing but the LAST line of the blog the blogger explains  
why binding to multiple C/C++ runtimes is a Bad idea and simply will  
not work. Sounds like something like this is going on here.


The original poster may want to try build HDF5 with the CMake  
version on Gitorious although that code base is only HDF5 V1.8.4 and  
does NOT have the patch-1 additions/corrections to the codes.


___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio




Re: [Hdf-forum] libhdftool as dll

2010-03-10 Thread Michael Jackson
Not sure what you have as far as hardware for testing goes, but if you  
use the Git repository at www.gitorious.com/hdf5 then I can test any  
changes on OS X intel and Windows 7 x64. Just FYI.


Basically sign up for an account and clone the repository. We can then  
pull from your clone to test out changes.

___
Mike Jackson  www.bluequartz.net
Principal Software Engineer   mike.jack...@bluequartz.net
BlueQuartz Software   Dayton, Ohio


On Mar 10, 2010, at 2:54 PM, Peter Cao wrote:


Werner,

You made a very good suggestion. It is actually in our long-term plan
to make a set of public tools functions and also to make our current
tools more modular. Because of other priorities, we do not have many
resources to commit to the work. However, we are working toward this
goal.

As for making some of the current functions public and building them
as a dll, you are welcome to try. If there are no other issues
(technical and non-technical), we will be happy to include your
contribution.

Thanks
--pc


Werner Benger wrote:
This might be a bit too internal for the forum, but maybe still of
interest to more people:

The HDF5 tools (h5dump, h5ls, ...) share common code via the tools  
library.
This is currently built as a static library and linked into the  
main executables.


It appears it would not be a big effort to modify it to be a shared
library/DLL instead. There are two symbols that the library references
as external and that are supposed to be defined in the main
application. These would need to go into the library itself instead
to be able to build it as a shared library. Not a big deal.

If we can have the tools library available through header files and a
DLL for user code as well, it has two advantages: first, the
executables are a bit smaller (which is not really important these
days); secondly, one could more easily write user-modified versions of
h5ls/h5dump etc., for instance versions that use another VFD, or that
output data differently on dump, etc.


As a DLL under Windows, the only addition required in the code would
be some symbol export definitions. I'd volunteer to contribute such.

Is there interest in such a patch?

   Werner




___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org



___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org