Hi Roland, indeed, renaming the files my Python script creates to foo.file_<n>.h5 solved the issue, and I can now visualize the data correctly in VisIt.
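
In case it is useful to others, this is roughly what the renaming step looks like (a minimal sketch only: the input pattern rho.xyz_extracted_*.h5 and the base name rho.xyz are made-up stand-ins for what my actual script produces; the foo.file_<n>.h5 target pattern is the one Roland mentions below):

--8<--
#!/usr/bin/python

# Sketch: rename extracted per-component files to the foo.file_<n>.h5
# pattern expected by VisIt's CarpetHDF5 reader.
# The input glob "rho.xyz_extracted_*.h5" is a hypothetical stand-in.
import glob
import os
import re

for path in sorted(glob.glob("rho.xyz_extracted_*.h5")):
    n = re.search(r"_(\d+)\.h5$", path).group(1)  # component number <n>
    os.rename(path, "rho.xyz.file_%s.h5" % n)
--8<--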
Thanks a lot again,
Lorenzo

On Mon, 27 Jun 2022 at 17:15, Roland Haas <[email protected]> wrote:

> Hello Lorenzo,
>
> Python can be tricky. nioprocs must be a 32bit integer and Python's int
> type by default is 64bit and it tends to re-create the attribute rather
> than change it. So depending on how you copied the attributes group
> this may affect you.
>
> My own nioprocs.py script thus looks like this (note the int32 cast):
>
> --8<--
> #!/usr/bin/python
>
> import h5py
> import sys
> import numpy
>
> cfh = h5py.File(sys.argv[1], "r+")
> cfh.require_group("Parameters and Global Attributes")
> cfh["Parameters and Global Attributes"].attrs["nioprocs"] = numpy.int32(1)
> --8<--
>
> Or it could be something with the file names not quite matching the form
> the VisIt reader expects (must be either foo.file_X.h5 or foo.h5 if
> there is a single ioproc).
>
> Anyway, glad that this worked for you.
>
> Yours,
> Roland
>
> > Hi Roland,
> > thank you very much for your answer, I managed to do it using hdf5_slicer.
> > I'm puzzled though, because in my Python script I copied each dataset into
> > a separate file, so I was expecting nioprocs = 160 to be correct (there
> > are 160 files). Anyway, hdf5_slicer worked perfectly.
> >
> > Thank you again,
> > Lorenzo
> >
> > On Mon, 27 Jun 2022 at 15:20, Roland Haas <[email protected]> wrote:
> >
> > > Hello Lorenzo,
> > >
> > > There's a metadata attribute, nioprocs or so, that is the number of
> > > files. If you copied all datasets into a single file you must set it to
> > > 1. You can use a tiny Python script to do so, or (I think) hdf5_slicer
> > > --out3d_box, or (I think) hdf5_merge with its --ioprocs option.
> > >
> > > In fact you can use hdf5_slicer to do the extraction for you. There are
> > > some details about this in the response to an earlier question by Maria
> > > about merging files, I think.
> > >
> > > Yours,
> > > Roland
> > >
> > > ----- Original Message -----
> > > From: Lorenzo Ennoggi <[email protected]>
> > > Sent: 2022-06-27 - 07:01
> > > To: Einstein Toolkit Users <[email protected]>
> > > Subject: [Users] Extracting the last iteration from 3D CarpetIOHDF5 output files
> > >
> > > > Hello,
> > > > I have a set of 3D output files from CarpetIOHDF5 (rho.xyz_file<n>.h5),
> > > > each containing many iterations, and I want to create a set of new
> > > > CarpetIOHDF5 files containing only the data for the last iteration. I
> > > > have tried a few things, but I couldn't get anything to work, so I am
> > > > asking for help here.
> > > >
> > > > For each file rho.xyz_file<n>.h5, I used the hdf5_extract utility to
> > > > create a new file containing the dataset for the last iteration
> > > > ("HYDROBASE::rho it=1067600 tl=0 rl=0 c=<n>") and the group "Parameters
> > > > and Global Attributes". However, when I try to open the set of new
> > > > files with VisIt I get the following error:
> > > >
> > > >> VisIt could not read from the file
> > > >> "/home/lorenzo/CBD_prod_WZ9_400_140_280_output-0014/Output/rho.xyz_it1067600_tl0_rl0_c* database".
> > > >>
> > > >> The generated error message was:
> > > >>
> > > >> There was an error opening
> > > >> /home/lorenzo/CBD_prod_WZ9_400_140_280_output-0014/Output/rho.xyz_it1067600_tl0_rl0_c* database.
> > > >> It may be an invalid file. VisIt tried using the following file format
> > > >> readers to open the file: CarpetHDF5
> > > >>
> > > >> The following error(s) may be helpful in identifying the problem:
> > > >> *Tried to access an invalid index 1 (Maximum = 0).*
> > > >
> > > > I thought that maybe some metadata in the new files are telling VisIt
> > > > that there are multiple iterations available, which is not true; I'm
> > > > not completely sure this is really the issue, though.
> > > >
> > > > In the new files, the attributes "GH$iteration" and "carpet_global_time"
> > > > from the group "Parameters and Global Attributes" are still set,
> > > > respectively, to the first iteration (992800) and the first time
> > > > (148920) available in the original files, while I am extracting the
> > > > last iteration (1067600, time 160140). Also, in the new files, the
> > > > dataset "Grid Structure v5" (a string) still contains
> > > > "grid_times:[[[148920,148919.85000000001,148919.69999999998]]]", which
> > > > also looks wrong. Therefore, I used h5py to generate new files in which
> > > > I fixed "GH$iteration", "carpet_global_time" and "Grid Structure v5",
> > > > but I still get the same error from VisIt.
> > > >
> > > > I have also thought about re-running the simulation that produced the
> > > > original files from the last checkpoint and dumping just the last
> > > > iteration, but that simulation was run some time ago and the code I am
> > > > using has evolved quite a bit in the meantime, to the point that some
> > > > parameters are not even defined anymore.
> > > >
> > > > Is there any other attribute/dataset I should edit in order to fix the
> > > > new files? Are there smarter ways to extract an iteration from a set of
> > > > 3D CarpetHDF5 output files? Please let me know if you need additional
> > > > information from my end.
> > > >
> > > > Thank you very much,
> > > > Lorenzo Ennoggi
> > > > _______________________________________________
> > > > Users mailing list
> > > > [email protected]
> > > > http://lists.einsteintoolkit.org/mailman/listinfo/users
>
> --
> My email is as private as my paper mail. I therefore support encrypting
> and signing email messages. Get my PGP key from http://pgp.mit.edu .
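
For completeness, and putting together what was discussed above, this is roughly the kind of per-file extraction one can also do directly in h5py instead of hdf5_extract (a minimal sketch only: the file names in the usage comment are hypothetical, the iteration number is the one from this thread, and the nioprocs/file-naming fixes discussed above still apply):

--8<--
#!/usr/bin/python

# Sketch: copy the "Parameters and Global Attributes" group plus all
# top-level datasets of one iteration from an existing CarpetIOHDF5 file
# into a new file. Usage (hypothetical names):
#   ./extract_iteration.py rho.xyz.file_3.h5 new.file_3.h5 1067600

import sys
import h5py

src_name, dst_name, iteration = sys.argv[1], sys.argv[2], sys.argv[3]

with h5py.File(src_name, "r") as src, h5py.File(dst_name, "w") as dst:
    # copy the metadata group verbatim
    src.copy("Parameters and Global Attributes", dst)
    # copy every top-level dataset whose name contains the requested
    # iteration, e.g. "HYDROBASE::rho it=1067600 tl=0 rl=0 c=3"
    for name in src:
        if (" it=%s " % iteration) in name:
            src.copy(name, dst)
--8<--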
