1) What is wrong with more DAs?

Nothing; I just never needed more than one DA.
2) Get the coordinate vector for one, copy it, and then shift it to get the others

Well, I played around a bit and realized that the coordinate vector returned by DMDAGetCoordinates() is actually of size Nx*Ny*Nz*3 (for a 3-D DA). This will double the size of my output file. What I need is just to add three 1-D arrays (x[Nx], y[Ny], and z[Nz]) holding the grid coordinates to the end of the *.h5 file; later on I can use any visualization software to load the data with those coordinates. I am using an orthogonal grid, which is why I don't need the full (x,y,z) coordinates of every cell.

Thanks again,
Mohamad

On Thu, Jan 19, 2012 at 3:30 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Jan 19, 2012 at 5:26 PM, Mohamad M. Nasr-Azadani <mmnasr at gmail.com> wrote:
>
>> Sorry for the confusion.
>> I use only one DA parallel layout for my problem, even though I use a
>> MAC-staggered grid (I added one extra cell to the end of the domain in
>> each direction so that I overcome the difficulty of the one extra cell
>> associated with each velocity component in the corresponding direction,
>> i.e. one extra u-point in the x-direction, one extra v-point in the
>> y-direction).
>> So the vectors (global and local) are derived from the same DA, but they
>> do not refer to the same physical locations.
>> I could fix this if I created a DA for each velocity component and the
>> scalar and set the coordinates for each of them separately, but I would
>> rather not do that at this point.
>
> 1) What is wrong with more DAs?
>
> 2) Get the coordinate vector for one, copy it, and then shift it to get
> the others
>
>    Matt
>
>> I hope I was clear.
>> Best,
>> Mohamad
>>
>> On Thu, Jan 19, 2012 at 3:20 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Thu, Jan 19, 2012 at 5:19 PM, Mohamad M. Nasr-Azadani <mmnasr at gmail.com> wrote:
>>>
>>>> Thanks Matt,
>>>>
>>>> "Use the DA coordinate mechanism and you can get the coordinates as a
>>>> parallel Vec."
>>>>
>>>> Well, that won't work for me: although I use one DA and the parallel
>>>> vectors are derived from that same DA, I am using a staggered-grid
>>>> formulation, so the coordinates can differ between vectors.
>>>> Is there any other way around this?
>>>
>>> I do not understand what you mean, be more specific.
>>>
>>>    Matt
>>>
>>>> On Thu, Jan 19, 2012 at 6:59 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>
>>>>> On Thu, Jan 19, 2012 at 2:50 AM, Mohamad M. Nasr-Azadani <mmnasr at gmail.com> wrote:
>>>>>
>>>>>> Hi guys,
>>>>>>
>>>>>> I have compiled PETSc with the HDF5 package.
>>>>>>
>>>>>> I would like to store the data from a parallel vector (obtained from a
>>>>>> structured DA in 3 dimensions) to a file using VecView() in conjunction
>>>>>> with PetscViewerHDF5Open().
>>>>>>
>>>>>> I followed the example here
>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/src/dm/examples/tutorials/ex10.c.html
>>>>>> and everything looks fine.
>>>>>>
>>>>>> However, I had a couple of questions:
>>>>>>
>>>>>> 1- When I am done writing the parallel vector obtained from the DA
>>>>>> (and PETSC_COMM_WORLD),
>>>>>>
>>>>>> // Create the HDF5 viewer
>>>>>> PetscViewerHDF5Open(PETSC_COMM_WORLD,"gauss.h5",FILE_MODE_WRITE,&H5viewer);
>>>>>> // Write the H5 file
>>>>>> VecView(gauss,H5viewer);
>>>>>> // Cleaning stage
>>>>>> PetscViewerDestroy(&H5viewer);
>>>>>>
>>>>>> how can I add data that are just simple 1-D arrays stored locally?
>>>>>> Put differently, I would like to add the structured grid coordinates
>>>>>> (first all x's, then all y's, and then all z's) at the end (or the
>>>>>> beginning) of each data (*.h5) file. But the grid coordinates are
>>>>>> stored locally on each machine and are not derived from any parallel
>>>>>> vector or DA. I was thinking about creating vectors and viewers using
>>>>>> PETSC_COMM_SELF, but I am not sure that is the right approach since
>>>>>> such a vector would be created on every processor locally.
>>>>>
>>>>> Use the DA coordinate mechanism and you can get the coordinates as a
>>>>> parallel Vec.
>>>>>
>>>>>> 2- When using VecView() and the HDF5 viewer, what is the status of
>>>>>> data compression?
>>>>>> The reason I am asking is that, using the same example above, I
>>>>>> compared two files saved via two different PetscViewers, the plain
>>>>>> binary viewer and the HDF5 viewer, and the size is not reduced in the
>>>>>> *.h5 case. In fact, it is slightly bigger than the pure binary file!
>>>>>> Is there any command we have to set in PETSc to tell the HDF5 viewer
>>>>>> to use data compression?
>>>>>
>>>>> We do not support it. We are happy to take patches that enable this.
>>>>>
>>>>>    Thanks,
>>>>>
>>>>>      Matt
>>>>>
>>>>>> Thanks for your patience,
>>>>>> Best,
>>>>>> Mohamad
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which
>>>>> their experiments lead.
>>>>> -- Norbert Wiener
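
For reference, below is a minimal, untested sketch of one way to do what is asked above: keep using the same HDF5 viewer after VecView(gauss, H5viewer) and write three small additional datasets holding the 1-D axis coordinates. The helper name AppendAxis, the dataset names "x_coords"/"y_coords"/"z_coords", and the host arrays xc/yc/zc containing the orthogonal grid coordinates are illustrative assumptions, not anything defined by PETSc or by this thread.

/* Sketch (untested): append one 1-D coordinate array to an open HDF5 viewer.
   Assumes rank 0 holds the full array vals[0..N-1] of axis coordinates. */
static PetscErrorCode AppendAxis(PetscViewer viewer, const char *name,
                                 const PetscScalar *vals, PetscInt N)
{
  Vec            axis;
  PetscMPIInt    rank;
  PetscInt       i;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* A parallel Vec of global length N; PETSc picks the local distribution. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &axis);CHKERRQ(ierr);
  /* The object name becomes the name of the dataset inside the .h5 file. */
  ierr = PetscObjectSetName((PetscObject)axis, name);CHKERRQ(ierr);
  if (!rank) {
    for (i = 0; i < N; i++) {
      ierr = VecSetValue(axis, i, vals[i], INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = VecAssemblyBegin(axis);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(axis);CHKERRQ(ierr);
  ierr = VecView(axis, viewer);CHKERRQ(ierr);  /* collective HDF5 write */
  ierr = VecDestroy(&axis);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Usage, after VecView(gauss, H5viewer) and before PetscViewerDestroy(): */
ierr = AppendAxis(H5viewer, "x_coords", xc, Nx);CHKERRQ(ierr);
ierr = AppendAxis(H5viewer, "y_coords", yc, Ny);CHKERRQ(ierr);
ierr = AppendAxis(H5viewer, "z_coords", zc, Nz);CHKERRQ(ierr);

Because the axis vectors and the viewer share PETSC_COMM_WORLD, every process takes part in the collective write even though only rank 0 supplies values; this sidesteps the PETSC_COMM_SELF concern raised in the original question, where each process would end up writing its own copy of the coordinates.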
