Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Kong, Fande
On Thu, Sep 14, 2017 at 11:26 AM, Matthew Knepley  wrote:

> On Thu, Sep 14, 2017 at 1:07 PM, Kong, Fande  wrote:
>
>>
>>
>> On Thu, Sep 14, 2017 at 10:35 AM, Barry Smith  wrote:
>>
>>>
>>> > On Sep 14, 2017, at 11:10 AM, Kong, Fande  wrote:
>>> >
>>> >
>>> >
>>> > On Thu, Sep 14, 2017 at 9:47 AM, Matthew Knepley 
>>> wrote:
>>> > On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes <
>>> adrimacor...@gmail.com> wrote:
>>> > Dear Matthew,
>>> >
>>> > Thank you for your reply. It worked, but this prompts another
>>> question. So why does PetscViewer not write both files (.h5 and .xmf)
>>> directly, instead of having to post-process the .h5 file (in serial)?
>>> >
>>> > 1) Maintenance: Changing the Python is much easier than changing the C
>>> you would add to generate it
>>> >
>>> > 2) Performance: On big parallel systems, writing files is expensive, so
>>> I wanted to minimize what I had to do.
>>> >
>>> > 3) Robustness: Moving 1 file around is much easier than remembering 2.
>>> I just always regenerate the xdmf when needed.
>>> >
>>> > And what about big 3D simulations? Does PETSc always serialize the output
>>> of the distributed DMPlex? Is there a way to output one .h5 per mesh
>>> partition?
>>> >
>>> > Given the way I/O is structured on big machines, we believe the
>>> multiple file route is a huge mistake. Also, all our measurements
>>> > say that sending some data on the network is not noticeable given the
>>> disk access costs.
>>> >
>>> > I have had a slightly different experience here. We tried the serial output:
>>> it is really slow for large-scale problems, and the first processor often
>>> runs out of memory from gathering all the data from the other processor cores.
>>>
>>>   Where in PETSc is this?  What type of viewer? Is there an example that
>>> reproduces the problem? Even when we do not use MPI I/O in PETSc, we attempt
>>> not to "put the entire object on the first process", so memory should not be
>>> an issue. For example, VecView() should scale in memory both with and without
>>> MPI I/O.
>>>
>>
>> We manually gather all data to the first processor core, and write it as
>> a single vtk file.
>>
>
> Of course I am not doing that. I reduce everything to an ISView or a
> VecView call. That way it uses MPI I/O if it's turned on.
>

I meant that Fande manually gathers all data to the first processor core in his
in-house code.


>
>Matt
>
>
>>
>>>
>>> > The parallel I/O runs smoothly and much faster than I expected. We have
>>> done experiments with tens of thousands of cores for a problem with 1 billion
>>> unknowns.
>>>
>>> Is this your own canned IO or something in PETSc?
>>>
>>
>> We implemented the writer based on ISView and VecView with the HDF5 viewer
>> in PETSc to output all data as a single HDF5 file. ISView and VecView do the
>> magic for me.
>>
>>
>>
>>>
>>> > I did not see any concern so far.
>>>
>>>Ten thousand files is possibly manageable but I question 2 million.
>>>
>>
>> Just one single HDF5 file.
>>
>> Fande,
>>
>>
>>>
>>> >
>>> >
>>> > Fande,
>>> >
>>> >
>>> >   Thanks,
>>> >
>>> > Matt
>>> >
>>> > Best regards,
>>> > Adriano.
>>> >
>>> >
>>> > 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
>>> > On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes <
>>> adrimacor...@gmail.com> wrote:
>>> > Dear all,
>>> >
>>> > I am running the SNES ex12 example and I'm passing the options -dm_view
>>> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
>>> .h5 file is generated, but I'm not able to load it in Paraview
>>> (5.4.0-64bits). Paraview recognizes the file and offers several options to
>>> read it; here is the complete list:
>>> >
>>> > Chombo Files
>>> > GTC Files
>>> > M3DC1 Files
>>> > Multilevel 3D Plasma Files
>>> > PFLOTRAN Files
>>> > Pixie Files
>>> > Tetrad Files
>>> > UNIC Files
>>> > VizSchema Files
>>> >
>>> > The problem is none of the options above work :(
>>> > I'm using the configure option '-download-hdf5' and it installs hdf5
>>> version 1.8.18
>>> > Any hint of how to fix it and have the visualization working?
>>> >
>>> > Yes, Paraview does not directly read HDF5. It needs you to tell it
>>> what the data in the HDF5 file means. You do
>>> > this by creating a *.xdmf file, which is XML. We provide a tool
>>> >
>>> >   $PETSC_DIR/bin/petsc_gen_xdmf.py 
>>> >
>>> > which should automatically produce this file for you. Let us know if
>>> it does not work.
>>> >
>>> >   Thanks,
>>> >
>>> > Matt
>>> >
>>> >
>>> > Best regards,
>>> > Adriano.
>>> >
>>> > --
>>> > Adriano Côrtes
>>> > =
>>> > Campus Duque de Caxias and
>>> > High-performance Computing Center (NACAD/COPPE)
>>> > Federal University of Rio de Janeiro (UFRJ)
>>> >
>>> >
>>> >
>>> > --
>>> > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> 

Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Matthew Knepley
On Thu, Sep 14, 2017 at 1:07 PM, Kong, Fande  wrote:

>
>
> On Thu, Sep 14, 2017 at 10:35 AM, Barry Smith  wrote:
>
>>
>> > On Sep 14, 2017, at 11:10 AM, Kong, Fande  wrote:
>> >
>> >
>> >
>> > On Thu, Sep 14, 2017 at 9:47 AM, Matthew Knepley 
>> wrote:
>> > On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes <
>> adrimacor...@gmail.com> wrote:
>> > Dear Matthew,
>> >
>> > Thank you for your reply. It worked, but this prompts another
>> question. So why does PetscViewer not write both files (.h5 and .xmf)
>> directly, instead of having to post-process the .h5 file (in serial)?
>> >
>> > 1) Maintenance: Changing the Python is much easier than changing the C
>> you would add to generate it
>> >
>> > 2) Performance: On big parallel systems, writing files is expensive, so I
>> wanted to minimize what I had to do.
>> >
>> > 3) Robustness: Moving 1 file around is much easier than remembering 2.
>> I just always regenerate the xdmf when needed.
>> >
>> > And what about big 3D simulations? Does PETSc always serialize the output of
>> the distributed DMPlex? Is there a way to output one .h5 per mesh partition?
>> >
>> > Given the way I/O is structured on big machines, we believe the
>> multiple file route is a huge mistake. Also, all our measurements
>> > say that sending some data on the network is not noticeable given the
>> disk access costs.
>> >
>> > I have had a slightly different experience here. We tried the serial output:
>> it is really slow for large-scale problems, and the first processor often
>> runs out of memory from gathering all the data from the other processor cores.
>>
>>   Where in PETSc is this?  What type of viewer? Is there an example that
>> reproduces the problem? Even when we do not use MPI I/O in PETSc, we attempt
>> not to "put the entire object on the first process", so memory should not be
>> an issue. For example, VecView() should scale in memory both with and without
>> MPI I/O.
>>
>
> We manually gather all data to the first processor core, and write it as a
> single vtk file.
>

Of course I am not doing that. I reduce everything to an ISView or a
VecView call. That way it uses MPI I/O if it's turned on.

   Matt


>
>>
>> > The parallel I/O runs smoothly and much faster than I expected. We have
>> done experiments with tens of thousands of cores for a problem with 1 billion
>> unknowns.
>>
>> Is this your own canned IO or something in PETSc?
>>
>
> We implemented the writer based on ISView and VecView with the HDF5 viewer
> in PETSc to output all data as a single HDF5 file. ISView and VecView do the
> magic for me.
>
>
>
>>
>> > I did not see any concern so far.
>>
>>Ten thousand files is possibly manageable but I question 2 million.
>>
>
> Just one single HDF5 file.
>
> Fande,
>
>
>>
>> >
>> >
>> > Fande,
>> >
>> >
>> >   Thanks,
>> >
>> > Matt
>> >
>> > Best regards,
>> > Adriano.
>> >
>> >
>> > 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
>> > On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes <
>> adrimacor...@gmail.com> wrote:
>> > Dear all,
>> >
>> > I am running the SNES ex12 example and I'm passing the options -dm_view
>> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
>> .h5 file is generated, but I'm not able to load it in Paraview
>> (5.4.0-64bits). Paraview recognizes the file and offers several options to
>> read it; here is the complete list:
>> >
>> > Chombo Files
>> > GTC Files
>> > M3DC1 Files
>> > Multilevel 3D Plasma Files
>> > PFLOTRAN Files
>> > Pixie Files
>> > Tetrad Files
>> > UNIC Files
>> > VizSchema Files
>> >
>> > The problem is none of the options above work :(
>> > I'm using the configure option '-download-hdf5' and it installs hdf5
>> version 1.8.18
>> > Any hint of how to fix it and have the visualization working?
>> >
>> > Yes, Paraview does not directly read HDF5. It needs you to tell it what
>> the data in the HDF5 file means. You do
>> > this by creating a *.xdmf file, which is XML. We provide a tool
>> >
>> >   $PETSC_DIR/bin/petsc_gen_xdmf.py 
>> >
>> > which should automatically produce this file for you. Let us know if it
>> does not work.
>> >
>> >   Thanks,
>> >
>> > Matt
>> >
>> >
>> > Best regards,
>> > Adriano.
>> >
>> > --
>> > Adriano Côrtes
>> > =
>> > Campus Duque de Caxias and
>> > High-performance Computing Center (NACAD/COPPE)
>> > Federal University of Rio de Janeiro (UFRJ)
>> >
>> >
>> >
>> > --
>> > What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> > -- Norbert Wiener
>> >
>> > http://www.caam.rice.edu/~mk51/
>> >
>> >
>> 

Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Kong, Fande
On Thu, Sep 14, 2017 at 10:35 AM, Barry Smith  wrote:

>
> > On Sep 14, 2017, at 11:10 AM, Kong, Fande  wrote:
> >
> >
> >
> > On Thu, Sep 14, 2017 at 9:47 AM, Matthew Knepley 
> wrote:
> > On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes 
> wrote:
> > Dear Matthew,
> >
> > Thank you for your reply. It worked, but this prompts another question.
> So why does PetscViewer not write both files (.h5 and .xmf) directly,
> instead of having to post-process the .h5 file (in serial)?
> >
> > 1) Maintenance: Changing the Python is much easier than changing the C
> you would add to generate it
> >
> > 2) Performance: On big parallel systems, writing files is expensive, so I
> wanted to minimize what I had to do.
> >
> > 3) Robustness: Moving 1 file around is much easier than remembering 2. I
> just always regenerate the xdmf when needed.
> >
> > And what about big 3D simulations? Does PETSc always serialize the output of
> the distributed DMPlex? Is there a way to output one .h5 per mesh partition?
> >
> > Given the way I/O is structured on big machines, we believe the multiple
> file route is a huge mistake. Also, all our measurements
> > say that sending some data on the network is not noticeable given the
> disk access costs.
> >
> > I have had a slightly different experience here. We tried the serial output:
> it is really slow for large-scale problems, and the first processor often
> runs out of memory from gathering all the data from the other processor cores.
>
>   Where in PETSc is this?  What type of viewer? Is there an example that
> reproduces the problem? Even when we do not use MPI I/O in PETSc, we attempt
> not to "put the entire object on the first process", so memory should not be
> an issue. For example, VecView() should scale in memory both with and without
> MPI I/O.
>

We manually gather all data to the first processor core, and write it as a
single vtk file.


>
>
> > The parallel I/O runs smoothly and much faster than I expected. We have
> done experiments with tens of thousands of cores for a problem with 1 billion
> unknowns.
>
> Is this your own canned IO or something in PETSc?
>

We implemented the writer based on ISView and VecView with the HDF5 viewer
in PETSc to output all data as a single HDF5 file. ISView and VecView do the
magic for me.
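A minimal sketch of that pattern, with illustrative names and sizes rather than
anything from our in-house code, looks like this (it assumes a PETSc build with
HDF5 support, e.g. --download-hdf5): every rank opens one HDF5 viewer on the
common communicator and calls ISView/VecView collectively.

  /* Sketch: write an IS and a Vec collectively into a single HDF5 file.
     Every rank calls the same viewer routines; PETSc and HDF5 handle the
     parallel I/O underneath. */
  #include <petscvec.h>
  #include <petscis.h>
  #include <petscviewerhdf5.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    PetscViewer    viewer;
    Vec            u;
    IS             is;
    PetscInt       idx[2];
    PetscMPIInt    rank;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

    /* A small distributed vector standing in for the solution field */
    ierr = VecCreateMPI(PETSC_COMM_WORLD, 2, PETSC_DETERMINE, &u);CHKERRQ(ierr);
    ierr = VecSet(u, 1.0);CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)u, "solution");CHKERRQ(ierr);

    /* An index set standing in for connectivity/partition information;
       each rank owns two entries */
    idx[0] = 2*rank; idx[1] = 2*rank + 1;
    ierr = ISCreateGeneral(PETSC_COMM_WORLD, 2, idx, PETSC_COPY_VALUES, &is);CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)is, "cells");CHKERRQ(ierr);

    /* One HDF5 file for the whole communicator; datasets are named after
       the objects above */
    ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "out.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = ISView(is, viewer);CHKERRQ(ierr);
    ierr = VecView(u, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

    ierr = ISDestroy(&is);CHKERRQ(ierr);
    ierr = VecDestroy(&u);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }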



>
> > I did not see any concern so far.
>
>Ten thousand files is possibly manageable but I question 2 million.
>

Just one single HDF5 file.

Fande,


>
> >
> >
> > Fande,
> >
> >
> >   Thanks,
> >
> > Matt
> >
> > Best regards,
> > Adriano.
> >
> >
> > 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
> > On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes 
> wrote:
> > Dear all,
> >
> > I am running the SNES ex12 example and I'm passing the options -dm_view
> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
> .h5 file is generated, but I'm not able to load it in Paraview
> (5.4.0-64bits). Paraview recognizes the file and offers several options to
> read it; here is the complete list:
> >
> > Chombo Files
> > GTC Files
> > M3DC1 Files
> > Multilevel 3D Plasma Files
> > PFLOTRAN Files
> > Pixie Files
> > Tetrad Files
> > UNIC Files
> > VizSchema Files
> >
> > The problem is none of the options above work :(
> > I'm using the configure option '-download-hdf5' and it installs hdf5
> version 1.8.18
> > Any hint of how to fix it and have the visualization working?
> >
> > Yes, Paraview does not directly read HDF5. It needs you to tell it what
> the data in the HDF5 file means. You do
> > this by creating a *.xdmf file, which is XML. We provide a tool
> >
> >   $PETSC_DIR/bin/petsc_gen_xdmf.py 
> >
> > which should automatically produce this file for you. Let us know if it
> does not work.
> >
> >   Thanks,
> >
> > Matt
> >
> >
> > Best regards,
> > Adriano.
> >
> > --
> > Adriano Côrtes
> > =
> > Campus Duque de Caxias and
> > High-performance Computing Center (NACAD/COPPE)
> > Federal University of Rio de Janeiro (UFRJ)
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > -- Norbert Wiener
> >
> > http://www.caam.rice.edu/~mk51/
> >
> >
> >
> > --
> > Adriano Côrtes
> > =
> > Campus Duque de Caxias and
> > High-performance Computing Center (NACAD/COPPE)
> > Federal University of Rio de Janeiro (UFRJ)
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> 

Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Barry Smith

> On Sep 14, 2017, at 11:10 AM, Kong, Fande  wrote:
> 
> 
> 
> On Thu, Sep 14, 2017 at 9:47 AM, Matthew Knepley  wrote:
> On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes  
> wrote:
> Dear Matthew,
> 
> Thank you for your reply. It worked, but this prompts another question. So
> why does PetscViewer not write both files (.h5 and .xmf) directly, instead of
> having to post-process the .h5 file (in serial)?
> 
> 1) Maintenance: Changing the Python is much easier than changing the C you 
> would add to generate it
> 
> 2) Performance: On big parallel systems, writing files is expensive, so I
> wanted to minimize what I had to do.
> 
> 3) Robustness: Moving 1 file around is much easier than remembering 2. I just 
> always regenerate the xdmf when needed.
>  
> And what about big 3D simulations? Does PETSc always serialize the output of the
> distributed DMPlex? Is there a way to output one .h5 per mesh partition?
> 
> Given the way I/O is structured on big machines, we believe the multiple file 
> route is a huge mistake. Also, all our measurements
> say that sending some data on the network is not noticeable given the disk 
> access costs.
> 
> I have had a slightly different experience here. We tried the serial output: it is
> really slow for large-scale problems, and the first processor often runs out
> of memory from gathering all the data from the other processor cores.

  Where in PETSc is this?  What type of viewer? Is there an example that
reproduces the problem? Even when we do not use MPI I/O in PETSc, we attempt
not to "put the entire object on the first process", so memory should not be an
issue. For example, VecView() should scale in memory both with and without MPI I/O.


> The parallel I/O runs smoothly and much faster than I expected. We have done
> experiments with tens of thousands of cores for a problem with 1 billion
> unknowns.

Is this your own canned IO or something in PETSc?

> I did not see any concern so far. 

   Ten thousand files is possibly manageable but I question 2 million.

> 
> 
> Fande,
>  
> 
>   Thanks,
> 
> Matt
>  
> Best regards,
> Adriano.
> 
> 
> 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
> On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes  
> wrote:
> Dear all,
> 
> I am running the SNES ex12 example and I'm passing the options -dm_view hdf5:sol.h5
> -vec_view hdf5:sol.h5::append to generate an output file. The .h5 file is
> generated, but I'm not able to load it in Paraview (5.4.0-64bits).
> Paraview recognizes the file and offers several options to read it; here is
> the complete list:
> 
> Chombo Files
> GTC Files
> M3DC1 Files
> Multilevel 3D Plasma Files
> PFLOTRAN Files
> Pixie Files
> Tetrad Files
> UNIC Files
> VizSchema Files
> 
> The problem is none of the options above work :(
> I'm using the configure option '-download-hdf5' and it installs hdf5 version 
> 1.8.18
> Any hint of how to fix it and have the visualization working?
> 
> Yes, Paraview does not directly read HDF5. It needs you to tell it what the 
> data in the HDF5 file means. You do
> this by creating a *.xdmf file, which is XML. We provide a tool
> 
>   $PETSC_DIR/bin/petsc_gen_xdmf.py 
> 
> which should automatically produce this file for you. Let us know if it does 
> not work.
> 
>   Thanks,
> 
> Matt
>  
> 
> Best regards,
> Adriano.
> 
> -- 
> Adriano Côrtes
> =
> Campus Duque de Caxias and
> High-performance Computing Center (NACAD/COPPE)
> Federal University of Rio de Janeiro (UFRJ)
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> http://www.caam.rice.edu/~mk51/
> 
> 
> 
> -- 
> Adriano Côrtes
> =
> Campus Duque de Caxias and
> High-performance Computing Center (NACAD/COPPE)
> Federal University of Rio de Janeiro (UFRJ)
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> http://www.caam.rice.edu/~mk51/
> 



Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Jed Brown
"Kong, Fande"  writes:
>> Given the way I/O is structured on big machines, we believe the multiple
>> file route is a huge mistake. Also, all our measurements
>> say that sending some data on the network is not noticeable given the disk
>> access costs.
>>
>
> I have had a slightly different experience here. We tried the serial output: it is
> really slow for large-scale problems, and the first processor often runs
> out of memory from gathering all the data from the other processor cores. The
> parallel I/O runs smoothly and much faster than I expected. We have done
> experiments with tens of thousands of cores for a problem with 1 billion
> unknowns. I did not see any concern so far.

I think there are two different issues here.  Writing a separate file
per MPI rank (often also per time step, etc.) creates a filesystem
*metadata* bottleneck.  It's the open() and close() that are more
painful than the write() when you have lots of files.  (You'd also want
to be careful about your naming convention because merely running "ls"
on a directory with many files is usually quite painful.)

MPI-IO collectives offer a solution -- each rank writes parts of a file
efficiently using the parallel file system.  MPI-IO was introduced in
MPI-2 (standardized in 1997) and PETSc has thus far avoided a hard
dependency on this standard because some implementations were very slow
to adopt it.  In my opinion, any IO in PETSc that is intended to be
highly scalable should use MPI-IO.
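A minimal sketch of that collective pattern in plain MPI-IO (file name and
sizes are made up; a real writer would layer file views or HDF5 on top of
this) is:

  /* Sketch: each MPI rank writes its own slice of one shared binary file
     with a single collective call -- one open/close pair per run, not one
     file per rank. */
  #include <mpi.h>

  #define NLOCAL 1000

  int main(int argc, char **argv)
  {
    MPI_File   fh;
    MPI_Offset offset;
    int        rank;
    double     buf[NLOCAL];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    for (int i = 0; i < NLOCAL; i++) buf[i] = rank + 0.001 * i;  /* made-up data */

    MPI_File_open(MPI_COMM_WORLD, "data.bin",
                  MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);

    /* Rank r's slice starts r * NLOCAL doubles into the shared file */
    offset = (MPI_Offset)rank * NLOCAL * sizeof(double);
    MPI_File_write_at_all(fh, offset, buf, NLOCAL, MPI_DOUBLE, MPI_STATUS_IGNORE);

    MPI_File_close(&fh);
    MPI_Finalize();
    return 0;
  }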




Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Kong, Fande
On Thu, Sep 14, 2017 at 9:47 AM, Matthew Knepley  wrote:

> On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes 
> wrote:
>
>> Dear Matthew,
>>
>> Thank you for your reply. It worked, but this prompts another question.
>> So why does PetscViewer not write both files (.h5 and .xmf) directly,
>> instead of having to post-process the .h5 file (in serial)?
>>
>
> 1) Maintenance: Changing the Python is much easier than changing the C you
> would add to generate it
>
> 2) Performance: On big parallel systems, writing files is expensive, so I
> wanted to minimize what I had to do.
>
> 3) Robustness: Moving 1 file around is much easier than remembering 2. I
> just always regenerate the xdmf when needed.
>
>
>> And what about big 3D simulations? Does PETSc always serialize the output of
>> the distributed DMPlex? Is there a way to output one .h5 per mesh
>> partition?
>>
>
> Given the way I/O is structured on big machines, we believe the multiple
> file route is a huge mistake. Also, all our measurements
> say that sending some data on the network is not noticeable given the disk
> access costs.
>

I have had a slightly different experience here. We tried the serial output: it is
really slow for large-scale problems, and the first processor often runs
out of memory from gathering all the data from the other processor cores. The
parallel I/O runs smoothly and much faster than I expected. We have done
experiments with tens of thousands of cores for a problem with 1 billion
unknowns. I did not see any concern so far.


Fande,


>
>   Thanks,
>
> Matt
>
>
>> Best regards,
>> Adriano.
>>
>>
>> 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
>>
>>> On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes >> > wrote:
>>>
 Dear all,

 I am running the SNES ex12 example and I'm passing the options -dm_view
 hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
 .h5 file is generated, but I'm not able to load it in Paraview
 (5.4.0-64bits). Paraview recognizes the file and offers several options to
 read it; here is the complete list:

 Chombo Files
 GTC Files
 M3DC1 Files
 Multilevel 3D Plasma Files
 PFLOTRAN Files
 Pixie Files
 Tetrad Files
 UNIC Files
 VizSchema Files

 The problem is none of the options above work :(
 I'm using the configure option '-download-hdf5' and it installs hdf5
 version 1.8.18
 Any hint of how to fix it and have the visualization working?

>>>
>>> Yes, Paraview does not directly read HDF5. It needs you to tell it what
>>> the data in the HDF5 file means. You do
>>> this by creating a *.xdmf file, which is XML. We provide a tool
>>>
>>>   $PETSC_DIR/bin/petsc_gen_xdmf.py 
>>>
>>> which should automatically produce this file for you. Let us know if it
>>> does not work.
>>>
>>>   Thanks,
>>>
>>> Matt
>>>
>>>

 Best regards,
 Adriano.

 --
 Adriano Côrtes
 =
 *Campus Duque de Caxias and*
 *High-performance Computing Center (NACAD/COPPE)*
 Federal University of Rio de Janeiro (UFRJ)

>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> http://www.caam.rice.edu/~mk51/
>>> 
>>>
>>
>>
>>
>> --
>> Adriano Côrtes
>> =
>> *Campus Duque de Caxias and*
>> *High-performance Computing Center (NACAD/COPPE)*
>> Federal University of Rio de Janeiro (UFRJ)
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> http://www.caam.rice.edu/~mk51/
> 
>


Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Matthew Knepley
On Thu, Sep 14, 2017 at 11:43 AM, Adriano Côrtes 
wrote:

> Dear Matthew,
>
> Thank you for your reply. It worked, but this prompts another question.
> So why does PetscViewer not write both files (.h5 and .xmf) directly,
> instead of having to post-process the .h5 file (in serial)?
>

1) Maintenance: Changing the Python is much easier than changing the C you
would add to generate it

2) Performance: On big parallel systems, writing files is expensive, so I
wanted to minimize what I had to do.

3) Robustness: Moving 1 file around is much easier than remembering 2. I
just always regenerate the xdmf when needed.


> And what about big 3D simulations? Does PETSc always serialize the output of
> the distributed DMPlex? Is there a way to output one .h5 per mesh
> partition?
>

Given the way I/O is structured on big machines, we believe the multiple
file route is a huge mistake. Also, all our measurements
say that sending some data on the network is not noticeable given the disk
access costs.

  Thanks,

Matt


> Best regards,
> Adriano.
>
>
> 2017-09-14 12:00 GMT-03:00 Matthew Knepley :
>
>> On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes 
>> wrote:
>>
>>> Dear all,
>>>
>>> I am running the SNES ex12 example and I'm passing the options -dm_view
>>> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
>>> .h5 file is generated, but I'm not able to load it in Paraview
>>> (5.4.0-64bits). Paraview recognizes the file and offers several options to
>>> read it; here is the complete list:
>>>
>>> Chombo Files
>>> GTC Files
>>> M3DC1 Files
>>> Multilevel 3D Plasma Files
>>> PFLOTRAN Files
>>> Pixie Files
>>> Tetrad Files
>>> UNIC Files
>>> VizSchema Files
>>>
>>> The problem is none of the options above work :(
>>> I'm using the configure option '-download-hdf5' and it installs hdf5
>>> version 1.8.18
>>> Any hint of how to fix it and have the visualization working?
>>>
>>
>> Yes, Paraview does not directly read HDF5. It needs you to tell it what
>> the data in the HDF5 file means. You do
>> this by creating a *.xdmf file, which is XML. We provide a tool
>>
>>   $PETSC_DIR/bin/petsc_gen_xdmf.py 
>>
>> which should automatically produce this file for you. Let us know if it
>> does not work.
>>
>>   Thanks,
>>
>> Matt
>>
>>
>>>
>>> Best regards,
>>> Adriano.
>>>
>>> --
>>> Adriano Côrtes
>>> =
>>> *Campus Duque de Caxias and*
>>> *High-performance Computing Center (NACAD/COPPE)*
>>> Federal University of Rio de Janeiro (UFRJ)
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> http://www.caam.rice.edu/~mk51/
>>
>
>
>
> --
> Adriano Côrtes
> =
> *Campus Duque de Caxias and*
> *High-performance Computing Center (NACAD/COPPE)*
> Federal University of Rio de Janeiro (UFRJ)
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

http://www.caam.rice.edu/~mk51/


Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Adriano Côrtes
Dear Matthew,

Thank you for your reply. It worked, but this prompts another question. So
why does PetscViewer not write both files (.h5 and .xmf) directly, instead
of having to post-process the .h5 file (in serial)?
And what about big 3D simulations? Does PETSc always serialize the output of the
distributed DMPlex? Is there a way to output one .h5 per mesh partition?

Best regards,
Adriano.


2017-09-14 12:00 GMT-03:00 Matthew Knepley :

> On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes 
> wrote:
>
>> Dear all,
>>
>> I am running the SNES ex12 example and I'm passing the options -dm_view
>> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
>> .h5 file is generated, but I'm not able to load it in Paraview
>> (5.4.0-64bits). Paraview recognizes the file and offers several options to
>> read it; here is the complete list:
>>
>> Chombo Files
>> GTC Files
>> M3DC1 Files
>> Multilevel 3D Plasma Files
>> PFLOTRAN Files
>> Pixie Files
>> Tetrad Files
>> UNIC Files
>> VizSchema Files
>>
>> The problem is none of the options above work :(
>> I'm using the configure option '-download-hdf5' and it installs hdf5
>> version 1.8.18
>> Any hint of how to fix it and have the visualization working?
>>
>
> Yes, Paraview does not directly read HDF5. It needs you to tell it what
> the data in the HDF5 file means. You do
> this by creating a *.xdmf file, which is XML. We provide a tool
>
>   $PETSC_DIR/bin/petsc_gen_xdmf.py 
>
> which should automatically produce this file for you. Let us know if it
> does not work.
>
>   Thanks,
>
> Matt
>
>
>>
>> Best regards,
>> Adriano.
>>
>> --
>> Adriano Côrtes
>> =
>> *Campus Duque de Caxias and*
>> *High-performance Computing Center (NACAD/COPPE)*
>> Federal University of Rio de Janeiro (UFRJ)
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> http://www.caam.rice.edu/~mk51/
>



-- 
Adriano Côrtes
=
*Campus Duque de Caxias and*
*High-performance Computing Center (NACAD/COPPE)*
Federal University of Rio de Janeiro (UFRJ)


Re: [petsc-users] SNES ex12 visualization

2017-09-14 Thread Matthew Knepley
On Thu, Sep 14, 2017 at 10:30 AM, Adriano Côrtes 
wrote:

> Dear all,
>
> I am running the SNES ex12 example and I'm passing the options -dm_view
> hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
> .h5 file is generated, but I'm not able to load it in Paraview
> (5.4.0-64bits). Paraview recognizes the file and offers several options to
> read it; here is the complete list:
>
> Chombo Files
> GTC Files
> M3DC1 Files
> Multilevel 3D Plasma Files
> PFLOTRAN Files
> Pixie Files
> Tetrad Files
> UNIC Files
> VizSchema Files
>
> The problem is none of the options above work :(
> I'm using the configure option '-download-hdf5' and it installs hdf5
> version 1.8.18
> Any hint of how to fix it and have the visualization working?
>

Yes, Paraview does not directly read HDF5. It needs you to tell it what the
data in the HDF5 file means. You do
this by creating a *.xdmf file, which is XML. We provide a tool

  $PETSC_DIR/bin/petsc_gen_xdmf.py 

which should automatically produce this file for you. Let us know if it
does not work.
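For reference, those command-line options correspond roughly to the following
calls (a sketch only; the DM and solution Vec come from the application, as in
ex12, and the field name "u" is illustrative):

  #include <petscdm.h>
  #include <petscviewerhdf5.h>

  /* Sketch: write the mesh (DM) and a solution Vec into one HDF5 file,
     roughly what -dm_view hdf5:sol.h5 -vec_view hdf5:sol.h5::append do. */
  PetscErrorCode WriteSolutionHDF5(DM dm, Vec u)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    ierr = PetscObjectSetName((PetscObject)u, "u");CHKERRQ(ierr);  /* dataset name */
    ierr = PetscViewerHDF5Open(PetscObjectComm((PetscObject)dm), "sol.h5",
                               FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = DMView(dm, viewer);CHKERRQ(ierr);    /* mesh topology and geometry */
    ierr = VecView(u, viewer);CHKERRQ(ierr);    /* solution field */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    return 0;
  }

Either way, running the generator on the resulting file, something like

  $PETSC_DIR/bin/petsc_gen_xdmf.py sol.h5

should leave a .xmf file next to the .h5 that Paraview can open with its
Xdmf reader.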

  Thanks,

Matt


>
> Best regards,
> Adriano.
>
> --
> Adriano Côrtes
> =
> *Campus Duque de Caxias and*
> *High-performance Computing Center (NACAD/COPPE)*
> Federal University of Rio de Janeiro (UFRJ)
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

http://www.caam.rice.edu/~mk51/


[petsc-users] SNES ex12 visualization

2017-09-14 Thread Adriano Côrtes
Dear all,

I am running the SNES ex12 example and I'm passing the options -dm_view
hdf5:sol.h5 -vec_view hdf5:sol.h5::append to generate an output file. The
.h5 file is generated, but I'm not able to load it in Paraview
(5.4.0-64bits). Paraview recognizes the file and offers several options to
read it; here is the complete list:

Chombo Files
GTC Files
M3DC1 Files
Multilevel 3D Plasma Files
PFLOTRAN Files
Pixie Files
Tetrad Files
UNIC Files
VizSchema Files

The problem is none of the options above work :(
I'm using the configure option '-download-hdf5' and it installs hdf5
version 1.8.18
Any hint of how to fix it and have the visualization working?

Best regards,
Adriano.

-- 
Adriano Côrtes
=
*Campus Duque de Caxias and*
*High-performance Computing Center (NACAD/COPPE)*
Federal University of Rio de Janeiro (UFRJ)