Yes, sorry.

On Mar 19, 2014, at 3:44 PM, Åsmund Ervik <[email protected]> wrote:

> Hi Barry,
> 
> Thanks for the hints. I couldn't find any "DMDACreateNatural", but after some 
> grepping of the manual, I ended up with this:
> 
> [... in init]
> DMDACreateNaturalVector()
> VecScatterCreateToZero()
> 
> [... inside a loop, e.g. over time steps]
> DMDAGlobalToNaturalBegin/End()
> VecScatterBegin/End()
> VecGetArray()
> [... write to file]
> VecRestoreArray()
> 
> Is this what you meant, or have I misunderstood anything? 
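> 
> Spelled out in C notation (a trimmed, untested sketch with error checking
> omitted; here da is the DMDA and global its global vector):
> 
>   Vec         natural, seq;  /* natural-ordering vec; sequential vec on rank 0 */
>   VecScatter  tozero;
>   PetscScalar *a;
>   PetscMPIInt rank;
> 
>   /* [... in init] */
>   DMDACreateNaturalVector(da,&natural);
>   VecScatterCreateToZero(natural,&tozero,&seq);
> 
>   /* [... inside a loop, e.g. over time steps] */
>   DMDAGlobalToNaturalBegin(da,global,INSERT_VALUES,natural);
>   DMDAGlobalToNaturalEnd(da,global,INSERT_VALUES,natural);
>   VecScatterBegin(tozero,natural,seq,INSERT_VALUES,SCATTER_FORWARD);
>   VecScatterEnd(tozero,natural,seq,INSERT_VALUES,SCATTER_FORWARD);
>   MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>   if (!rank) {
>     VecGetArray(seq,&a);
>     /* [... write a to file; only rank 0 holds the full array] */
>     VecRestoreArray(seq,&a);
>   }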
> 
> It seems to work fine in both sequential and parallel runs, and the resulting
> plots look OK, but it has uncovered a race condition that I now need to fix.
> 
> Regards,
> Åsmund
> 
> ________________________________________
> From: Barry Smith [[email protected]]
> Sent: 19 March 2014 00:27
> To: Åsmund Ervik
> Cc: [email protected]
> Subject: Re: [petsc-users] Writing solution data to file when using DMDA.
> 
> On Mar 18, 2014, at 5:19 PM, Åsmund Ervik <[email protected]> wrote:
> 
>> Dear PETSc users,
>> 
>> I'm trying to wrap my head around parallel I/O. If I understand correctly, a 
>> decent way of doing this is to have one rank (say 0) write to disk, with the
>> other ranks communicating their parts of the solution to rank 0. Please
>> correct me if I'm wrong here.
>> 
>> I'm using DMDA to manage my domain decomposition. As a first step, I've been 
>> trying to create an array on rank 0 holding the entire global solution and 
>> then writing this to file by re-using some routines from our serial codes 
>> (the format is Tecplot ASCII). (I realize that neither this approach nor an 
>> ASCII format are good solutions in the end, but I have to start somewhere.) 
>> However, I haven't been able to find any DMDA routines that give me an array 
>> holding the entire global solution on rank 0. Are there any, or is this too 
>> much of a "dirty trick"? (For a single process there is no problem; the
>> generated output files look good.)
> 
>    DMDACreateNatural()
>    DMDAGlobalToNaturalBegin/End()
>    VecScatterCreateToZero()
>    VecGetArray()  on process 0
> 
>    the final array is in the natural ordering, x direction first, y direction 
> second, z direction third.
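> 
>    so on process 0, after VecGetArray(), component c at grid point (i,j,k)
> of a DMDA with dof components and global dimensions M,N,P sits at (a sketch,
> illustrative names):
> 
>       a[dof*((k*N + j)*M + i) + c]   /* i (x) varies fastest */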
>> 
>> I'm also willing to try the VTK way of doing things, but I hit a problem 
>> when I tried that: even though I include "petscviewer.h" (also tried adding 
>> "petscviewerdef.h"), when I do
>>   call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr)
>> my compiler complains that PETSCVIEWERVTK is undefined (has no implicit
>> type). This is from Fortran90, using preprocessor macros to #include the
>> files. I tried PETSCVIEWERASCII as well, with the same problem. This is with 3.4.3.
>> Any hints on this?
> 
>   Hmm, they are in petscviewerdef.h in 3.4.4, but anyway you can pass 'vtk'
> or 'ascii' as the type.
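> 
>   e.g. (an untested sketch; in C the type name is literally that string):
> 
>      PetscViewerCreate(PETSC_COMM_WORLD,&viewer);
>      PetscViewerSetType(viewer,"vtk");    /* what PETSCVIEWERVTK expands to */
> 
>   or skip SetType and use the VTK helper directly; something like
> 
>      PetscViewerVTKOpen(PETSC_COMM_WORLD,"sol.vts",FILE_MODE_WRITE,&viewer);
>      PetscObjectSetName((PetscObject)X,"pressure");  /* field name in the file */
>      VecView(X,viewer);             /* X from DMCreateGlobalVector() of the da */
>      PetscViewerDestroy(&viewer);
> 
>   should write a structured-grid .vts file. For fields living on different
>   DMs the safest bet is one file per DM.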
>> 
>> Also, there are many different examples and mailing list threads about VTK 
>> output. What is the currently recommended way of doing things? I need to 
>> output at least (u,v,w) as vector components of one field, together with a 
>> scalar field (p). These currently have separate DMs, since I only use PETSc
>> to solve for p (the pressure).
>> 
>> Best regards,
>> Åsmund
> 
