Matthew Knepley <[email protected]> writes:

> On Jul 21, 2014 5:44 AM, "Stephen Wornom" <[email protected]> wrote:
>>
>> I have an unstructured mesh code used to compute vortex shedding
>> problems, saving the solutions every 500-1000 time steps. The mesh size
>> is 3 to 20 MNodes. The minimum number of cores that I use is 128 for
>> the 3 MNode mesh. I would like to know if PETSc could be used to save
>> the solutions using MPI-IO?
>
> The normal VecView() for the binary viewer will use MPI/IO.

You need -viewer_binary_mpiio or PetscViewerBinarySetMPIIO().

PETSc devs, do you suppose MPI-IO support is stable enough that we could
make this a default?  In any case, PetscViewerBinarySetMPIIO should take
a PetscBool.
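
For reference, a minimal sketch of what that looks like in user code (untested; error checking omitted for brevity; the filename and Vec size are placeholders):

```c
/* Sketch: write a distributed Vec with PETSc's parallel binary viewer,
 * requesting MPI-IO explicitly.  "solution.bin" and the Vec size are
 * illustrative only. */
#include <petscvec.h>
#include <petscviewer.h>

int main(int argc, char **argv)
{
  Vec         u;
  PetscViewer viewer;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* A parallel Vec standing in for the solution on the unstructured mesh. */
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 1000000, &u);
  VecSet(u, 1.0);

  /* Open the binary viewer; VecView() on it is a collective write. */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.bin", FILE_MODE_WRITE, &viewer);
  /* Turn on MPI-IO for this viewer; equivalent to passing
   * -viewer_binary_mpiio on the command line. */
  PetscViewerBinarySetMPIIO(viewer);
  VecView(u, viewer);

  PetscViewerDestroy(&viewer);
  VecDestroy(&u);
  PetscFinalize();
  return 0;
}
```

The same behavior can be toggled at runtime with the -viewer_binary_mpiio option instead of the explicit call.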
