[petsc-users] Meaning of the order of PETSC_VIEWER_ASCII_INDEX format in PETSc

2018-02-07 Thread Paula Sanematsu
I am using PETSc 3.7.6 and Fortran.

I am trying to output a PETSc vector that contains the solution of a linear
system. I am using VecView with the PETSC_VIEWER_ASCII_INDEX format as
follows:

call PetscViewerASCIIOpen(PETSC_COMM_WORLD,"output.dat",viewer,ierr)
call PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INDEX,ierr)
call VecView(myVec,viewer,ierr)


When I run with 4 processors, my output file looks like:

Vec Object: 4 MPI processes
  type: mpi
Process [0]
0: 30.7501
1: 164.001
2: 41.0001
3: 164.001
.
.
.
Process [1]
4988: 60.1443
4989: 157.257
4990: 271.518
4991: 366.669
.
.
.
Process [2]
9977: 114.948
9978: -77.2896
9979: 823.142
9980: -1096.19
.
.
.
Process [3]
14916: 0.
14917: 4.4056
14918: 2.08151
14919: -0.110862
.
.
.
19843: 0.


My question is: does each processor output the part of the vector that it
owns? Or does PETSc collect each processor's parts so that processor 0
sequentially outputs the 1st quarter of the global vector, processor 1
outputs the 2nd quarter, processor 2 outputs the 3rd quarter, and so on? Or
does PETSc do something else?

Thank you!

Paula


Re: [petsc-users] Meaning of the order of PETSC_VIEWER_ASCII_INDEX format in PETSc

2018-02-07 Thread Matthew Knepley
On Wed, Feb 7, 2018 at 2:08 PM, Paula Sanematsu  wrote:

> I am using PETSc 3.7.6 and Fortran.
>
> I am trying to output a PETSc vector that contains the solution of a
> linear system. I am using VecView with the PETSC_VIEWER_ASCII_INDEX format
> as follows:
>
> call PetscViewerASCIIOpen(PETSC_COMM_WORLD,"output.dat",viewer,ierr)
> call PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INDEX,ierr)
> call VecView(myVec,viewer,ierr)
>
>
> When I run with 4 processors, my output file looks like:
>
> Vec Object: 4 MPI processes
>   type: mpi
> Process [0]
> 0: 30.7501
> 1: 164.001
> 2: 41.0001
> 3: 164.001
> .
> .
> .
> Process [1]
> 4988: 60.1443
> 4989: 157.257
> 4990: 271.518
> 4991: 366.669
> .
> .
> .
> Process [2]
> 9977: 114.948
> 9978: -77.2896
> 9979: 823.142
> 9980: -1096.19
> .
> .
> .
> Process [3]
> 14916: 0.
> 14917: 4.4056
> 14918: 2.08151
> 14919: -0.110862
> .
> .
> .
> 19843: 0.
>
>
> My question is: each processor outputs the part of the vector that it
> owns? Or does PETSc collects each processor's parts and then processor 0
> sequentially outputs the 1st quarter of the global vector, processor 1
> outputs the 2nd quarter of the global vector, processor 2 outputs the 3rd
> quarter of the global vector, and so on? Or, does PETSc do something else?
>

There is no difference between those two. The process portions are
contiguous.
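
For intuition: PETSc's default layout splits a global vector into contiguous,
nearly equal blocks (an application can also choose the local sizes itself,
which is why the ranges in the output above are uneven, but they are still
contiguous). A small Python sketch of such a contiguous partition —
illustrative only, not PETSc code:

```python
# Illustration (not PETSc itself): splitting a vector of global length N
# into contiguous ownership ranges over `size` ranks, mirroring PETSc's
# default PETSC_DECIDE layout.
def ownership_ranges(N, size):
    """Return (start, end) pairs; rank r owns indices start..end-1."""
    ranges = []
    start = 0
    for rank in range(size):
        # First N % size ranks get one extra entry.
        n_local = N // size + (1 if rank < N % size else 0)
        ranges.append((start, start + n_local))
        start += n_local
    return ranges

# For a 19844-entry vector (as in the output above) on 4 ranks:
print(ownership_ranges(19844, 4))
```

Because the ranges are contiguous and ordered by rank, "each rank prints its
own piece" and "rank 0 prints the gathered pieces in rank order" produce the
same file.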

Where are you going to read this in? It seems like there must be a more
appropriate format for you.

  Thanks,

Matt


> Thank you!
>
> Paula
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Meaning of the order of PETSC_VIEWER_ASCII_INDEX format in PETSc

2018-02-07 Thread Paula Sanematsu
 I see. Thanks for the explanation.

Yes, that's the stage I'm in now. I am developing the code and only
testing+validating small samples, so I am trying to visualize the results
in Matlab. But in the future, Matlab will probably not be feasible so I
will probably need to use the binary format and visualize in Avizo.
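
In case it helps while the ASCII format is still in use: the
PETSC_VIEWER_ASCII_INDEX file shown above is easy to parse by hand before
loading it elsewhere. A rough Python sketch (the helper name is made up; this
is not part of PETSc or its Matlab scripts):

```python
# Hypothetical helper (not part of PETSc): parse PETSC_VIEWER_ASCII_INDEX
# output like the file shown above into an index -> value dict.
def parse_ascii_index(lines):
    values = {}
    for line in lines:
        line = line.strip()
        # Skip the "Vec Object:" header, "type: mpi", and "Process [k]" rows;
        # keep only "index: value" rows (values like "0." parse fine).
        if ":" not in line or line.startswith(("Vec", "type", "Process")):
            continue
        idx, val = line.split(":", 1)
        if idx.strip().isdigit():
            values[int(idx)] = float(val)
    return values

sample = """Vec Object: 4 MPI processes
  type: mpi
Process [0]
0: 30.7501
1: 164.001
Process [1]
4988: 60.1443""".splitlines()
print(parse_ascii_index(sample))
```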



On Wed, Feb 7, 2018 at 2:29 PM, Smith, Barry F.  wrote:

>
>   Here is the code VecView_MPI_ASCII()
>
> if (format == PETSC_VIEWER_ASCII_INDEX) {
>   ierr = PetscViewerASCIIPrintf(viewer,"%D:
> ",cnt++);CHKERRQ(ierr);
> }
> #if defined(PETSC_USE_COMPLEX)
> if (PetscImaginaryPart(xarray[i]) > 0.0) {
>   ierr = PetscViewerASCIIPrintf(viewer,"%g + %g
> i\n",(double)PetscRealPart(xarray[i]),(double)
> PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
> } else if (PetscImaginaryPart(xarray[i]) < 0.0) {
>   ierr = PetscViewerASCIIPrintf(viewer,"%g - %g
> i\n",(double)PetscRealPart(xarray[i]),-(double)
> PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
> } else {
>   ierr = PetscViewerASCIIPrintf(viewer,
> "%g\n",(double)PetscRealPart(xarray[i]));CHKERRQ(ierr);
> }
> #else
> ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)xarray[i]);
> CHKERRQ(ierr);
> #endif
>   }
>   /* receive and print messages */
>   for (j=1; j<size; j++) {
> ierr = MPI_Recv(values,(PetscMPIInt)len,MPIU_SCALAR,j,tag,
> PetscObjectComm((PetscObject)xin),&status);CHKERRQ(ierr);
> ierr = MPI_Get_count(&status,MPIU_SCALAR,&n);CHKERRQ(ierr);
> if (format != PETSC_VIEWER_ASCII_COMMON) {
>   ierr = PetscViewerASCIIPrintf(viewer,"Process
> [%d]\n",j);CHKERRQ(ierr);
> }
> for (i=0; i<n; i++) {
>   if (format == PETSC_VIEWER_ASCII_INDEX) {
> ierr = PetscViewerASCIIPrintf(viewer,"%D:
> ",cnt++);CHKERRQ(ierr);
>   }
> #if defined(PETSC_USE_COMPLEX)
>   if (PetscImaginaryPart(values[i]) > 0.0) {
> ierr = PetscViewerASCIIPrintf(viewer,"%g + %g
> i\n",(double)PetscRealPart(values[i]),(double)
> PetscImaginaryPart(values[i]));CHKERRQ(ierr);
>   } else if (PetscImaginaryPart(values[i]) < 0.0) {
> ierr = PetscViewerASCIIPrintf(viewer,"%g - %g
> i\n",(double)PetscRealPart(values[i]),-(double)
> PetscImaginaryPart(values[i]));CHKERRQ(ierr);
>   } else {
> ierr = PetscViewerASCIIPrintf(viewer,
> "%g\n",(double)PetscRealPart(values[i]));CHKERRQ(ierr);
>   }
> #else
>   ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)values[i]);
> CHKERRQ(ierr);
> #endif
> }
>   }
>
> So each process ships its values to process zero who prints them in order.
>
> Note that printing out vectors and matrices as ASCII is just for toys and
> to help debug. For large runs one should always use some variant of binary
> output.
>
>   Barry
>
>
> > On Feb 7, 2018, at 1:08 PM, Paula Sanematsu  wrote:
> >
> > I am using PETSc 3.7.6 and Fortran.
> >
> > I am trying to output a PETSc vector that contains the solution of a
> linear system. I am using VecView with the PETSC_VIEWER_ASCII_INDEX format
> as follows:
> >
> > call PetscViewerASCIIOpen(PETSC_COMM_WORLD,"output.dat",viewer,ierr)
> > call PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INDEX,ierr)
> > call VecView(myVec,viewer,ierr)
> >
> > When I run with 4 processors, my output file looks like:
> >
> > Vec Object: 4 MPI processes
> >   type: mpi
> > Process [0]
> > 0: 30.7501
> > 1: 164.001
> > 2: 41.0001
> > 3: 164.001
> > .
> > .
> > .
> > Process [1]
> > 4988: 60.1443
> > 4989: 157.257
> > 4990: 271.518
> > 4991: 366.669
> > .
> > .
> > .
> > Process [2]
> > 9977: 114.948
> > 9978: -77.2896
> > 9979: 823.142
> > 9980: -1096.19
> > .
> > .
> > .
> > Process [3]
> > 14916: 0.
> > 14917: 4.4056
> > 14918: 2.08151
> > 14919: -0.110862
> > .
> > .
> > .
> > 19843: 0.
> >
> > My question is: each processor outputs the part of the vector that it
> owns? Or does PETSc collects each processor's parts and then processor 0
> sequentially outputs the 1st quarter of the global vector, processor 1
> outputs the 2nd quarter of the global vector, processor 2 outputs the 3rd
> quarter of the global vector, and so on? Or, does PETSc do something else?
> >
> > Thank you!
> >
> > Paula
> >
>
>


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Matthew Knepley
On Wed, Feb 7, 2018 at 1:45 PM, Marco Cisternino <
marco.cistern...@optimad.it> wrote:

> I'm sorry Matt but I cannot understand what flexible gmres computes when
> no null space is created.
> Could you give me some hints, please? Even in very simple cases...
>

I don't think I can clear it up any further.

  Thanks,

Matt


> Thanks
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
> 
> From: Matthew Knepley
> Sent: Wednesday, February 7, 2018 19:38:08
> To: Marco Cisternino
> Cc: petsc-users
> Subject: Re: [petsc-users] Elliptic operator with Neumann conditions
>
> On Thu, Feb 8, 2018 at 5:29 AM, Marco Cisternino <
> marco.cistern...@optimad.it> wrote:
> Thanks Matt,
> I'm using KSPFGMRES and I'm sorry but what do you mean with initial
> residual?
> I also force a non-zero initial guess.
>
> If your initial residual has a component in the null space of the
> operator, it is likely to stay there.
>
>Matt
>
> Thanks again
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
> 
> From: Matthew Knepley
> Sent: Wednesday, February 7, 2018 18:57:56
> To: Marco Cisternino
> Cc: petsc-users
> Subject: Re: [petsc-users] Elliptic operator with Neumann conditions
>
> On Thu, Feb 8, 2018 at 4:43 AM, Marco Cisternino <
> marco.cistern...@optimad.it> wrote:
> Hi everybody,
> I would like to ask what solution is computed if I try to solve the linear
> system relative to the problem in subject without creating the null space.
> I tried with and without the call to
> MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> and I get zero averaged solution with and the same solution plus a
> constant without.
> How does PETSc  work in the second case?
>
> It depends on the Krylov method you use and the initial residual. We do
> not do anything special.
>
>   Thanks,
>
>  Matt
>
> Does it check the matrix singularity? And is it able to create the null
> space with the constant automatically?
> Thanks.
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Meaning of the order of PETSC_VIEWER_ASCII_INDEX format in PETSc

2018-02-07 Thread Smith, Barry F.

  Here is the code VecView_MPI_ASCII()

if (format == PETSC_VIEWER_ASCII_INDEX) {
  ierr = PetscViewerASCIIPrintf(viewer,"%D: ",cnt++);CHKERRQ(ierr);
}
#if defined(PETSC_USE_COMPLEX)
if (PetscImaginaryPart(xarray[i]) > 0.0) {
  ierr = PetscViewerASCIIPrintf(viewer,"%g + %g 
i\n",(double)PetscRealPart(xarray[i]),(double)PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
} else if (PetscImaginaryPart(xarray[i]) < 0.0) {
  ierr = PetscViewerASCIIPrintf(viewer,"%g - %g 
i\n",(double)PetscRealPart(xarray[i]),-(double)PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
} else {
  ierr = 
PetscViewerASCIIPrintf(viewer,"%g\n",(double)PetscRealPart(xarray[i]));CHKERRQ(ierr);
}
#else
ierr = 
PetscViewerASCIIPrintf(viewer,"%g\n",(double)xarray[i]);CHKERRQ(ierr);
#endif
  }
  /* receive and print messages */
  for (j=1; j<size; j++) {
    ierr = MPI_Recv(values,(PetscMPIInt)len,MPIU_SCALAR,j,tag,PetscObjectComm((PetscObject)xin),&status);CHKERRQ(ierr);
    ierr = MPI_Get_count(&status,MPIU_SCALAR,&n);CHKERRQ(ierr);
    if (format != PETSC_VIEWER_ASCII_COMMON) {
      ierr = PetscViewerASCIIPrintf(viewer,"Process [%d]\n",j);CHKERRQ(ierr);
    }
    for (i=0; i<n; i++) {
      if (format == PETSC_VIEWER_ASCII_INDEX) {
        ierr = PetscViewerASCIIPrintf(viewer,"%D: ",cnt++);CHKERRQ(ierr);
      }
#if defined(PETSC_USE_COMPLEX)
      if (PetscImaginaryPart(values[i]) > 0.0) {
ierr = PetscViewerASCIIPrintf(viewer,"%g + %g 
i\n",(double)PetscRealPart(values[i]),(double)PetscImaginaryPart(values[i]));CHKERRQ(ierr);
  } else if (PetscImaginaryPart(values[i]) < 0.0) {
ierr = PetscViewerASCIIPrintf(viewer,"%g - %g 
i\n",(double)PetscRealPart(values[i]),-(double)PetscImaginaryPart(values[i]));CHKERRQ(ierr);
  } else {
ierr = 
PetscViewerASCIIPrintf(viewer,"%g\n",(double)PetscRealPart(values[i]));CHKERRQ(ierr);
  }
#else
  ierr = 
PetscViewerASCIIPrintf(viewer,"%g\n",(double)values[i]);CHKERRQ(ierr);
#endif
}
  }

So each process ships its values to process zero, which prints them in order.

Note that printing out vectors and matrices as ASCII is just for toys and to 
help debug. For large runs one should always use some variant of binary output.

  Barry


> On Feb 7, 2018, at 1:08 PM, Paula Sanematsu  wrote:
> 
> I am using PETSc 3.7.6 and Fortran.
> 
> I am trying to output a PETSc vector that contains the solution of a linear 
> system. I am using VecView with the PETSC_VIEWER_ASCII_INDEX format as 
> follows:
> 
> call PetscViewerASCIIOpen(PETSC_COMM_WORLD,"output.dat",viewer,ierr)
> call PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INDEX,ierr)
> call VecView(myVec,viewer,ierr)
> 
> When I run with 4 processors, my output file looks like:
> 
> Vec Object: 4 MPI processes
>   type: mpi
> Process [0]
> 0: 30.7501
> 1: 164.001
> 2: 41.0001
> 3: 164.001
> .
> .
> .
> Process [1]
> 4988: 60.1443
> 4989: 157.257
> 4990: 271.518
> 4991: 366.669
> .
> .
> .
> Process [2]
> 9977: 114.948
> 9978: -77.2896
> 9979: 823.142
> 9980: -1096.19
> .
> .
> .
> Process [3]
> 14916: 0.
> 14917: 4.4056
> 14918: 2.08151
> 14919: -0.110862
> .
> .
> .
> 19843: 0.
> 
> My question is: each processor outputs the part of the vector that it owns? 
> Or does PETSc collects each processor's parts and then processor 0 
> sequentially outputs the 1st quarter of the global vector, processor 1 
> outputs the 2nd quarter of the global vector, processor 2 outputs the 3rd 
> quarter of the global vector, and so on? Or, does PETSc do something else?
> 
> Thank you!
> 
> Paula
> 



Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Matthew Knepley
On Thu, Feb 8, 2018 at 4:43 AM, Marco Cisternino <
marco.cistern...@optimad.it> wrote:

> Hi everybody,
> I would like to ask what solution is computed if I try to solve the linear
> system relative to the problem in subject without creating the null space.
> I tried with and without the call to
> MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> and I get zero averaged solution with and the same solution plus a
> constant without.
> How does PETSc  work in the second case?
>

It depends on the Krylov method you use and the initial residual. We do not
do anything special.

  Thanks,

 Matt


> Does it check the matrix singularity? And is it able to create the null
> space with the constant automatically?
> Thanks.
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Matthew Knepley
On Thu, Feb 8, 2018 at 5:29 AM, Marco Cisternino <
marco.cistern...@optimad.it> wrote:

> Thanks Matt,
> I'm using KSPFGMRES and I'm sorry but what do you mean with initial
> residual?
> I also force a non-zero initial guess.
>

If your initial residual has a component in the null space of the operator,
it is likely to stay there.

   Matt


> Thanks again
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
> 
> From: Matthew Knepley
> Sent: Wednesday, February 7, 2018 18:57:56
> To: Marco Cisternino
> Cc: petsc-users
> Subject: Re: [petsc-users] Elliptic operator with Neumann conditions
>
> On Thu, Feb 8, 2018 at 4:43 AM, Marco Cisternino <
> marco.cistern...@optimad.it> wrote:
> Hi everybody,
> I would like to ask what solution is computed if I try to solve the linear
> system relative to the problem in subject without creating the null space.
> I tried with and without the call to
> MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> and I get zero averaged solution with and the same solution plus a
> constant without.
> How does PETSc  work in the second case?
>
> It depends on the Krylov method you use and the initial residual. We do
> not do anything special.
>
>   Thanks,
>
>  Matt
>
> Does it check the matrix singularity? And is it able to create the null
> space with the constant automatically?
> Thanks.
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Marco Cisternino
I'm sorry Matt but I cannot understand what flexible gmres computes when no 
null space is created.
Could you give me some hints, please? Even in very simple cases...
Thanks


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it



From: Matthew Knepley
Sent: Wednesday, February 7, 2018 19:38:08
To: Marco Cisternino
Cc: petsc-users
Subject: Re: [petsc-users] Elliptic operator with Neumann conditions

On Thu, Feb 8, 2018 at 5:29 AM, Marco Cisternino 
wrote:
Thanks Matt,
I'm using KSPFGMRES and I'm sorry but what do you mean with initial residual?
I also force a non-zero initial guess.

If your initial residual has a component in the null space of the operator, it 
is likely to stay there.

   Matt

Thanks again


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it



From: Matthew Knepley
Sent: Wednesday, February 7, 2018 18:57:56
To: Marco Cisternino
Cc: petsc-users
Subject: Re: [petsc-users] Elliptic operator with Neumann conditions

On Thu, Feb 8, 2018 at 4:43 AM, Marco Cisternino 
wrote:
Hi everybody,
I would like to ask what solution is computed if I try to solve the linear 
system relative to the problem in subject without creating the null space.
I tried with and without the call to
MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
and I get zero averaged solution with and the same solution plus a constant 
without.
How does PETSc  work in the second case?

It depends on the Krylov method you use and the initial residual. We do not do 
anything special.

  Thanks,

 Matt

Does it check the matrix singularity? And is it able to create the null space 
with the constant automatically?
Thanks.


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Marco Cisternino
Thanks Matt,
I'm using KSPFGMRES and I'm sorry but what do you mean with initial residual?
I also force a non-zero initial guess.
Thanks again


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it



From: Matthew Knepley
Sent: Wednesday, February 7, 2018 18:57:56
To: Marco Cisternino
Cc: petsc-users
Subject: Re: [petsc-users] Elliptic operator with Neumann conditions

On Thu, Feb 8, 2018 at 4:43 AM, Marco Cisternino 
wrote:
Hi everybody,
I would like to ask what solution is computed if I try to solve the linear 
system relative to the problem in subject without creating the null space.
I tried with and without the call to
MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
and I get zero averaged solution with and the same solution plus a constant 
without.
How does PETSc  work in the second case?

It depends on the Krylov method you use and the initial residual. We do not do 
anything special.

  Thanks,

 Matt

Does it check the matrix singularity? And is it able to create the null space 
with the constant automatically?
Thanks.


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Smith, Barry F.

   A square matrix with a null space gives an underdetermined system, that is,
one with more than one solution. The solutions can be written as
x + alpha_1 v_1 + ... + alpha_n v_n, where the v_i form an orthonormal basis
for the null space and x is orthogonal to the null space.

   When you provide the null space, KSP Krylov methods find the norm-minimizing
solution x, that is, the x with the smallest norm that satisfies the system.
This is exactly the same as taking any solution of the system and removing all
the components in the directions of the null space.

   If you do not provide the null space, the Krylov method may find a solution
that is not the norm-minimizing one, so that solution has a component of the
null space in it. Which component depends on what you use for the initial guess
and the right-hand side.

   When you have a preconditioner, things can get trickier, because the
preconditioner can introduce (unless you remove them) components in the
direction of the null space. These components can get amplified with each
iteration of the Krylov method, so it looks like the Krylov method is not
converging: the norm of the solution gets larger and larger, and the growing
part lies in the null space. This is why one should always provide the null
space when solving systems with singular matrices.
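
The effect described above can be reproduced in a few lines outside PETSc. The
sketch below (pure Python, illustrative only; not PETSc code) runs plain
conjugate gradient on a small singular Neumann-type Laplacian from two initial
guesses that differ by a constant. The two "solutions" differ by that constant,
and projecting out the null space (subtracting the mean) makes them identical:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def cg(A, b, x, iters=50):
    # Plain conjugate gradient; fine here because b is orthogonal
    # to the null space of the singular matrix A.
    r = [bi - yi for bi, yi in zip(b, matvec(A, x))]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = matvec(A, p)
        denom = sum(pi * qi for pi, qi in zip(p, Ap))
        if denom == 0:
            break
        alpha = rs / denom
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < 1e-28:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Singular 1D Laplacian with Neumann ends; its null space is the constants.
A = [[ 1, -1,  0,  0],
     [-1,  2, -1,  0],
     [ 0, -1,  2, -1],
     [ 0,  0, -1,  1]]
b = [1, 0, 0, -1]          # compatible RHS: sum(b) = 0

x1 = cg(A, b, [0.0] * 4)   # zero initial guess
x2 = cg(A, b, [5.0] * 4)   # guess shifted along the null space

# The two results differ by a constant; removing the mean
# (projecting out the null space) makes them agree.
def zero_mean(x):
    m = sum(x) / len(x)
    return [xi - m for xi in x]

print(zero_mean(x1))
print(zero_mean(x2))
```

Attaching the null space (e.g. via MatSetNullSpace) makes KSP apply this kind
of projection internally at each iteration.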

  Barry


> On Feb 7, 2018, at 11:43 AM, Marco Cisternino  
> wrote:
> 
> Hi everybody,
> I would like to ask what solution is computed if I try to solve the linear 
> system relative to the problem in subject without creating the null space.
> I tried with and without the call to
> MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> and I get zero averaged solution with and the same solution plus a constant 
> without.
> How does PETSc  work in the second case?
> Does it check the matrix singularity? And is it able to create the null space 
> with the constant automatically?
> Thanks.
> 
> 
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
> 



Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Marco Cisternino
Barry, thanks a lot!
Exactly what I wanted to understand and clearly explained.
Again thank you very much.

Marco

Get Outlook for Android


From: Smith, Barry F. 
Sent: Wednesday, February 7, 2018 8:24:22 PM
To: Marco Cisternino
Cc: petsc-users
Subject: Re: [petsc-users] Elliptic operator with Neumann conditions


   A square matrix with a null space gives an underdetermined system, that is,
one with more than one solution. The solutions can be written as
x + alpha_1 v_1 + ... + alpha_n v_n, where the v_i form an orthonormal basis
for the null space and x is orthogonal to the null space.

   When you provide the null space, KSP Krylov methods find the norm-minimizing
solution x, that is, the x with the smallest norm that satisfies the system.
This is exactly the same as taking any solution of the system and removing all
the components in the directions of the null space.

   If you do not provide the null space, the Krylov method may find a solution
that is not the norm-minimizing one, so that solution has a component of the
null space in it. Which component depends on what you use for the initial guess
and the right-hand side.

   When you have a preconditioner, things can get trickier, because the
preconditioner can introduce (unless you remove them) components in the
direction of the null space. These components can get amplified with each
iteration of the Krylov method, so it looks like the Krylov method is not
converging: the norm of the solution gets larger and larger, and the growing
part lies in the null space. This is why one should always provide the null
space when solving systems with singular matrices.

  Barry


> On Feb 7, 2018, at 11:43 AM, Marco Cisternino  
> wrote:
>
> Hi everybody,
> I would like to ask what solution is computed if I try to solve the linear 
> system relative to the problem in subject without creating the null space.
> I tried with and without the call to
> MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> and I get zero averaged solution with and the same solution plus a constant 
> without.
> How does PETSc  work in the second case?
> Does it check the matrix singularity? And is it able to create the null space 
> with the constant automatically?
> Thanks.
>
>
> Marco Cisternino, PhD
> marco.cistern...@optimad.it
> ___
> OPTIMAD Engineering srl
> Via Giacinto Collegno 18, Torino, Italia.
> +3901119719782
> www.optimad.it
>



Re: [petsc-users] petsc and mumps for compute user-specified set of entries in inv(A)

2018-02-07 Thread Hong
Marius :
I added MatMumpsGetInverse(), see
https://bitbucket.org/petsc/petsc/commits/f5e16c35adb7810c6c977ea1c9cd95e4afb8512b
It works in sequential and parallel.

It is in the branch hzhang/mumps-invA, and will be merged to petsc-master
after it passed our regression tests.

Let me know if you see any problems.
Hong

On Sat, Jan 20, 2018 at 11:42 AM, Hong  wrote:

> Marius :
> Current petsc-mumps interface supports ICNTL(30), e.g., runtime option 
> '-mat_mumps_icntl_30
> 0' as default.
> However, no user ever has tested it. I tested it
> using petsc/src/mat/examples/tests/ex130.c with
>  '-mat_mumps_icntl_30 1' and got an error in MatSolve() -- additional
> work needs to be down here.
>
> I'll read mumps user manual and investigate it next week.
>
> Meanwhile, you may give it a try and figure out how to set other
> parameters, such as icntl_27 and allocate correct rhs and solution
> vectors.
>
> Hong
>
>> Hi,
>>
>> Is it possible to interface MUMPS to  compute user-specified set of
>> entries in inv(A) (ICNTL(30)) using petsc ?
>>
>> best,
>> marius
>>
>
>
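
For anyone new to ICNTL(30): "user-specified entries of inv(A)" means solving
only for the columns of the inverse that contain requested entries, instead of
forming all of inv(A). Below is a dense pure-Python caricature of that idea
(illustrative only; the helper names are made up, and real MUMPS additionally
exploits sparsity of the factors and of the right-hand sides):

```python
# Not MUMPS: a tiny dense illustration of "selected entries of inv(A)".
# Entry inv(A)[i][j] is component i of the solution of A x = e_j, so only
# the columns j that actually appear in the request are solved for.
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def selected_inverse_entries(A, entries):
    n = len(A)
    cols = {}
    for i, j in entries:
        if j not in cols:
            e = [0.0] * n
            e[j] = 1.0
            cols[j] = solve(A, e)   # column j of inv(A)
    return {(i, j): cols[j][i] for i, j in entries}

A = [[4.0, 1.0], [2.0, 3.0]]        # inv(A) = (1/10) * [[3, -1], [-2, 4]]
print(selected_inverse_entries(A, [(0, 0), (1, 1)]))
```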


Re: [petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Mark Adams
On Wed, Feb 7, 2018 at 2:24 PM, Smith, Barry F.  wrote:

>
>A square matrix with a null space results in an underdetermined system,
> that is a solution with more than one solution. The solutions can be
> written as x + alpha_1 v_1 + ... + alpha_n v_n where the v_n form an
> orthonormal basis for the null space and x is orthogonal to the null space.
>
>When you provide the null space KSP Krylov methods find the norm
> minimizing solution (x) , that is it finds the x with the smallest norm
> that satisfies the system. This is exactly the same as saying you take any
> solution of the system and remove all the components in the directions of
> the null space.
>
>If you do not provide the null space then the Krylov space may find you
> a solution that is not the norm minimizing solution, thus that solution has
> a component of the null space within it. What component of the null space
> in the solution depends on what you use for an initial guess and right hand
> side.
>

Additionally, even if your initial guess is orthogonal to the null space,
your solution can "float" away from it through roundoff error. This is what
you were seeing initially w/o the null space. As you saw, you can just
project it out yourself, but as Barry said it is better to let KSP do it.


>
>When you have a preconditioner then things can get trickier because the
> preconditioner can introduce (unless you remove them) components in the direction of
> the null space. These components can get amplified with each iteration of
> the Krylov method so it looks like the Krylov method is not converging
> since the norm of the solution is getting larger and larger (these larger
> components are in the null space.) This is why one should always provide
> the null space when solving singular systems with singular matrices.
>
>   Barry
>
>
> > On Feb 7, 2018, at 11:43 AM, Marco Cisternino <
> marco.cistern...@optimad.it> wrote:
> >
> > Hi everybody,
> > I would like to ask what solution is computed if I try to solve the
> linear system relative to the problem in subject without creating the null
> space.
> > I tried with and without the call to
> > MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, );
> > and I get zero averaged solution with and the same solution plus a
> constant without.
> > How does PETSc  work in the second case?
> > Does it check the matrix singularity? And is it able to create the null
> space with the constant automatically?
> > Thanks.
> >
> >
> > Marco Cisternino, PhD
> > marco.cistern...@optimad.it
> > ___
> > OPTIMAD Engineering srl
> > Via Giacinto Collegno 18, Torino, Italia.
> > +3901119719782
> > www.optimad.it
> >
>
>


Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

>> >> > I have structured cell-centered data and would like to visualize
>> >> > this with
>> >> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write
>> >> data in *.vts format. I would like to tell PETSc that the fieldtype
>> >> is PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField.
>> >> >
>> >> > Is this the way to go? I was thinking maybe a DMDASetFieldType
>> >> > exists, but
>> >> did not find any. If yes, what is the PetscViewerVTKWriteFunction I
>> >> need to provide?
>> >>
>> >> DMDA does not explicitly support distinguishing between cell and
>> >> point values.  PetscViewerVTKAddField is a developer level routine
>> >> and you would need to implement a function similar to
>> >> DMDAVTKWriteAll_VTS (not at all trivial and you need to read the code
>> >> because it is responsible for almost everything).
>> >
>> > I am looking at src/sys/classes/viewer/impls/vtk/vtkv.c. There is a
>> reference to PETSC_VTK_POINT_FIELD vs. PETSC_VTK_CELL_FIELD. Judging
>> from the output I get, I was assuming fieldtype=PETSC_VTK_POINT_FIELD. I
>> would be totally fine with replacing POINT by CELL everywhere, since all my
>> data is cell-centered.
>> 
>> I think coordinates need to be PointData, not coordinates of cell centroids.
>> DMDA doesn't have that concept.  (Maybe it should, but adding it is no small
>> task and hacking the output is likely to create a lot of edge cases.)
>> 
> I had a look at DMDAVTKWriteAll_VTS. You are totally right that this
> is not generic. www.vtk.org/VTK/img/file-formats.pdf describes the
> StructuredGrid format. "Points" contain only one DataArray specifying
> the coordinates whereas Cells contain three DataArrays with
> connectivity, offsets and types.

Are you mixing up CellData from the StructuredGrid XML spec with Cells
from the UnstructuredGrid?

> What if I use DMDAVTKWriteAll_VTR and write a RectilinearGrid? Then I could 
> just replace PointData with CellData and "hack" only the extent which should 
> leave no edge cases...

I think it's feasible.  You could use DMDAGetInterpolationType() to
determine which variant to use.
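As a rough illustration of that suggestion (my own sketch, not existing PETSc code; treating DMDA_Q0 data as cell-centered is an assumption):

```c
#include <petscdmda.h>
#include <petscviewer.h>

/* Sketch: pick the VTK field type from the DMDA's interpolation type.
 * Assumption: Q0 (piecewise constant) data is cell-centered, Q1 is nodal. */
static PetscErrorCode ChooseVTKFieldType(DM da, PetscViewerVTKFieldType *ft)
{
  DMDAInterpolationType itype;
  PetscErrorCode        ierr;

  ierr = DMDAGetInterpolationType(da, &itype);CHKERRQ(ierr);
  *ft  = (itype == DMDA_Q0) ? PETSC_VTK_CELL_FIELD : PETSC_VTK_POINT_FIELD;
  return 0;
}
```

A writer variant based on DMDAVTKWriteAll_VTR would then emit CellData instead of PointData and shrink the extent by one in each direction for the Q0 case.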


Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Buesing, Henrik
> >> >> > I have structured cell-centered data and would like to visualize
> >> >> > this with
> >> >> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write
> >> >> data in *.vts format. I would like to tell PETSc that the
> >> >> fieldtype is PETSC_VTK_CELL_FIELD. I have found
> PetscViewerVTKAddField.
> >> >> >
> >> >> > Is this the way to go? I was thinking maybe a DMDASetFieldType
> >> >> > exists, but
> >> >> did not find any. If yes, what is the PetscViewerVTKWriteFunction
> >> >> I need to provide?
> >> >>
> >> >> DMDA does not explicitly support distinguishing between cell and
> >> >> point values.  PetscViewerVTKAddField is a developer level routine
> >> >> and you would need to implement a function similar to
> >> >> DMDAVTKWriteAll_VTS (not at all trivial and you need to read the
> >> >> code because it is responsible for almost everything).
> >> >
> >> > I am looking at src/sys/classes/viewer/impls/vtk/vtkv.c. There is a
> >> reference to PETSC_VTK_POINT_FIELD vs. PETSC_VTK_CELL_FIELD. Judging
> >> from the output I get, I was assuming
> >> fieldtype=PETSC_VTK_POINT_FIELD. I would be totally fine with
> >> replacing POINT by CELL everywhere, since all my data is cell-centered.
> >>
> >> I think coordinates need to be PointData, not coordinates of cell 
> >> centroids.
> >> DMDA doesn't have that concept.  (Maybe it should, but adding it is
> >> no small task and hacking the output is likely to create a lot of
> >> edge cases.)
> >>
> > I had a look at DMDAVTKWriteAll_VTS. You are totally right that this
> > is not generic. www.vtk.org/VTK/img/file-formats.pdf describes the
> > StructuredGrid format. "Points" contain only one DataArray specifying
> > the coordinates whereas Cells contain three DataArrays with
> > connectivity, offsets and types.
> 
> Are you mixing up CellData from the StructuredGrid XML spec with Cells from
> the UnstructuredGrid?
Ah, yes. You are right! OK, wrong reasoning, but maybe a good conclusion 
nevertheless. 

> > What if I use DMDAVTKWriteAll_VTR and write a RectilinearGrid? Then I
> could just replace PointData with CellData and "hack" only the extent which
> should leave no edge cases...
> 
> I think it's feasible.  You could use DMDAGetInterpolationType() to
> determine which variant to use.

I will try it and see where I can go with it. 


Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Buesing, Henrik
> >> > I have structured cell-centered data and would like to visualize
> >> > this with
> >> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write
> >> data in *.vts format. I would like to tell PETSc that the fieldtype
> >> is PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField.
> >> >
> >> > Is this the way to go? I was thinking maybe a DMDASetFieldType
> >> > exists, but
> >> did not find any. If yes, what is the PetscViewerVTKWriteFunction I
> >> need to provide?
> >>
> >> DMDA does not explicitly support distinguishing between cell and
> >> point values.  PetscViewerVTKAddField is a developer level routine
> >> and you would need to implement a function similar to
> >> DMDAVTKWriteAll_VTS (not at all trivial and you need to read the code
> >> because it is responsible for almost everything).
> >
> > I am looking at src/sys/classes/viewer/impls/vtk/vtkv.c. There is a
> reference to PETSC_VTK_POINT_FIELD vs. PETSC_VTK_CELL_FIELD. Judging
> from the output I get, I was assuming fieldtype=PETSC_VTK_POINT_FIELD. I
> would be totally fine with replacing POINT by CELL everywhere, since all my
> data is cell-centered.
> 
> I think coordinates need to be PointData, not coordinates of cell centroids.
> DMDA doesn't have that concept.  (Maybe it should, but adding it is no small
> task and hacking the output is likely to create a lot of edge cases.)
> 
I had a look at DMDAVTKWriteAll_VTS. You are totally right that this is not 
generic. www.vtk.org/VTK/img/file-formats.pdf describes the StructuredGrid 
format. "Points" contain only one DataArray specifying the coordinates whereas 
Cells contain three DataArrays with connectivity, offsets and types. 

What if I use DMDAVTKWriteAll_VTR and write a RectilinearGrid? Then I could 
just replace PointData with CellData and "hack" only the extent which should 
leave no edge cases...



[petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Buesing, Henrik
Dear all,

I have structured cell-centered data and would like to visualize this with 
Paraview. Up to now I use PetscViewerVTKOpen and VecView to write data in *.vts 
format. I would like to tell PETSc that the fieldtype is PETSC_VTK_CELL_FIELD. 
I have found PetscViewerVTKAddField.

Is this the way to go? I was thinking maybe a DMDASetFieldType exists, but did 
not find any. If yes, what is the PetscViewerVTKWriteFunction I need to provide?
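For reference, the workflow described above (DMDA vector to *.vts via PetscViewerVTKOpen and VecView) can be sketched as below. This is my minimal reconstruction against the PETSc 3.7-era C API; grid sizes and vector contents are placeholders, and PETSc 3.8+ additionally requires DMSetUp after DMDACreate3d:

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  Vec            u;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* 8x8x8 structured grid, 1 dof per grid point (placeholder sizes) */
  ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DM_BOUNDARY_NONE, DMDA_STENCIL_BOX, 8, 8, 8,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da, &u);CHKERRQ(ierr);
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);   /* stand-in for real field data */
  /* VecView on a DMDA vector with a .vts viewer writes StructuredGrid VTK;
     the values come out as PointData, which is the issue discussed here */
  ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD, "output.vts", FILE_MODE_WRITE,
                            &viewer);CHKERRQ(ierr);
  ierr = VecView(u, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  return PetscFinalize();
}
```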

Thank you!
Henrik

--
Dipl.-Math. Henrik Büsing
Institute for Applied Geophysics and Geothermal Energy
E.ON Energy Research Center
RWTH Aachen University

Mathieustr. 10| Tel +49 (0)241 80 49907
52074 Aachen, Germany | Fax +49 (0)241 80 49889

http://www.eonerc.rwth-aachen.de/GGE
hbues...@eonerc.rwth-aachen.de



[petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
Dear all,

I would like to write HDF5 and VTK files in parallel. I found Vec example 19:
"Parallel HDF5 Vec Viewing". But I do not understand how I tell PETSc to write 
a DMDA in parallel when doing VecView. At the moment everything is done on 
process 0. Can PETSc use parallel HDF5?
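The HDF5 side of the question, in a minimal sketch (my illustration, assuming PETSc is configured with HDF5; the file name "sol.h5" and the dataset name "solution" are placeholders, and u is a global vector created from a DMDA):

```c
#include <petscdmda.h>
#include <petscviewerhdf5.h>

/* Sketch: write a DMDA global vector with the HDF5 viewer.  With an
 * MPI-enabled HDF5 build, the write is collective (VecView_MPI_HDF5_DA). */
static PetscErrorCode WriteHDF5(Vec u)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  /* the object name becomes the dataset name in the HDF5 file */
  ierr = PetscObjectSetName((PetscObject)u, "solution");CHKERRQ(ierr);
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "sol.h5", FILE_MODE_WRITE,
                             &viewer);CHKERRQ(ierr);
  ierr = VecView(u, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}
```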

Regarding VTK: Is it possible that every process dumps its part of the Vec in a 
separate file and lets Paraview combine them? I think Firedrake does it in this 
way.

Thank you!
Henrik








Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

> Dear all,
>
> I have structured cell-centered data and would like to visualize this with 
> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write data in 
> *.vts format. I would like to tell PETSc that the fieldtype is 
> PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField.
>
> Is this the way to go? I was thinking maybe a DMDASetFieldType exists, but 
> did not find any. If yes, what is the PetscViewerVTKWriteFunction I need to 
> provide?

DMDA does not explicitly support distinguishing between cell and point
values.  PetscViewerVTKAddField is a developer level routine and you
would need to implement a function similar to DMDAVTKWriteAll_VTS (not
at all trivial and you need to read the code because it is responsible
for almost everything).


[petsc-users] Elliptic operator with Neumann conditions

2018-02-07 Thread Marco Cisternino
Hi everybody,
I would like to ask what solution is computed if I solve the linear system 
for the problem in the subject without creating the null space.
I tried with and without the call to
MatNullSpaceCreate(m_communicator, PETSC_TRUE, 0, NULL, &nullspace);
With the call I get a zero-averaged solution; without it, the same solution 
plus a constant.
How does PETSc work in the second case?
Does it check the matrix for singularity? And is it able to create the null 
space with the constant automatically?
Thanks.
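The variant with the explicit null space can be sketched as follows (my illustration, not code from this thread; "A" stands for the assembled operator, and MatSetNullSpace is the mechanism by which the KSP removes the constant):

```c
#include <petscksp.h>

/* Sketch: attach the constant null space to the system matrix so the
 * Krylov solver projects it out of the solution. */
static PetscErrorCode AttachConstantNullSpace(Mat A)
{
  MatNullSpace   nsp;
  PetscErrorCode ierr;

  /* PETSC_TRUE: the null space contains the constant vector */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL,
                            &nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nsp);CHKERRQ(ierr);   /* used by KSP automatically */
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr); /* the Mat keeps a reference */
  return 0;
}
```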


Marco Cisternino, PhD
marco.cistern...@optimad.it
___
OPTIMAD Engineering srl
Via Giacinto Collegno 18, Torino, Italia.
+3901119719782
www.optimad.it



Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

> Dear all,
>
> I would like to write HDF5 and VTK files in parallel. I found Vec example 19:
> "Parallel HDF5 Vec Viewing". But I do not understand how I tell PETSc to 
> write a DMDA in parallel when doing VecView. At the moment everything is done 
> on process 0. Can PETSc use parallel HDF5?

Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with MPI?

> Regarding VTK: Is it possible that every process dumps its part of the Vec in 
> a separate file and lets Paraview combine them? I think Firedrake does it in 
> this way.

This creates a filesystem metadata problem and is not supported by
PETSc's VTK viewers.

It would be possible to write binary-appended files by scattering the
metadata back to the processes which then open the output file and seek
to the appropriate place (instead of serializing their data through rank
0).  This would be worthwhile and not particularly hard to implement if
you want to try.  The problem is that VTK readers usually don't
parallelize over such files so you've just moved the problem.  Our usual
advice is to use HDF5 with collective IO if running at large scale.


Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
>> Can PETSc use parallel HDF5?
> Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
> H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with MPI?

I just --download-hdf5. Do I need to do something else?

> 
> > Regarding VTK: Is it possible that every process dumps its part of the Vec
> > in a separate file and lets Paraview combine them? I think Firedrake does
> > it in this way.
> 
> This creates a filesystem metadata problem and is not supported by PETSc's
> VTK viewers.
> ...
>  Our usual advice is to use HDF5 with
> collective IO if running at large scale.

I understand. I will stick to HDF5 then. 

Thank you!
Henrik


Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

>> >>> Can PETSc use parallel HDF5?
>> >> Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
>> >> H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with
>> MPI?
>> >
>> > I just --download-hdf5. Do I need to do something else?
>> 
>> That should configure it with whichever MPI PETSc is using.  How are you
>> concluding that it's only using rank 0?
>
> I have to check with the person working on the parallel IO. But if built and 
> linked correctly, it should work in parallel with PetscViewerHDF5Open and 
> VecView?

Yes.


Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

>> > I have structured cell-centered data and would like to visualize this with
>> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write data in
>> *.vts format. I would like to tell PETSc that the fieldtype is
>> PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField.
>> >
>> > Is this the way to go? I was thinking maybe a DMDASetFieldType exists, but
>> did not find any. If yes, what is the PetscViewerVTKWriteFunction I need to
>> provide?
>> 
>> DMDA does not explicitly support distinguishing between cell and point
>> values.  PetscViewerVTKAddField is a developer level routine and you would
>> need to implement a function similar to DMDAVTKWriteAll_VTS (not at all
>> trivial and you need to read the code because it is responsible for almost
>> everything).
>
> I am looking at src/sys/classes/viewer/impls/vtk/vtkv.c. There is a reference 
> to PETSC_VTK_POINT_FIELD vs. PETSC_VTK_CELL_FIELD. Judging from the output I 
> get, I was assuming fieldtype=PETSC_VTK_POINT_FIELD. I would be totally fine 
> with replacing POINT by CELL everywhere, since all my data is cell-centered. 

I think coordinates need to be PointData, not coordinates of cell
centroids.  DMDA doesn't have that concept.  (Maybe it should, but
adding it is no small task and hacking the output is likely to create a
lot of edge cases.)

> The only other references I find are in src/dm/impls/plex/plex.c and 
> plexvtk.c and plexvtu.c. But do I go through plex when just using 
> DMDACreate3d?

DMPlex is a different DM and can't be used to write structured output.


Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
> >>> Can PETSc use parallel HDF5?
> >> Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
> >> H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with
> MPI?
> >
> > I just --download-hdf5. Do I need to do something else?
> 
> That should configure it with whichever MPI PETSc is using.  How are you
> concluding that it's only using rank 0?
I have to check with the person working on the parallel IO. But if built and 
linked correctly, it should work in parallel with PetscViewerHDF5Open and 
VecView?




Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
> >> >>> Can PETSc use parallel HDF5?
> >> >> Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
> >> >> H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with
> >> MPI?
> >> >
> >> > I just --download-hdf5. Do I need to do something else?
> >>
> >> That should configure it with whichever MPI PETSc is using.  How are
> >> you concluding that it's only using rank 0?
> >
> > I have to check with the person working on the parallel IO. But if built
> > and linked correctly, it should work in parallel with PetscViewerHDF5Open
> > and VecView?
> 
> Yes.
Awesome! Thank you!
Henrik


Re: [petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Buesing, Henrik
> > I have structured cell-centered data and would like to visualize this with
> Paraview. Up to now I use PetscViewerVTKOpen and VecView to write data in
> *.vts format. I would like to tell PETSc that the fieldtype is
> PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField.
> >
> > Is this the way to go? I was thinking maybe a DMDASetFieldType exists, but
> did not find any. If yes, what is the PetscViewerVTKWriteFunction I need to
> provide?
> 
> DMDA does not explicitly support distinguishing between cell and point
> values.  PetscViewerVTKAddField is a developer level routine and you would
> need to implement a function similar to DMDAVTKWriteAll_VTS (not at all
> trivial and you need to read the code because it is responsible for almost
> everything).

I am looking at src/sys/classes/viewer/impls/vtk/vtkv.c. There is a reference 
to PETSC_VTK_POINT_FIELD vs. PETSC_VTK_CELL_FIELD. Judging from the output I 
get, I was assuming fieldtype=PETSC_VTK_POINT_FIELD. I would be totally fine 
with replacing POINT by CELL everywhere, since all my data is cell-centered. 

The only other references I find are in src/dm/impls/plex/plex.c and plexvtk.c 
and plexvtu.c. But do I go through plex when just using DMDACreate3d?

Thank you!
Henrik



Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Jed Brown
"Buesing, Henrik"  writes:

>>> Can PETSc use parallel HDF5?
>> Yeah, the implementation is in VecView_MPI_HDF5_DA and uses
>> H5FD_MPIO_COLLECTIVE if supported.  Did you build your HDF5 with MPI?
>
> I just --download-hdf5. Do I need to do something else?

That should configure it with whichever MPI PETSc is using.  How are you
concluding that it's only using rank 0?