Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-28 Thread neok m4700
Hello Barry,

Thank you for answering.

I quote the DMDA webpage:
"The vectors can be thought of as either cell centered or vertex centered
on the mesh. But some variables cannot be cell centered and others vertex
centered."

So if I use this, then when creating the DMDA the overall size will be the
number of nodes, with nodal coordinates, and by setting the DMDA_Q0
interpolation type together with DM_BOUNDARY_PERIODIC I should be able to
recover the solution at cell centers?
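
For concreteness, this minimal petsc4py sketch is what I have in mind (run on
one process; the keyword names and the half-cell shift are my own assumptions,
not taken from the documentation):

    from petsc4py import PETSc

    M = 8  # number of cells == number of vertices in the periodic case
    da = PETSc.DMDA().create(dim=1, dof=1, sizes=[M], stencil_width=1,
                             boundary_type=[PETSc.DM.BoundaryType.PERIODIC])
    da.setUniformCoordinates(xmin=0.0, xmax=1.0)

    # The stored coordinates are the vertex positions 0, h, ..., (M-1)*h with h = 1/M;
    # for a cell-centered (DMDA_Q0) view I would shift them by h/2 myself.
    # (DMDASetInterpolationType(da, DMDA_Q0) is the C-side call; I am not sure it
    #  is exposed in petsc4py, so I treat that part as an assumption.)
    h = 1.0 / M
    x_vertex = da.getCoordinates().getArray()
    x_center = x_vertex + 0.5 * h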

Is that possible in PETSc, or should I stick to the nodal representation of
my problem?

thanks.

2017-04-27 20:03 GMT+02:00 Barry Smith <bsm...@mcs.anl.gov>:

>
> > On Apr 27, 2017, at 12:43 PM, neok m4700 <neok.m4...@gmail.com> wrote:
> >
> > Hi Matthew,
> >
> > Thank you for the clarification; however, it is unclear why there is an
> additional unknown in the case of periodic BCs.
> >
> > Please see the attachment to this email for what I'd like to achieve: the
> number of unknowns does not change when switching to the periodic case, e.g.
> for a Laplace operator.
>
>So here you are thinking in terms of cell-centered discretizations. You
> are correct in that case that the number of "unknowns" is the same for both
> Dirichlet and periodic boundary conditions.
>
>DMDA was originally written in support of vertex-centered coordinates;
> this was then extended somewhat with DMDASetInterpolationType(), where
> DMDA_Q1 represents piecewise linear vertex-centered while DMDA_Q0
> represents piecewise constant cell-centered.
>
> If you look at the source code for DMDASetUniformCoordinates(), it is
> written in the context of vertex-centered grids, where the coordinates are
> stored for each vertex:
>
>    if (bx == DM_BOUNDARY_PERIODIC) hx = (xmax-xmin)/M;
>    else hx = (xmax-xmin)/(M-1);
>    ierr = VecGetArray(xcoor,&coors);CHKERRQ(ierr);
>    for (i=0; i<isize; i++) {
>      coors[i] = xmin + hx*(i+istart);
>    }
>
> Note that in the periodic case, say the domain [0,1) vertex centered with 3
> grid points (in the global problem), the coordinates for the vertices are 0,
> 1/3, 2/3. If you are using cell-centered and have 3 cells, the coordinates
> of the vertices are again 0, 1/3, 2/3.
>
> Note that in the cell-centered case we are storing in each location of the
> vector the coordinates of a vertex, not the coordinates of the cell center,
> so it is a little "wonky".
>
>There is no contradiction between what you are saying and what we are
> saying.
>
>Barry
>
> >
> > And in the case of Dirichlet or Neumann BCs, the boundary (extremum) cells
> add information to the RHS; they do not appear in the matrix formulation.
> >
> > Hope I was clear enough,
> > thanks
> >
> >
> > 2017-04-27 16:15 GMT+02:00 Matthew Knepley <knep...@gmail.com>:
> > On Thu, Apr 27, 2017 at 3:46 AM, neok m4700 <neok.m4...@gmail.com>
> wrote:
> > Hi,
> >
> > I am trying to change my problem to using periodic boundary conditions.
> >
> > However, when I use DMDASetUniformCoordinates on the DA, the spacing
> changes.
> >
> > This is due to an additional point e.g. in dm/impls/da/gr1.c
> >
> > else if (dim == 2) {
> > if (bx == DM_BOUNDARY_PERIODIC) hx = (xmax-xmin)/(M);
> > else hx = (xmax-xmin)/(M-1);
> > if (by == DM_BOUNDARY_PERIODIC) hy = (ymax-ymin)/(N);
> > else hy = (ymax-ymin)/(N-1);
> >
> > I don't understand the logic here: since xmin and xmax refer to the
> physical domain, how does changing to a periodic BC change the
> discretization?
> >
> > Could someone clarify or point to a reference ?
> >
> > Just do a 1D example with 3 vertices. With a normal domain, you have 2
> cells
> >
> >   1-2-3
> >
> > so each cell is 1/2 of the domain. In a periodic domain, the last vertex
> is connected to the first, so we have 3 cells
> >
> >   1-2-3-1
> >
> > and each is 1/3 of the domain.
> >
> >Matt
> >
> > Thanks
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > -- Norbert Wiener
> >
> > <1D.pdf>
>
>


Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread neok m4700
Hi Matthew,

Thank you for the clarification; however, it is unclear why there is an
additional unknown in the case of periodic BCs.

Please see the attachment to this email for what I'd like to achieve: the
number of unknowns does not change when switching to the periodic case, e.g.
for a Laplace operator.

And in the case of Dirichlet or Neumann BCs, the boundary (extremum) cells add
information to the RHS; they do not appear in the matrix formulation.
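
To make the comparison concrete, here is a small serial petsc4py sketch of the
two 1D Laplace operators I have in mind (my own illustration, not code from the
attachment):

    from petsc4py import PETSc

    def laplacian_1d(n, periodic):
        # n-by-n 1D Laplacian, cell centered, with the grid spacing scaled out
        A = PETSc.Mat().createAIJ([n, n], nnz=3, comm=PETSc.COMM_SELF)
        for i in range(n):
            A.setValue(i, i, 2.0)
            if i > 0:
                A.setValue(i, i - 1, -1.0)
            if i < n - 1:
                A.setValue(i, i + 1, -1.0)
        if periodic:
            # the wrap-around couplings take the place of the boundary closure
            A.setValue(0, n - 1, -1.0)
            A.setValue(n - 1, 0, -1.0)
        # with Dirichlet/Neumann the missing couplings end up in the RHS instead
        A.assemble()
        return A

    A_dir = laplacian_1d(8, periodic=False)
    A_per = laplacian_1d(8, periodic=True)
    assert A_dir.getSize() == A_per.getSize() == (8, 8)  # same number of unknowns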

Hope I was clear enough,
thanks


2017-04-27 16:15 GMT+02:00 Matthew Knepley <knep...@gmail.com>:

> On Thu, Apr 27, 2017 at 3:46 AM, neok m4700 <neok.m4...@gmail.com> wrote:
>
>> Hi,
>>
>> I am trying to change my problem to using periodic boundary conditions.
>>
>> However, when I use DMDASetUniformCoordinates on the DA, the spacing
>> changes.
>>
>> This is due to an additional point e.g. in dm/impls/da/gr1.c
>>
>> else if (dim == 2) {
>> if (bx == DM_BOUNDARY_PERIODIC) hx = (xmax-xmin)/(M);
>> else hx = (xmax-xmin)/(M-1);
>> if (by == DM_BOUNDARY_PERIODIC) hy = (ymax-ymin)/(N);
>> else hy = (ymax-ymin)/(N-1);
>>
>> I don't understand the logic here: since xmin and xmax refer to the
>> physical domain, how does changing to a periodic BC change the
>> discretization?
>>
>> Could someone clarify or point to a reference ?
>>
>
> Just do a 1D example with 3 vertices. With a normal domain, you have 2
> cells
>
>   1-2-3
>
> so each cell is 1/2 of the domain. In a periodic domain, the last vertex
> is connected to the first, so we have 3 cells
>
>   1-2-3-1
>
> and each is 1/3 of the domain.
>
>Matt
>
>
>> Thanks
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>


1D.pdf
Description: Adobe PDF document


[petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread neok m4700
Hi,

I am trying to change my problem to using periodic boundary conditions.

However, when I use DMDASetUniformCoordinates on the DA, the spacing
changes.

This is due to an additional point e.g. in dm/impls/da/gr1.c

else if (dim == 2) {
  if (bx == DM_BOUNDARY_PERIODIC) hx = (xmax-xmin)/(M);
  else hx = (xmax-xmin)/(M-1);
  if (by == DM_BOUNDARY_PERIODIC) hy = (ymax-ymin)/(N);
  else hy = (ymax-ymin)/(N-1);

I don't understand the logic here: since xmin and xmax refer to the physical
domain, how does changing to a periodic BC change the discretization?
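
For reference, this is roughly how I am seeing the spacing change (a petsc4py
sketch run on one process; the keyword names are from memory, so treat them as
assumptions):

    from petsc4py import PETSc

    M = N = 4
    for btype in (PETSc.DM.BoundaryType.NONE, PETSc.DM.BoundaryType.PERIODIC):
        da = PETSc.DMDA().create(dim=2, dof=1, sizes=[M, N], stencil_width=1,
                                 boundary_type=[btype, btype])
        da.setUniformCoordinates(xmin=0.0, xmax=1.0, ymin=0.0, ymax=1.0)
        xy = da.getCoordinates().getArray().reshape(-1, 2)  # interlaced (x, y) per vertex
        hx = xy[1, 0] - xy[0, 0]  # spacing between the first two vertices along x
        print(btype, hx)          # 1/(M-1) without periodicity, 1/M with it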


Could someone clarify or point to a reference?

Thanks


Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Re,

Makes sense to read the documentation; I will try another preconditioner.
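
Concretely, I will try something along these lines in the test script: block
Jacobi with ICC on each local block, which as I understand it does run in
parallel (a rough petsc4py sketch; the 'sub_pc_type' option name for the
sub-blocks is my reading of the bjacobi docs):

    from petsc4py import PETSc

    # small SPD test system standing in for the demo's matrix
    n = 32
    A = PETSc.Mat().createAIJ([n, n], nnz=3)
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()
    b = A.createVecLeft()
    b.set(1.0)
    x = A.createVecRight()

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType(PETSc.KSP.Type.CG)
    ksp.getPC().setType(PETSc.PC.Type.BJACOBI)      # parallel PC; ICC applied per block
    PETSc.Options().setValue('sub_pc_type', 'icc')  # picked up by the bjacobi sub-solvers
    ksp.setFromOptions()
    ksp.solve(b, x)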

Thanks for the support.

2016-06-02 18:15 GMT+02:00 Matthew Knepley <knep...@gmail.com>:

> On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 <neok.m4...@gmail.com> wrote:
>
>> Hi Satish,
>>
>> Thanks for the correction.
>>
>> The error message is now slightly different, but the result is the same
>> (serial runs fine, parallel with mpirun fails with following error):
>>
>
> Now the error is correct. You are asking to run ICC in parallel, which we
> do not support. It is telling you
> to look at the table of available solvers.
>
>   Thanks,
>
> Matt
>
>
>> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> [0] PCSetUp_ICC() line 21 in<...>/src/ksp/pc/impls/factor/icc/icc.c
>> [0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
>> [0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
>> for possible LU and Cholesky solvers
>> [0] Could not locate a solver package. Perhaps you must ./configure with
>> --download-
>>
>>
>>
>>
>> 2016-06-02 17:58 GMT+02:00 Satish Balay <ba...@mcs.anl.gov>:
>>
>>> with petsc-master - you would have to use petsc4py-master.
>>>
>>> i.e. try petsc-eab7b92 with petsc4py-6e8e093
>>>
>>> Satish
>>>
>>>
>>> On Thu, 2 Jun 2016, neok m4700 wrote:
>>>
>>> > Hi Matthew,
>>> >
>>> > I've rebuilt petsc // petsc4py with following versions:
>>> >
>>> > 3.7.0 // 3.7.0 => same runtime error
>>> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
>>> > 00c67f3 // 6e8e093 => same as above
>>> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
>>> >
>>> > In file included from src/PETSc.c:3:0:
>>> > src/petsc4py.PETSc.c: In function
>>> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
>>> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument
>>> 4 of
>>> > ‘DMPlexCreateBoxMesh’
>>> >__pyx_t_4 =
>>> > __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
>>> > __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if
>>> (unlikely(__pyx_t_4 ==
>>> > -1)) __PYX_ERR(42, 49, __pyx_L1_error)
>>> >
>>> > using
>>> > - numpy 1.11.0
>>> > - openblas 0.2.18
>>> > - openmpi 1.10.2
>>> >
>>> > Thanks
>>> >
>>> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley <knep...@gmail.com>:
>>> >
>>> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 <neok.m4...@gmail.com>
>>> wrote:
>>> > >
>>> > >> Hi,
>>> > >>
>>> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and
>>> ran the
>>> > >> examples in the demo directory.
>>> > >>
>>> > >
>>> > > I believe this was fixed in 'master':
>>> > >
>>> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>>> > >
>>> > > Is it possible to try this?
>>> > >
>>> > >   Thanks,
>>> > >
>>> > > Matt
>>> > >
>>> > >
>>> > >> $ python test_mat_ksp.py
>>> > >> => runs as expected (serial)
>>> > >>
>>> > >> $ mpiexec -np 2 python test_mat_ksp.py
>>> > >> => fails with the following output:
>>> > >>
>>> > >> Traceback (most recent call last):
>>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
>>> > >> execfile('petsc-ksp.py')
>>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>>> > >> try: exec(fh.read()+"\n", globals, locals)
>>> > >>   File "", line 15, in 
>>> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>>> > >> (src/petsc4py.PETSc.c:153555)
>>> > >> petsc4py.PETSc.Error: error code 92
>>> > >> [0] KSPSolve() line 

Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Satish,

Thanks for the correction.

The error message is now slightly different, but the result is the same
(serial runs fine, parallel with mpirun fails with the following error):

[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
[0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
[0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
[0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
for possible LU and Cholesky solvers
[0] Could not locate a solver package. Perhaps you must ./configure with
--download-




2016-06-02 17:58 GMT+02:00 Satish Balay <ba...@mcs.anl.gov>:

> with petsc-master - you would have to use petsc4py-master.
>
> i.e. try petsc-eab7b92 with petsc4py-6e8e093
>
> Satish
>
>
> On Thu, 2 Jun 2016, neok m4700 wrote:
>
> > Hi Matthew,
> >
> > I've rebuilt petsc // petsc4py with following versions:
> >
> > 3.7.0 // 3.7.0 => same runtime error
> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
> > 00c67f3 // 6e8e093 => same as above
> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
> >
> > In file included from src/PETSc.c:3:0:
> > src/petsc4py.PETSc.c: In function
> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4
> of
> > ‘DMPlexCreateBoxMesh’
> >__pyx_t_4 =
> > __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
> > __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if (unlikely(__pyx_t_4
> ==
> > -1)) __PYX_ERR(42, 49, __pyx_L1_error)
> >
> > using
> > - numpy 1.11.0
> > - openblas 0.2.18
> > - openmpi 1.10.2
> >
> > Thanks
> >
> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley <knep...@gmail.com>:
> >
> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 <neok.m4...@gmail.com>
> wrote:
> > >
> > >> Hi,
> > >>
> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran
> the
> > >> examples in the demo directory.
> > >>
> > >
> > > I believe this was fixed in 'master':
> > >
> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
> > >
> > > Is it possible to try this?
> > >
> > >   Thanks,
> > >
> > > Matt
> > >
> > >
> > >> $ python test_mat_ksp.py
> > >> => runs as expected (serial)
> > >>
> > >> $ mpiexec -np 2 python test_mat_ksp.py
> > >> => fails with the following output:
> > >>
> > >> Traceback (most recent call last):
> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
> > >> execfile('petsc-ksp.py')
> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
> > >> try: exec(fh.read()+"\n", globals, locals)
> > >>   File "", line 15, in 
> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
> > >> (src/petsc4py.PETSc.c:153555)
> > >> petsc4py.PETSc.Error: error code 92
> > >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> > >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> > >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> > >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
> > >> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
> > >> [0] You cannot overwrite this option since that will conflict with
> other
> > >> previously set options
> > >> [0] Could not locate solver package (null). Perhaps you must
> ./configure
> > >> with --download-(null)
> > >>
> > >> <...>
> > >> ---
> > >> Primary job  terminated normally, but 1 process returned
> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
> > >> ---
> > >>
> --
> > >> mpirun detected that one or more processes exited with non-zero
> status,
> > >> thus causing
> > >> the job to be terminated. The first process to do so was:
> > >>
> > >>   Process name: [[23110,1],0]
> > >>   Exit code:1
> > >>
> --
> > >>
> > >>
> > >> What have I done wrong ?
> > >>
> > >>
> > >>
> > >>
> > >
> > >
> > > --
> > > What most experimenters take for granted before they begin their
> > > experiments is infinitely more interesting than any results to which
> their
> > > experiments lead.
> > > -- Norbert Wiener
> > >
> >
>


Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Matthew,

I've rebuilt petsc // petsc4py with the following versions:

3.7.0 // 3.7.0 => same runtime error
00c67f3 // 3.7.1 => fails to build petsc4py (error below)
00c67f3 // 6e8e093 => same as above
f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above

In file included from src/PETSc.c:3:0:
src/petsc4py.PETSc.c: In function
‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4 of
‘DMPlexCreateBoxMesh’
   __pyx_t_4 =
__pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
__pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if (unlikely(__pyx_t_4 ==
-1)) __PYX_ERR(42, 49, __pyx_L1_error)

using
- numpy 1.11.0
- openblas 0.2.18
- openmpi 1.10.2

Thanks

2016-06-02 16:39 GMT+02:00 Matthew Knepley <knep...@gmail.com>:

> On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 <neok.m4...@gmail.com> wrote:
>
>> Hi,
>>
>> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
>> examples in the demo directory.
>>
>
> I believe this was fixed in 'master':
> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>
> Is it possible to try this?
>
>   Thanks,
>
> Matt
>
>
>> $ python test_mat_ksp.py
>> => runs as expected (serial)
>>
>> $ mpiexec -np 2 python test_mat_ksp.py
>> => fails with the following output:
>>
>> Traceback (most recent call last):
>>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
>> execfile('petsc-ksp.py')
>>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>> try: exec(fh.read()+"\n", globals, locals)
>>   File "", line 15, in 
>>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>> (src/petsc4py.PETSc.c:153555)
>> petsc4py.PETSc.Error: error code 92
>> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
>> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
>> [0] You cannot overwrite this option since that will conflict with other
>> previously set options
>> [0] Could not locate solver package (null). Perhaps you must ./configure
>> with --download-(null)
>>
>> <...>
>> ---
>> Primary job  terminated normally, but 1 process returned
>> a non-zero exit code.. Per user-direction, the job has been aborted.
>> ---
>> --
>> mpirun detected that one or more processes exited with non-zero status,
>> thus causing
>> the job to be terminated. The first process to do so was:
>>
>>   Process name: [[23110,1],0]
>>   Exit code:1
>> --
>>
>>
>> What have I done wrong ?
>>
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>


[petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi,

I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
examples in the demo directory.

$ python test_mat_ksp.py
=> runs as expected (serial)

$ mpiexec -np 2 python test_mat_ksp.py
=> fails with the following output:

Traceback (most recent call last):
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in <module>
    execfile('petsc-ksp.py')
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
    try: exec(fh.read()+"\n", globals, locals)
  File "<string>", line 15, in <module>
  File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve (src/petsc4py.PETSc.c:153555)
petsc4py.PETSc.Error: error code 92
[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
[0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
[0] MatGetFactor() line 4240 in <...>/src/mat/interface/matrix.c
[0] You cannot overwrite this option since that will conflict with other
previously set options
[0] Could not locate solver package (null). Perhaps you must ./configure
with --download-(null)

<...>
---
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
---
--
mpirun detected that one or more processes exited with non-zero status,
thus causing
the job to be terminated. The first process to do so was:

  Process name: [[23110,1],0]
  Exit code:1
--


What have I done wrong?