Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 5:27 PM, Luc Berger-Vergiat 
wrote:

> Ok, I get it. Then if I have multiple subdomains on the local processor,
> is and is_local will be arrays of IS that represent each subdomain on
> that processor?
>

Yep.

  Matt
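
For concreteness, a minimal sketch of what that looks like with two subdomains
on one process (the row indices and the wrapper routine below are invented for
illustration; the PETSc calls themselves -- KSPGetPC, PCSetType, ISCreateGeneral,
PCASMSetLocalSubdomains -- are the real API):

#include <petscksp.h>

/* Sketch only: two subdomains owned by this process, with made-up global
   row indices.  is_local[k] holds the rows that belong exclusively to
   subdomain k; is[k] holds the same rows plus the overlap rows borrowed
   from the other sets. */
PetscErrorCode SetTwoLocalSubdomains(KSP ksp)
{
  PC             pc;
  IS             is[2], is_local[2];
  PetscInt       rows0[]    = {0, 1, 2};       /* subdomain 0, no overlap   */
  PetscInt       rows0_ov[] = {0, 1, 2, 3};    /* subdomain 0, with overlap */
  PetscInt       rows1[]    = {3, 4, 5};       /* subdomain 1, no overlap   */
  PetscInt       rows1_ov[] = {2, 3, 4, 5};    /* subdomain 1, with overlap */
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, 3, rows0,    PETSC_COPY_VALUES, &is_local[0]);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, 4, rows0_ov, PETSC_COPY_VALUES, &is[0]);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, 3, rows1,    PETSC_COPY_VALUES, &is_local[1]);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, 4, rows1_ov, PETSC_COPY_VALUES, &is[1]);CHKERRQ(ierr);
  /* one entry per local subdomain: n = 2 here */
  ierr = PCASMSetLocalSubdomains(pc, 2, is, is_local);CHKERRQ(ierr);
  return 0;
}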


> Best,
> Luc
>
> On 06/02/2016 06:21 PM, Matthew Knepley wrote:
>
> On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat < 
> lb2...@columbia.edu> wrote:
>
>> Hi all,
>> I would like a quick clarification on what is and is_local represent in
>> PCASMSetLocalSubdomains().
>> My understanding is that if I have two MPI ranks and four subdomains, I
>> can end up having four blocks that I can denote as follows:
>>
>>        |  domain1  |  domain2  |  domain3  |  domain4  |
>> rank1  |  block11  |  block12  |  block13  |    --     |
>> rank2  |  block21  |  block22  |    --     |  block24  |
>>
>> to each blockIJ I associate isIJ.
>>
>> So for rank1 I will have is=[1,2,3] and is_local=[is11,is12,is13], and
>> for rank2 I will have is=[1,2,4] and is_local=[is21,is22,is24].
>> Or am I not understanding things correctly?
>
>
> I did not understand the above.
>
> The best way to think of this is algebraically. Suppose you have a matrix
> A, and you divide the rows into k disjoint sets where each
> process gets one set. Then is_local on each process is a list of the rows
> in that set. Now we also allow some overlap, which means
> rows in other sets are also used. The is on each process contains both
> is_local and these extra rows from other sets.
>
>   Thanks,
>
>  Matt
>
>
>>
>> --
>> Best,
>> Luc
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
> --
> Best,
> Luc
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Luc Berger-Vergiat
Ok, I get it. Then if I have multiple subdomains on the local processor,
is and is_local will be arrays of IS that represent each subdomain on
that processor?


Best,
Luc

On 06/02/2016 06:21 PM, Matthew Knepley wrote:
On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat 
> wrote:


Hi all,
I would like a quick clarification on what is and is_local represent
in PCASMSetLocalSubdomains().
My understanding is that if I have two MPI ranks and four subdomains,
I can end up having four blocks that I can denote as follows:

       |  domain1  |  domain2  |  domain3  |  domain4  |
rank1  |  block11  |  block12  |  block13  |    --     |
rank2  |  block21  |  block22  |    --     |  block24  |

to each blockIJ I associate isIJ.

So for rank1 I will have is=[1,2,3] and is_local=[is11,is12,is13],
and for rank2 I will have is=[1,2,4] and is_local=[is21,is22,is24].
Or am I not understanding things correctly?


I did not understand the above.

The best way to think of this is algebraically. Suppose you have a 
matrix A, and you divide the rows into k disjoint sets where each
process gets one set. Then is_local on each process is a list of the 
rows in that set. Now we also allow some overlap, which means
rows in other sets are also used. The is on each process contains both 
is_local and these extra rows from other sets.


  Thanks,

 Matt


-- 
Best,

Luc





--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener


--
Best,
Luc



Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat 
wrote:

> Hi all,
> I would like a quick clarification on what is and is_local represent in
> PCASMSetLocalSubdomains().
> My understanding is that if I have two MPI ranks and four subdomains, I can
> end up having four blocks that I can denote as follows:
>
>        |  domain1  |  domain2  |  domain3  |  domain4  |
> rank1  |  block11  |  block12  |  block13  |    --     |
> rank2  |  block21  |  block22  |    --     |  block24  |
>
> to each blockIJ I associate isIJ.
>
> So for rank1 I will have is=[1,2,3] and is_local=[is11,is12,is13], and for
> rank2 I will have is=[1,2,4] and is_local=[is21,is22,is24].
> Or am I not understanding things correctly?


I did not understand the above.

The best way to think of this is algebraically. Suppose you have a matrix
A, and you divide the rows into k disjoint sets where each
process gets one set. Then is_local on each process is a list of the rows
in that set. Now we also allow some overlap, which means
rows in other sets are also used. The is on each process contains both
is_local and these extra rows from other sets.

  Thanks,

 Matt
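
As a minimal sketch of that row-set picture for one subdomain per process (the
row numbers are invented, and the PC is assumed to already have been set to
PCASM):

#include <petscksp.h>

/* Sketch with invented row numbers: this rank owns rows 100..199 (its
   disjoint set), and the overlapping subdomain additionally uses rows
   200..209 taken from the neighboring set. */
PetscErrorCode SetOneLocalSubdomain(PC pc)
{
  IS             is, is_local;
  PetscErrorCode ierr;

  ierr = ISCreateStride(PETSC_COMM_SELF, 100, 100, 1, &is_local);CHKERRQ(ierr); /* rows 100..199 */
  ierr = ISCreateStride(PETSC_COMM_SELF, 110, 100, 1, &is);CHKERRQ(ierr);       /* rows 100..209 */
  ierr = PCASMSetLocalSubdomains(pc, 1, &is, &is_local);CHKERRQ(ierr);
  return 0;
}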


>
> --
> Best,
> Luc
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


[petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Luc Berger-Vergiat

Hi all,
I would like a quick clarification on what is and is_local represent
in PCASMSetLocalSubdomains().
My understanding is that if I have two MPI ranks and four subdomains, I
can end up having four blocks that I can denote as follows:


       |  domain1  |  domain2  |  domain3  |  domain4  |
rank1  |  block11  |  block12  |  block13  |    --     |
rank2  |  block21  |  block22  |    --     |  block24  |

to each blockIJ I associate isIJ.

So for rank1 I will have is=[1,2,3] and is_local=[is11,is12,is13], and 
for rank2 I will have is=[1,2,4] and is_local=[is21,is22,is24].

Or am I not understanding things correctly?

--
Best,
Luc




Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Re,

Makes sense to read the documentation; I will try another preconditioner.

Thanks for the support.

2016-06-02 18:15 GMT+02:00 Matthew Knepley :

> On Thu, Jun 2, 2016 at 11:10 AM, neok m4700  wrote:
>
>> Hi Satish,
>>
>> Thanks for the correction.
>>
>> The error message is now slightly different, but the result is the same
>> (serial runs fine, parallel with mpirun fails with following error):
>>
>
> Now the error is correct. You are asking to run ICC in parallel, which we
> do not support. It is telling you
> to look at the table of available solvers.
>
>   Thanks,
>
> Matt
>
>
>> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> [0] PCSetUp_ICC() line 21 in<...>/src/ksp/pc/impls/factor/icc/icc.c
>> [0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
>> [0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
>> for possible LU and Cholesky solvers
>> [0] Could not locate a solver package. Perhaps you must ./configure with
>> --download-
>>
>>
>>
>>
>> 2016-06-02 17:58 GMT+02:00 Satish Balay :
>>
>>> with petsc-master - you would have to use petsc4py-master.
>>>
>>> i.e try petsc-eab7b92 with petsc4py-6e8e093
>>>
>>> Satish
>>>
>>>
>>> On Thu, 2 Jun 2016, neok m4700 wrote:
>>>
>>> > Hi Matthew,
>>> >
>>> > I've rebuilt petsc // petsc4py with following versions:
>>> >
>>> > 3.7.0 // 3.7.0 => same runtime error
>>> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
>>> > 00c67f3 // 6e8e093 => same as above
>>> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
>>> >
>>> > In file included from src/PETSc.c:3:0:
>>> > src/petsc4py.PETSc.c: In function
>>> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
>>> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument
>>> 4 of
>>> > ‘DMPlexCreateBoxMesh’
>>> >__pyx_t_4 =
>>> > __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
>>> > __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if
>>> (unlikely(__pyx_t_4 ==
>>> > -1)) __PYX_ERR(42, 49, __pyx_L1_error)
>>> >
>>> > using
>>> > - numpy 1.11.0
>>> > - openblas 0.2.18
>>> > - openmpi 1.10.2
>>> >
>>> > Thanks
>>> >
>>> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley :
>>> >
>>> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 
>>> wrote:
>>> > >
>>> > >> Hi,
>>> > >>
>>> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and
>>> ran the
>>> > >> examples in the demo directory.
>>> > >>
>>> > >
>>> > > I believe this was fixed in 'master':
>>> > >
>>> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>>> > >
>>> > > Is it possible to try this?
>>> > >
>>> > >   Thanks,
>>> > >
>>> > > Matt
>>> > >
>>> > >
>>> > >> $ python test_mat_ksp.py
>>> > >> => runs as expected (serial)
>>> > >>
>>> > >> $ mpiexec -np 2 python test_mat_ksp.py
>>> > >> => fails with the following output:
>>> > >>
>>> > >> Traceback (most recent call last):
>>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
>>> > >> execfile('petsc-ksp.py')
>>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>>> > >> try: exec(fh.read()+"\n", globals, locals)
>>> > >>   File "", line 15, in 
>>> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>>> > >> (src/petsc4py.PETSc.c:153555)
>>> > >> petsc4py.PETSc.Error: error code 92
>>> > >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>>> > >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>>> > >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>>> > >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
>>> > >> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
>>> > >> [0] You cannot overwrite this option since that will conflict with
>>> other
>>> > >> previously set options
>>> > >> [0] Could not locate solver package (null). Perhaps you must
>>> ./configure
>>> > >> with --download-(null)
>>> > >>
>>> > >> <...>
>>> > >> ---
>>> > >> Primary job  terminated normally, but 1 process returned
>>> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
>>> > >> ---
>>> > >>
>>> --
>>> > >> mpirun detected that one or more processes exited with non-zero
>>> status,
>>> > >> thus causing
>>> > >> the job to be terminated. The first process to do so was:
>>> > >>
>>> > >>   Process name: [[23110,1],0]
>>> > >>   Exit code:1
>>> > >>
>>> --
>>> > >>
>>> > >>
>>> > >> What have I done wrong ?
>>> > >>
>>> 

Re: [petsc-users] (no subject)

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 11:10 AM, neok m4700  wrote:

> Hi Satish,
>
> Thanks for the correction.
>
> The error message is now slightly different, but the result is the same
> (serial runs fine, parallel with mpirun fails with following error):
>

Now the error is correct. You are asking to run ICC in parallel, which we
do not support. It is telling you
to look at the table of available solvers.

  Thanks,

Matt
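
For reference, one way to follow that advice is to keep ICC for the local
blocks but wrap it in a preconditioner that does run in parallel, such as
block Jacobi or additive Schwarz. Assuming the demo script forwards its
arguments to PETSc (petsc4py.init(sys.argv)) and calls setFromOptions, the
standard runtime options would look like:

  $ mpiexec -np 2 python test_mat_ksp.py -pc_type bjacobi -sub_pc_type icc

or -pc_type asm -sub_pc_type icc. The linear solver table mentioned in the
error message lists external packages that provide truly parallel
factorizations.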


> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> [0] PCSetUp_ICC() line 21 in<...>/src/ksp/pc/impls/factor/icc/icc.c
> [0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
> [0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
> for possible LU and Cholesky solvers
> [0] Could not locate a solver package. Perhaps you must ./configure with
> --download-
>
>
>
>
> 2016-06-02 17:58 GMT+02:00 Satish Balay :
>
>> with petsc-master - you would have to use petsc4py-master.
>>
>> i.e try petsc-eab7b92 with petsc4py-6e8e093
>>
>> Satish
>>
>>
>> On Thu, 2 Jun 2016, neok m4700 wrote:
>>
>> > Hi Matthew,
>> >
>> > I've rebuilt petsc // petsc4py with following versions:
>> >
>> > 3.7.0 // 3.7.0 => same runtime error
>> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
>> > 00c67f3 // 6e8e093 => same as above
>> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
>> >
>> > In file included from src/PETSc.c:3:0:
>> > src/petsc4py.PETSc.c: In function
>> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
>> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument
>> 4 of
>> > ‘DMPlexCreateBoxMesh’
>> >__pyx_t_4 =
>> > __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
>> > __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if
>> (unlikely(__pyx_t_4 ==
>> > -1)) __PYX_ERR(42, 49, __pyx_L1_error)
>> >
>> > using
>> > - numpy 1.11.0
>> > - openblas 0.2.18
>> > - openmpi 1.10.2
>> >
>> > Thanks
>> >
>> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley :
>> >
>> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 
>> wrote:
>> > >
>> > >> Hi,
>> > >>
>> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran
>> the
>> > >> examples in the demo directory.
>> > >>
>> > >
>> > > I believe this was fixed in 'master':
>> > >
>> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>> > >
>> > > Is it possible to try this?
>> > >
>> > >   Thanks,
>> > >
>> > > Matt
>> > >
>> > >
>> > >> $ python test_mat_ksp.py
>> > >> => runs as expected (serial)
>> > >>
>> > >> $ mpiexec -np 2 python test_mat_ksp.py
>> > >> => fails with the following output:
>> > >>
>> > >> Traceback (most recent call last):
>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
>> > >> execfile('petsc-ksp.py')
>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>> > >> try: exec(fh.read()+"\n", globals, locals)
>> > >>   File "", line 15, in 
>> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>> > >> (src/petsc4py.PETSc.c:153555)
>> > >> petsc4py.PETSc.Error: error code 92
>> > >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> > >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> > >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> > >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
>> > >> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
>> > >> [0] You cannot overwrite this option since that will conflict with
>> other
>> > >> previously set options
>> > >> [0] Could not locate solver package (null). Perhaps you must
>> ./configure
>> > >> with --download-(null)
>> > >>
>> > >> <...>
>> > >> ---
>> > >> Primary job  terminated normally, but 1 process returned
>> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
>> > >> ---
>> > >>
>> --
>> > >> mpirun detected that one or more processes exited with non-zero
>> status,
>> > >> thus causing
>> > >> the job to be terminated. The first process to do so was:
>> > >>
>> > >>   Process name: [[23110,1],0]
>> > >>   Exit code:1
>> > >>
>> --
>> > >>
>> > >>
>> > >> What have I done wrong ?
>> > >>
>> > >>
>> > >>
>> > >>
>> > >
>> > >
>> > > --
>> > > What most experimenters take for granted before they begin their
>> > > experiments is infinitely more interesting than any results to which
>> their
>> > > experiments lead.
>> > > -- Norbert Wiener
>> > >
>> >
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Satish,

Thanks for the correction.

The error message is now slightly different, but the result is the same
(serial runs fine, parallel with mpirun fails with following error):

[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
[0] PCSetUp_ICC() line 21 in<...>/src/ksp/pc/impls/factor/icc/icc.c
[0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
[0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
for possible LU and Cholesky solvers
[0] Could not locate a solver package. Perhaps you must ./configure with
--download-




2016-06-02 17:58 GMT+02:00 Satish Balay :

> with petsc-master - you would have to use petsc4py-master.
>
> i.e try petsc-eab7b92 with petsc4py-6e8e093
>
> Satish
>
>
> On Thu, 2 Jun 2016, neok m4700 wrote:
>
> > Hi Matthew,
> >
> > I've rebuilt petsc // petsc4py with following versions:
> >
> > 3.7.0 // 3.7.0 => same runtime error
> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
> > 00c67f3 // 6e8e093 => same as above
> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
> >
> > In file included from src/PETSc.c:3:0:
> > src/petsc4py.PETSc.c: In function
> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4
> of
> > ‘DMPlexCreateBoxMesh’
> >__pyx_t_4 =
> > __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
> > __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if (unlikely(__pyx_t_4
> ==
> > -1)) __PYX_ERR(42, 49, __pyx_L1_error)
> >
> > using
> > - numpy 1.11.0
> > - openblas 0.2.18
> > - openmpi 1.10.2
> >
> > Thanks
> >
> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley :
> >
> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 
> wrote:
> > >
> > >> Hi,
> > >>
> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran
> the
> > >> examples in the demo directory.
> > >>
> > >
> > > I believe this was fixed in 'master':
> > >
> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
> > >
> > > Is it possible to try this?
> > >
> > >   Thanks,
> > >
> > > Matt
> > >
> > >
> > >> $ python test_mat_ksp.py
> > >> => runs as expected (serial)
> > >>
> > >> $ mpiexec -np 2 python test_mat_ksp.py
> > >> => fails with the following output:
> > >>
> > >> Traceback (most recent call last):
> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
> > >> execfile('petsc-ksp.py')
> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
> > >> try: exec(fh.read()+"\n", globals, locals)
> > >>   File "", line 15, in 
> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
> > >> (src/petsc4py.PETSc.c:153555)
> > >> petsc4py.PETSc.Error: error code 92
> > >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> > >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> > >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> > >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
> > >> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
> > >> [0] You cannot overwrite this option since that will conflict with
> other
> > >> previously set options
> > >> [0] Could not locate solver package (null). Perhaps you must
> ./configure
> > >> with --download-(null)
> > >>
> > >> <...>
> > >> ---
> > >> Primary job  terminated normally, but 1 process returned
> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
> > >> ---
> > >>
> --
> > >> mpirun detected that one or more processes exited with non-zero
> status,
> > >> thus causing
> > >> the job to be terminated. The first process to do so was:
> > >>
> > >>   Process name: [[23110,1],0]
> > >>   Exit code:1
> > >>
> --
> > >>
> > >>
> > >> What have I done wrong ?
> > >>
> > >>
> > >>
> > >>
> > >
> > >
> > > --
> > > What most experimenters take for granted before they begin their
> > > experiments is infinitely more interesting than any results to which
> their
> > > experiments lead.
> > > -- Norbert Wiener
> > >
> >
>


Re: [petsc-users] (no subject)

2016-06-02 Thread Satish Balay
with petsc-master - you would have to use petsc4py-master.

i.e try petsc-eab7b92 with petsc4py-6e8e093

Satish
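
A rough sketch of one way to build that pairing from source (the petsc4py
repository URL, configure options, and PETSC_ARCH name below are assumptions
to adapt to your machine):

$ git clone https://bitbucket.org/petsc/petsc petsc && cd petsc
$ git checkout eab7b92                                    # the petsc commit mentioned above
$ export PETSC_DIR=$PWD PETSC_ARCH=arch-linux2-c-debug    # placeholder arch name
$ ./configure && make all                                 # plus whatever configure options you normally use
$ cd .. && git clone https://bitbucket.org/petsc/petsc4py petsc4py && cd petsc4py
$ git checkout 6e8e093                                    # the matching petsc4py commit
$ python setup.py build && python setup.py install --user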


On Thu, 2 Jun 2016, neok m4700 wrote:

> Hi Matthew,
> 
> I've rebuilt petsc // petsc4py with following versions:
> 
> 3.7.0 // 3.7.0 => same runtime error
> 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
> 00c67f3 // 6e8e093 => same as above
> f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
> 
> In file included from src/PETSc.c:3:0:
> src/petsc4py.PETSc.c: In function
> ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
> src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4 of
> ‘DMPlexCreateBoxMesh’
>__pyx_t_4 =
> __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
> __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if (unlikely(__pyx_t_4 ==
> -1)) __PYX_ERR(42, 49, __pyx_L1_error)
> 
> using
> - numpy 1.11.0
> - openblas 0.2.18
> - openmpi 1.10.2
> 
> Thanks
> 
> 2016-06-02 16:39 GMT+02:00 Matthew Knepley :
> 
> > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700  wrote:
> >
> >> Hi,
> >>
> >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
> >> examples in the demo directory.
> >>
> >
> > I believe this was fixed in 'master':
> > https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
> >
> > Is it possible to try this?
> >
> >   Thanks,
> >
> > Matt
> >
> >
> >> $ python test_mat_ksp.py
> >> => runs as expected (serial)
> >>
> >> $ mpiexec -np 2 python test_mat_ksp.py
> >> => fails with the following output:
> >>
> >> Traceback (most recent call last):
> >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
> >> execfile('petsc-ksp.py')
> >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
> >> try: exec(fh.read()+"\n", globals, locals)
> >>   File "", line 15, in 
> >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
> >> (src/petsc4py.PETSc.c:153555)
> >> petsc4py.PETSc.Error: error code 92
> >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
> >> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
> >> [0] You cannot overwrite this option since that will conflict with other
> >> previously set options
> >> [0] Could not locate solver package (null). Perhaps you must ./configure
> >> with --download-(null)
> >>
> >> <...>
> >> ---
> >> Primary job  terminated normally, but 1 process returned
> >> a non-zero exit code.. Per user-direction, the job has been aborted.
> >> ---
> >> --
> >> mpirun detected that one or more processes exited with non-zero status,
> >> thus causing
> >> the job to be terminated. The first process to do so was:
> >>
> >>   Process name: [[23110,1],0]
> >>   Exit code:1
> >> --
> >>
> >>
> >> What have I done wrong ?
> >>
> >>
> >>
> >>
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which their
> > experiments lead.
> > -- Norbert Wiener
> >
> 


Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Matthew,

I've rebuilt petsc // petsc4py with following versions:

3.7.0 // 3.7.0 => same runtime error
00c67f3 // 3.7.1 => fails to build petsc4py (error below)
00c67f3 // 6e8e093 => same as above
f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above

In file included from src/PETSc.c:3:0:
src/petsc4py.PETSc.c: In function
‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4 of
‘DMPlexCreateBoxMesh’
   __pyx_t_4 =
__pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
__pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm))); if (unlikely(__pyx_t_4 ==
-1)) __PYX_ERR(42, 49, __pyx_L1_error)

using
- numpy 1.11.0
- openblas 0.2.18
- openmpi 1.10.2

Thanks

2016-06-02 16:39 GMT+02:00 Matthew Knepley :

> On Thu, Jun 2, 2016 at 9:12 AM, neok m4700  wrote:
>
>> Hi,
>>
>> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
>> examples in the demo directory.
>>
>
> I believe this was fixed in 'master':
> https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>
> Is it possible to try this?
>
>   Thanks,
>
> Matt
>
>
>> $ python test_mat_ksp.py
>> => runs as expected (serial)
>>
>> $ mpiexec -np 2 python test_mat_ksp.py
>> => fails with the following output:
>>
>> Traceback (most recent call last):
>>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
>> execfile('petsc-ksp.py')
>>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>> try: exec(fh.read()+"\n", globals, locals)
>>   File "", line 15, in 
>>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>> (src/petsc4py.PETSc.c:153555)
>> petsc4py.PETSc.Error: error code 92
>> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
>> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
>> [0] You cannot overwrite this option since that will conflict with other
>> previously set options
>> [0] Could not locate solver package (null). Perhaps you must ./configure
>> with --download-(null)
>>
>> <...>
>> ---
>> Primary job  terminated normally, but 1 process returned
>> a non-zero exit code.. Per user-direction, the job has been aborted.
>> ---
>> --
>> mpirun detected that one or more processes exited with non-zero status,
>> thus causing
>> the job to be terminated. The first process to do so was:
>>
>>   Process name: [[23110,1],0]
>>   Exit code:1
>> --
>>
>>
>> What have I done wrong ?
>>
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>


Re: [petsc-users] (no subject)

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 9:12 AM, neok m4700  wrote:

> Hi,
>
> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
> examples in the demo directory.
>

I believe this was fixed in 'master':
https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003

Is it possible to try this?

  Thanks,

Matt


> $ python test_mat_ksp.py
> => runs as expected (serial)
>
> $ mpiexec -np 2 python test_mat_ksp.py
> => fails with the following output:
>
> Traceback (most recent call last):
>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
> execfile('petsc-ksp.py')
>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
> try: exec(fh.read()+"\n", globals, locals)
>   File "", line 15, in 
>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
> (src/petsc4py.PETSc.c:153555)
> petsc4py.PETSc.Error: error code 92
> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
> [0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
> [0] You cannot overwrite this option since that will conflict with other
> previously set options
> [0] Could not locate solver package (null). Perhaps you must ./configure
> with --download-(null)
>
> <...>
> ---
> Primary job  terminated normally, but 1 process returned
> a non-zero exit code.. Per user-direction, the job has been aborted.
> ---
> --
> mpirun detected that one or more processes exited with non-zero status,
> thus causing
> the job to be terminated. The first process to do so was:
>
>   Process name: [[23110,1],0]
>   Exit code:1
> --
>
>
> What have I done wrong ?
>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


[petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi,

I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
examples in the demo directory.

$ python test_mat_ksp.py
=> runs as expected (serial)

$ mpiexec -np 2 python test_mat_ksp.py
=> fails with the following output:

Traceback (most recent call last):
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in 
execfile('petsc-ksp.py')
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
try: exec(fh.read()+"\n", globals, locals)
  File "", line 15, in 
  File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
(src/petsc4py.PETSc.c:153555)
petsc4py.PETSc.Error: error code 92
[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
[0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
[0] MatGetFactor() line 4240 in<...>/src/mat/interface/matrix.c
[0] You cannot overwrite this option since that will conflict with other
previously set options
[0] Could not locate solver package (null). Perhaps you must ./configure
with --download-(null)

<...>
---
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
---
--
mpirun detected that one or more processes exited with non-zero status,
thus causing
the job to be terminated. The first process to do so was:

  Process name: [[23110,1],0]
  Exit code:1
--


What have I done wrong ?