Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 5:27 PM, Luc Berger-Vergiat wrote: > Ok, I get it. Then if I have multiple subdomains on the local processor, is and is_local will be arrays of IS that represent each subdomain on that processor? Yep. Matt > Best, > Luc > On 06/02/2016
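
A minimal C sketch of the pattern discussed above: when one rank owns several ASM subdomains, the is[] and is_local[] arguments of PCASMSetLocalSubdomains() are per-rank arrays of IS, where is[] holds the (possibly overlapping) subdomains and is_local[] their non-overlapping local parts. The helper name, the two-rank row split, and all index values here are illustrative only and are not taken from the thread.

/* Illustrative sketch: two ASM subdomains on each rank of a hypothetical
   8x8 system split across 2 ranks.  Assumes pc was obtained with KSPGetPC()
   and the matrix uses the usual contiguous row distribution
   (rank 0 owns rows 0-3, rank 1 owns rows 4-7). */
#include <petscksp.h>

PetscErrorCode SetTwoLocalSubdomains(PC pc)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  IS             is[2], is_local[2];          /* one IS per local subdomain */
  PetscInt       base, inner0[2], inner1[2], outer0[3], outer1[3], i;

  PetscFunctionBeginUser;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  base = 4*rank;

  /* is_local[]: non-overlapping pieces, rows {base,base+1} and {base+2,base+3} */
  for (i=0; i<2; i++) { inner0[i] = base+i; inner1[i] = base+2+i; }
  ierr = ISCreateGeneral(PETSC_COMM_SELF,2,inner0,PETSC_COPY_VALUES,&is_local[0]);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF,2,inner1,PETSC_COPY_VALUES,&is_local[1]);CHKERRQ(ierr);

  /* is[]: the same pieces extended by one row of within-rank overlap */
  outer0[0] = base;   outer0[1] = base+1; outer0[2] = base+2;
  outer1[0] = base+1; outer1[1] = base+2; outer1[2] = base+3;
  ierr = ISCreateGeneral(PETSC_COMM_SELF,3,outer0,PETSC_COPY_VALUES,&is[0]);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF,3,outer1,PETSC_COPY_VALUES,&is[1]);CHKERRQ(ierr);

  ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
  /* n = 2 subdomains on this rank; is[] and is_local[] are arrays of IS */
  ierr = PCASMSetLocalSubdomains(pc,2,is,is_local);CHKERRQ(ierr);

  /* the PC keeps its own references, so these local handles can be released */
  for (i=0; i<2; i++) {
    ierr = ISDestroy(&is[i]);CHKERRQ(ierr);
    ierr = ISDestroy(&is_local[i]);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

With one subdomain per rank the same call simply takes n = 1 and one-element arrays, which is the point confirmed in the reply above.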

Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Luc Berger-Vergiat
Ok, I get it. Then if I have multiple subdomains on the local processor, is and is_local will be arrays of IS that represent each subdomain on that processor? Best, Luc. On 06/02/2016 06:21 PM, Matthew Knepley wrote: On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat

Re: [petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat wrote: > Hi all, I would like a quick clarification on what is and is_local represent in PCASMSetLocalSubdomains(). My understanding is that if I have two MPI ranks and four subdomains I can end up having

[petsc-users] Clarification on PCASMSetLocalSubdomain is and is_local

2016-06-02 Thread Luc Berger-Vergiat
Hi all, I would like a quick clarification on what is and is_local represent in PCASMSetLocalSubdomains(). My understanding is that if I have two MPI ranks and four subdomains I can end up having four blocks that I can denote as follows: | domain1 | domain2 |

Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Re, Makes sense to read the documentation; I will try other preconditioners. Thanks for the support. 2016-06-02 18:15 GMT+02:00 Matthew Knepley: > On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 wrote: >> Hi Satish, Thanks for the
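
A side note on the fix hinted at above (the full error text is truncated in this archive, so the diagnosis is only an assumption): when a petsc4py script runs serially but fails inside KSPSetUp() under mpiexec, a frequent cause is a preconditioner or factorization that only exists for sequential matrices. Below is a minimal C sketch of switching to a parallel-capable preconditioner using standard PETSc calls; the helper name is hypothetical.

/* Sketch: pick a preconditioner that works for parallel (MPIAIJ) matrices.
   ksp is assumed to be an already created KSP with its operators set. */
#include <petscksp.h>

PetscErrorCode UseParallelFriendlyPC(KSP ksp)
{
  PetscErrorCode ierr;
  PC             pc;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCBJACOBI);CHKERRQ(ierr);   /* block Jacobi: one local block per rank */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);    /* still overridable with -ksp_type / -pc_type */
  PetscFunctionReturn(0);
}

The same experiment can be done without touching any code via run-time options such as -pc_type bjacobi -sub_pc_type ilu, assuming the script initializes petsc4py with sys.argv as the demos typically do.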

Re: [petsc-users] (no subject)

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 wrote: > Hi Satish, Thanks for the correction. The error message is now slightly different, but the result is the same (serial runs fine, parallel with mpirun fails with the following error): Now the error is correct. You

Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Satish, Thanks for the correction. The error message is now slightly different, but the result is the same (serial runs fine, parallel with mpirun fails with the following error):
[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in

Re: [petsc-users] (no subject)

2016-06-02 Thread Satish Balay
With petsc-master you would have to use petsc4py-master, i.e. try petsc-eab7b92 with petsc4py-6e8e093. Satish. On Thu, 2 Jun 2016, neok m4700 wrote: > Hi Matthew, I've rebuilt petsc // petsc4py with the following versions: 3.7.0 // 3.7.0 => same runtime error; 00c67f3 // 3.7.1 => fails

Re: [petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi Matthew, I've rebuilt petsc // petsc4py with the following versions:
3.7.0 // 3.7.0 => same runtime error
00c67f3 // 3.7.1 => fails to build petsc4py (error below)
00c67f3 // 6e8e093 => same as above
f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
In file included from

Re: [petsc-users] (no subject)

2016-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 wrote: > Hi, I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the examples in the demo directory. I believe this was fixed in 'master':

[petsc-users] (no subject)

2016-06-02 Thread neok m4700
Hi, I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the examples in the demo directory.
$ python test_mat_ksp.py => runs as expected (serial)
$ mpiexec -np 2 python test_mat_ksp.py => fails with the following output:
Traceback (most recent call last): File