Re: [petsc-users] Configure nested PCFIELDSPLIT with general index sets

2017-04-28 Thread Matthew Knepley
On Fri, Apr 28, 2017 at 1:09 PM, Matthew Knepley wrote: > On Fri, Apr 28, 2017 at 11:48 AM, Natacha BEREUX wrote: >> Dear Matt, >> Sorry for my (very) late reply. >> I was not able to find the Fortran interface of

Re: [petsc-users] Configure nested PCFIELDSPLIT with general index sets

2017-04-28 Thread Matthew Knepley
On Fri, Apr 28, 2017 at 11:48 AM, Natacha BEREUX wrote: > Dear Matt, > Sorry for my (very) late reply. > I was not able to find the Fortran interface of DMShellSetCreateFieldDecomposition in the latest petsc-3.7.6 Fortran bindings (and my code still fails to link). > I have the

Re: [petsc-users] Configure nested PCFIELDSPLIT with general index sets

2017-04-28 Thread Natacha BEREUX
Dear Matt, Sorry for my (very) late reply. I was not able to find the Fortran interface of DMShellSetCreateFieldDecomposition in the latest petsc-3.7.6 Fortran bindings (and my code still fails to link). I have the feeling that it is missing in the master branch, and I was not able to get it on Bitbucket
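
As a possible workaround while that Fortran interface is missing, the splits can be defined directly with PCFieldSplitSetIS, which accepts general index sets. A minimal sketch, not from the thread; the helper name, the split names "u"/"p", and the index sets isU/isP are hypothetical:

  #include <petscksp.h>

  /* Sketch: define the outer PCFIELDSPLIT from general index sets.
     The split names become options prefixes, e.g. -fieldsplit_u_ksp_type. */
  static PetscErrorCode SetupNestedFieldSplit(KSP ksp, Mat A, IS isU, IS isP)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "u", isU);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "p", isP);CHKERRQ(ierr);
    /* A nested split inside "u" can be requested on the command line
       (-fieldsplit_u_pc_type fieldsplit) and configured on the sub-PC
       obtained from PCFieldSplitGetSubKSP after setup. */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }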

Re: [petsc-users] strange convergence

2017-04-28 Thread Barry Smith
Ok, so boomerAMG algebraic multigrid is not good for the first block. You mentioned the first block has two things glued together? AMG is fantastic for certain problems but doesn't work for everything. Tell us more about the first block, what PDE it comes from, what discretization, and
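
One quick experiment along these lines is to swap the first block's preconditioner for a direct solve and see whether the convergence problem disappears. A sketch, assuming a two-way PCFIELDSPLIT (the helper name is hypothetical; the command-line equivalent would be options of the form -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu, if the splits are named 0 and 1):

  #include <petscksp.h>

  /* Sketch: after setup, fetch the PCFIELDSPLIT sub-solvers and replace
     AMG on the first block by a direct solve, to test whether boomerAMG
     is the source of the bad convergence. */
  static PetscErrorCode UseDirectSolveOnFirstBlock(KSP outer)
  {
    PC             pc, subpc;
    KSP           *subksp;
    PetscInt       nsplits;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(outer, &pc);CHKERRQ(ierr);
    ierr = PCSetUp(pc);CHKERRQ(ierr);            /* sub-KSPs exist only after setup */
    ierr = PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);CHKERRQ(ierr);
    ierr = KSPSetType(subksp[0], KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(subksp[0], &subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc, PCLU);CHKERRQ(ierr); /* direct solve instead of AMG */
    ierr = PetscFree(subksp);CHKERRQ(ierr);      /* caller frees the returned array */
    PetscFunctionReturn(0);
  }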

Re: [petsc-users] Using ViennaCL without recompiling

2017-04-28 Thread Satish Balay
On Fri, 28 Apr 2017, Franco Milicchio wrote: >> Not recompiling your own project is fine. PETSc has an ABI. You just reconfigure/recompile PETSc with ViennaCL support. Then you can use -mat_type viennacl etc. > Thanks for your answer, Matt, but I expressed myself in an ambiguous

Re: [petsc-users] Using ViennaCL without recompiling

2017-04-28 Thread Franco Milicchio
> Not recompiling your own project is fine. PETSc has an ABI. You just reconfigure/recompile PETSc with ViennaCL support. Then you can use -mat_type viennacl etc. Thanks for your answer, Matt, but I expressed myself in an ambiguous way. I cannot recompile PETSc; I can do whatever I want

Re: [petsc-users] Using ViennaCL without recompiling

2017-04-28 Thread Matthew Knepley
On Fri, Apr 28, 2017 at 9:13 AM, Franco Milicchio wrote: > Dear all, > I need to integrate ViennaCL into an existing project that uses PETSc, but for backwards compatibility, we cannot recompile it. Not recompiling your own project is fine. PETSc has an ABI. You just
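
Since only PETSc itself needs rebuilding, application code that creates its objects through the options database picks the ViennaCL types up at runtime with no source changes. A sketch of that pattern (the helper name is hypothetical; the exact type names, e.g. viennacl for Vec and aijviennacl for Mat, depend on the PETSc version):

  #include <petscmat.h>

  /* Sketch: create Mat/Vec through the options database so the same
     application binary can pick up ViennaCL types at runtime, once
     PETSc itself has been reconfigured with ViennaCL support. */
  static PetscErrorCode CreateFromOptions(MPI_Comm comm, PetscInt n, Mat *A, Vec *x)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetFromOptions(*A);CHKERRQ(ierr);  /* honours -mat_type ... */
    ierr = MatSetUp(*A);CHKERRQ(ierr);
    ierr = VecCreate(comm, x);CHKERRQ(ierr);
    ierr = VecSetSizes(*x, PETSC_DECIDE, n);CHKERRQ(ierr);
    ierr = VecSetFromOptions(*x);CHKERRQ(ierr);  /* honours -vec_type ... */
    PetscFunctionReturn(0);
  }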

[petsc-users] Using ViennaCL without recompiling

2017-04-28 Thread Franco Milicchio
Dear all, I need to integrate ViennaCL into an existing project that uses PETSc, but for backwards compatibility, we cannot recompile it. Is there any simple interface to copy Mat and Vec objects into ViennaCL matrix and vector objects? I am sorry if this is a trivial question, but as far as
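
If rebuilding is off the table entirely, the data can still be copied out by hand through PETSc's array accessors. A sketch for a sequential Vec, assuming a real, double-precision PETSc build (the helper name is hypothetical; ViennaCL's generic viennacl::copy does the host-to-device transfer; a sparse Mat would go through a host-side representation to viennacl::compressed_matrix analogously):

  #include <vector>
  #include <petscvec.h>
  #include <viennacl/vector.hpp>

  /* Sketch: copy a sequential PETSc Vec into a viennacl::vector
     without touching the PETSc build. */
  static PetscErrorCode VecToViennaCL(Vec x, viennacl::vector<PetscScalar> &vx)
  {
    const PetscScalar *a;
    PetscInt           n;
    PetscErrorCode     ierr;

    PetscFunctionBeginUser;
    ierr = VecGetLocalSize(x, &n);CHKERRQ(ierr);
    ierr = VecGetArrayRead(x, &a);CHKERRQ(ierr);
    std::vector<PetscScalar> host(a, a + n);               /* host-side staging copy */
    vx.resize(n);
    viennacl::copy(host.begin(), host.end(), vx.begin());  /* host -> device */
    ierr = VecRestoreArrayRead(x, &a);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }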

Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-28 Thread Matthew Knepley
On Fri, Apr 28, 2017 at 2:36 AM, neok m4700 wrote: > Hello Barry, > Thank you for answering. > I quote the DMDA webpage: "The vectors can be thought of as either cell centered or vertex centered on the mesh. But some variables cannot be cell centered and others

Re: [petsc-users] strange convergence

2017-04-28 Thread Hoang Giang Bui
It's in fact quite good:
  Residual norms for fieldsplit_u_ solve.
  0 KSP Residual norm 4.014715925568e+00
  1 KSP Residual norm 2.160497019264e-10
  Residual norms for fieldsplit_wp_ solve.
  0 KSP Residual norm 0.e+00
  0 KSP preconditioned resid norm 4.014715925568e+00
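
For reference, output like the above comes from the per-split residual monitors. A sketch of enabling them programmatically rather than on the command line (the split names "u" and "wp" are taken from the quoted log; the last option is an assumption about where the "preconditioned resid norm" line comes from):

  #include <petscsys.h>

  /* Sketch: turn on the monitors that produce the output quoted above. */
  static PetscErrorCode EnableSplitMonitors(void)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscOptionsSetValue(NULL, "-fieldsplit_u_ksp_monitor", NULL);CHKERRQ(ierr);
    ierr = PetscOptionsSetValue(NULL, "-fieldsplit_wp_ksp_monitor", NULL);CHKERRQ(ierr);
    ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }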

Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-28 Thread neok m4700
Hello Barry, Thank you for answering. I quote the DMDA webpage: "The vectors can be thought of as either cell centered or vertex centered on the mesh. But some variables cannot be cell centered and others vertex centered." So if I use this, then when creating the DMDA the overall size will be
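
On the size question, a sketch of a 1-D periodic DMDA (the helper name is hypothetical): with DM_BOUNDARY_PERIODIC the global size M counts each point exactly once, and the wrap-around neighbour arrives through the ghost points of the local vector rather than through a duplicated endpoint.

  #include <petscdmda.h>

  /* Sketch: 1-D periodic DMDA. Point M-1 sees point 0 through the
     ghost layer; no grid point is stored twice. */
  static PetscErrorCode CreatePeriodicDA(MPI_Comm comm, PetscInt M, DM *da)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMDACreate1d(comm, DM_BOUNDARY_PERIODIC, M,
                        1 /* dof */, 1 /* stencil width */, NULL, da);CHKERRQ(ierr);
    ierr = DMSetFromOptions(*da);CHKERRQ(ierr);  /* required in newer PETSc versions */
    ierr = DMSetUp(*da);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }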