[petsc-users] unsorted local columns in 3.8?

2017-10-30 Thread Randy Michael Churchill
I'm running a Fortran code that was just changed over to using petsc 3.8 (previously petsc 3.7.6). An error was thrown during a KSPSetUp() call. The error is "unsorted iscol_local is not implemented yet" (see full error below). I tried to trace down the difference in the source files, but where

Re: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods

2017-10-30 Thread zakaryah .
You were right, of course. I fixed the problem with the function evaluation and the code seems to be working now, at least on small test problems. Is there a way to set up preallocation of the Jacobian matrix with the entire first row and column non-zero? I set the preallocation error flag to
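
A minimal sketch of one way to express that preallocation in C (the size n and the 5-point stencil width are illustrative; the original post includes no code): every row reserves one extra slot for the dense first column, and the first row reserves the full row.

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      PetscInt n = 100, i, *nnz;

      PetscInitialize(&argc, &argv, NULL, NULL); /* error checking omitted throughout */
      PetscMalloc1(n, &nnz);
      for (i = 0; i < n; i++) nnz[i] = 5 + 1; /* stencil entries plus the dense column 0 */
      nnz[0] = n;                             /* the entire first row is nonzero */
      MatCreate(PETSC_COMM_SELF, &A);
      MatSetSizes(A, n, n, n, n);
      MatSetType(A, MATSEQAIJ);
      MatSeqAIJSetPreallocation(A, 0, nnz);   /* exact per-row counts, so assembly never mallocs */
      PetscFree(nnz);
      MatDestroy(&A);
      return PetscFinalize();
    }

With exact counts like these, the preallocation error flag the post mentions (presumably MAT_NEW_NONZERO_ALLOCATION_ERR) can stay enabled.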

Re: [petsc-users] petsc4py sparse matrix construction time

2017-10-30 Thread Matthew Knepley
On Mon, Oct 30, 2017 at 8:06 PM, Cetinbas, Cankur Firat wrote: > Hello, I am a beginner both in PETSc and mpi4py. I have been working on parallelizing our water transport code (where we solve a linear system of equations) and I started with the toy code below.

[petsc-users] petsc4py sparse matrix construction time

2017-10-30 Thread Cetinbas, Cankur Firat
Hello, I am a beginner both in PETSc and mpi4py. I have been working on parallelizing our water transport code (where we solve a linear system of equations) and I started with the toy code below. The toy code reads right-hand-side (rh), row, column, and value vectors to construct the sparse coefficient
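
A sketch in C of the usual remedy for slow construction from triplets, preallocating from a first counting pass before any insertion (the function and its arguments are illustrative; it assumes the triplets contain no duplicate entries):

    #include <petscmat.h>

    /* Sketch: assemble an n x n sequential AIJ matrix from COO triplets
       (row[k], col[k], val[k]), k = 0..nt-1. Error checking omitted. */
    static void AssembleFromTriplets(PetscInt n, PetscInt nt,
                                     const PetscInt *row, const PetscInt *col,
                                     const PetscScalar *val, Mat *A)
    {
      PetscInt k, *nnz;

      PetscCalloc1(n, &nnz);
      for (k = 0; k < nt; k++) nnz[row[k]]++;   /* pass 1: count nonzeros per row */
      MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 0, nnz, A);
      for (k = 0; k < nt; k++)                  /* pass 2: insert, with no mallocs */
        MatSetValues(*A, 1, &row[k], 1, &col[k], &val[k], INSERT_VALUES);
      MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY);
      PetscFree(nnz);
    }

Without the counting pass, every insertion that overflows the default preallocation forces a reallocation, which is the classic cause of construction times that grow far faster than the number of entries.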

Re: [petsc-users] configuration error

2017-10-30 Thread Satish Balay
The compiler library detection code is a bit messy. It's there to help with inter-language linking. One workaround [to such failures] is to tell configure not to guess, and to specify the relevant info. For example: balay@ipro^~/petsc(master) $ ./configure --download-mpich --download-hypre CC=clang

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Mark Lohry
Sparsity pattern (binary AIJ, gzipped) here; it should have 100^2 blocks of all 1's indicating the non-zero positions: https://github.com/mlohry/petsc_miscellany/blob/master/jacobian_sparsity.dat.gz Using 32 processes on a 2x16-core AMD 6274, timing for MatColoringApply is ~877 seconds (~15 minutes) for
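
For reference, a sketch in C of the calls around the step being timed (the coloring type shown is an assumption; it can also be switched at run time with -mat_coloring_type):

    /* Sketch: build a coloring of the Jacobian's sparsity pattern for
       finite-difference assembly. J holds the pattern; error checking omitted. */
    MatColoring mc;
    ISColoring  iscoloring;

    MatColoringCreate(J, &mc);
    MatColoringSetType(mc, MATCOLORINGGREEDY); /* assumed; jp, sl, lf, id also exist */
    MatColoringSetFromOptions(mc);             /* honor -mat_coloring_type overrides */
    MatColoringApply(mc, &iscoloring);         /* the ~877 s step reported above */
    MatColoringDestroy(&mc);

The resulting ISColoring is what MatFDColoringCreate() consumes; the choice of coloring algorithm can change the cost of MatColoringApply dramatically, especially in parallel.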

Re: [petsc-users] configuration error

2017-10-30 Thread Satish Balay
--prefix=/Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.8.0/../ You have a strange prefix. You are basically using: --prefix=/Users/manav/Documents/codes/numerical_lib/petsc The general convention is to use a different prefix for different versions of libraries [or different type of

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Smith, Barry F.
> On Oct 30, 2017, at 2:23 PM, Mark Lohry wrote: > > Hmm, are those blocks dense? If so you could benefit enormously from using BAIJ format. > Yes, they're dense blocks. Usually coupled compressible 3D NS with DG elements, 5 equations x order

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Mark Lohry
> Hmm, are those blocks dense? If so you could benefit enormously from using BAIJ format. Yes, they're dense blocks. Usually coupled compressible 3D NS with DG elements, 5 equations x order (N+1)*(N+2)*(N+3)/3 block size. So block sizes of 50^2 to 175^2 are typical. I'll try BAIJ; I
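
For reference, a minimal sketch in C of creating such a matrix (the block size and preallocation counts are illustrative, not from the thread):

    /* Sketch: BAIJ storage for dense blocks. bs = 50 would correspond to
       5 equations x a 10-function DG basis. Note that d_nz = 5 and o_nz = 2
       count blocks per block row, not scalar entries. Error checking omitted. */
    Mat      A;
    PetscInt bs = 50, mb = 1000;   /* block size and local block rows (assumed) */

    MatCreateBAIJ(PETSC_COMM_WORLD, bs,
                  mb * bs, mb * bs, PETSC_DETERMINE, PETSC_DETERMINE,
                  5, NULL, 2, NULL, &A);

Whole blocks are then inserted at once with MatSetValuesBlocked(), and BAIJ stores one column index per block rather than per scalar, cutting index storage by roughly a factor of bs^2 relative to AIJ.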

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Smith, Barry F.
> On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: > > Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian, does it? > No, I just mean I'm doing initial partitioning and parallel communication for the residual evaluations independently of

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Mark Lohry
> Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian, does it? No, I just mean I'm doing initial partitioning and parallel communication for the residual evaluations independently of petsc, and then doing a 1-to-1 mapping to the petsc solution vector. Along with
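
If useful, PETSc's AO (application ordering) object is one built-in facility for exactly this kind of 1-to-1 index mapping; a sketch (n_local and the index array are assumptions):

    /* Sketch: map application-defined global indices to PETSc's contiguous
       per-rank ordering. Error checking omitted. */
    AO        ao;
    PetscInt  n_local = 1000;   /* locally owned entries (assumed) */
    PetscInt *app_idx;          /* application's global index of each local entry */

    /* ... allocate and fill app_idx ... */
    AOCreateBasic(PETSC_COMM_WORLD, n_local, app_idx, NULL, &ao);
    /* passing NULL for the PETSc indices means the natural contiguous ordering */
    AOApplicationToPetsc(ao, n_local, app_idx); /* app_idx now holds PETSc indices */
    AODestroy(&ao);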

Re: [petsc-users] Updating Fortran code to petsc 3.8

2017-10-30 Thread Satish Balay
On Mon, 30 Oct 2017, Randy Michael Churchill wrote: > > Please clarify. 1) You can successfully update to 3.8 and compile and run the code? > I haven't updated all of the code to 3.8; I wanted to make sure there weren't any tricks to maintain backwards compatibility before I

Re: [petsc-users] Updating Fortran code to petsc 3.8

2017-10-30 Thread Randy Michael Churchill
> Please clarify. 1) You can successfully update to 3.8 and compile and run the code? I haven't updated all of the code to 3.8; I wanted to make sure there weren't any tricks to maintain backwards compatibility before I change everything. I'm only in the process of changing several of the

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Smith, Barry F.
> On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the Jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these.
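
Independent of any pre/post hooks, one way to separate those costs is PETSc's log stages; a sketch (the stage name is illustrative):

    /* Sketch: bracket the region of interest in a named stage so that
       -log_view reports its time and flops separately. */
    PetscLogStage stage;

    PetscLogStageRegister("FD Jacobian", &stage);
    PetscLogStagePush(stage);
    /* ... code whose cost is being measured ... */
    PetscLogStagePop();

Events such as MatColoringApply also appear as individual lines in the -log_view summary, even without custom stages.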

Re: [petsc-users] Updating Fortran code to petsc 3.8

2017-10-30 Thread Smith, Barry F.
Please clarify. 1) You can successfully update to 3.8 and compile and run the code? 2) You do not have a way to support both 3.7 and 3.8 except by putting a large number of #ifdef in the code? Yes, 3.8 is a dramatic shift in API (to one we feel is much better); it is not simple to write
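
The usual way to keep one source tree working against both versions is the comparison macros from petscversion.h; a sketch, using the MatGetSubMatrix-to-MatCreateSubMatrix rename in 3.8 as the example:

    /* Sketch: select the 3.7 or 3.8 spelling of a renamed call at compile time. */
    #include <petscmat.h>   /* petscversion.h comes in with any PETSc header */

    #if PETSC_VERSION_LT(3,8,0)
      ierr = MatGetSubMatrix(A, isrow, iscol, MAT_INITIAL_MATRIX, &B);CHKERRQ(ierr);
    #else
      ierr = MatCreateSubMatrix(A, isrow, iscol, MAT_INITIAL_MATRIX, &B);CHKERRQ(ierr);
    #endif

This works from Fortran sources as well when they are run through the preprocessor, which is the "large number of #ifdef" cost the message refers to.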

[petsc-users] Updating Fortran code to petsc 3.8

2017-10-30 Thread Randy Michael Churchill
I'm updating my Fortran code to petsc 3.8. I have several modules with types, and often when a type is initialized in the code, there are certain Petsc variables that aren't ready to be initialized, so they are set to 0. Later, in several parts of the code, these are checked with if statements,

Re: [petsc-users] configuration error

2017-10-30 Thread Manav Bhatia
I am now getting errors with mumps (please see below). Interestingly, I just compiled this on another machine with clang-3.8 and gfortran-6.7 without problems. -Manav mumps_c.c:307:53: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'? mumps_par->n=0;

Re: [petsc-users] configuration error

2017-10-30 Thread Manav Bhatia
Fande, I made the change you recommended and it seems to have moved past that stage in the configuration. Thanks for your help! Regards, Manav > On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote: > We had exactly the same issue when we upgraded compilers. I guess

Re: [petsc-users] Poisson problem with boundaries inside the domain

2017-10-30 Thread Mani Chandra
Thanks! f[j][i] = u[j][i] - ub did it. It even works with the automated Jacobian assembly. On Mon, Oct 30, 2017 at 9:28 PM, Smith, Barry F. wrote: > If you are using DMDA then you can't just "remove" some grid points from the vector. > What you need to do is for

Re: [petsc-users] configuration error

2017-10-30 Thread Kong, Fande
We had exactly the same issue when we upgraded compilers. I guess this is somehow related to gfortran. A simple way to work around it for us is to change "if with_rpath:" to "if False" at line 54 of config/BuildSystem/config/libraries.py. Not sure if it works for you. Fande, On Mon, Oct 30,

Re: [petsc-users] Poisson problem with boundaries inside the domain

2017-10-30 Thread Smith, Barry F.
If you are using DMDA then you can't just "remove" some grid points from the vector. What you need to do is, for your function evaluation at these interior grid points, do f[j][i] = u[j][i] - ub in the FormFunction (ub is the Dirichlet value) and put a 1 on the diagonal of that row of the
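
A sketch in C of what that looks like inside a DMDA residual evaluation (the marked point and the value ub are assumptions; grid-spacing factors are omitted):

    #include <petscdmda.h>

    /* Sketch: 2D DMDA residual enforcing a Dirichlet value at one interior
       point. The stencil is the standard 5-point Laplacian (cf. SNES ex5). */
    static PetscErrorCode FormFunctionLocal(DMDALocalInfo *info, PetscScalar **u,
                                            PetscScalar **f, void *ctx)
    {
      PetscInt    i, j, i_bc = info->mx / 2, j_bc = info->my / 2; /* assumed location */
      PetscScalar ub = 1.0;                                       /* assumed value */

      for (j = info->ys; j < info->ys + info->ym; j++)
        for (i = info->xs; i < info->xs + info->xm; i++) {
          if (i == 0 || j == 0 || i == info->mx - 1 || j == info->my - 1)
            f[j][i] = u[j][i];        /* outer Dirichlet boundary (homogeneous, assumed) */
          else if (i == i_bc && j == j_bc)
            f[j][i] = u[j][i] - ub;   /* interior Dirichlet point, as described above */
          else
            f[j][i] = 4.0*u[j][i] - u[j][i-1] - u[j][i+1] - u[j-1][i] - u[j+1][i];
        }
      return 0;
    }

With the residual in this form, a finite-difference or analytic Jacobian gets exactly the 1 on the diagonal of that row that the reply calls for.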

[petsc-users] Poisson problem with boundaries inside the domain

2017-10-30 Thread Mani Chandra
Hello, I'm trying to solve the Poisson problem but with Dirichlet boundaries inside the domain (in addition to those imposed in the ghost zones). I'm using DMDA to create a structured grid and SNES coupled to this DMDA to solve the problem. The issue is that SNES doesn't converge when I impose

Re: [petsc-users] preconditioning matrix-free newton-krylov

2017-10-30 Thread Smith, Barry F.
> On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > Thanks again Barry, I've got the preconditioners hooked up with -snes_mf_operator and at least AMG looks to be working great on a high-order unstructured DG problem. > A couple of questions on the SNESSetLagJacobian +
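
For context, the lagging under discussion is a single call (the lag value of 5 is illustrative):

    /* Sketch: with -snes_mf_operator the Jacobian-vector products stay
       matrix-free; lagging only controls how often the assembled matrix
       used to build the preconditioner is recomputed. */
    SNESSetLagJacobian(snes, 5);   /* rebuild every 5 Newton steps;
                                      same as -snes_lag_jacobian 5 */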