Re: [petsc-users] GPUs, cud, complex

2019-02-21 Thread Smith, Barry F. via petsc-users
Hmm, ex32 suddenly becomes ex39 (and there is no ex39 in the src/ksp/ksp/examples/tutorials/ directory?) I try ex32 with those options and it runs, though the -n1 n2 n3 options aren't used. Barry > On Feb 21, 2019, at 6:20 PM, Randall Mackie wrote: > > Hi Barry and Satish, > > Yes,
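A hedged aside, not taken from the thread: assuming ex32 is the DMDA-based Laplacian tutorial, its grid dimensions would be set through DMDA options rather than -n1/-n2/-n3, e.g.

    ./ex32 -da_grid_x 32 -da_grid_y 32 -ksp_monitor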

Re: [petsc-users] GPUs, cud, complex

2019-02-21 Thread Smith, Barry F. via petsc-users
Randy, Could you please cut and paste the entire error message you get. It worked for me. I assume you mean -dm_mat_type aijcusparse, not aijcuda (which doesn't exist). Satish, it does appear we do not have a nightly test for cuda and complex, could that test be added to
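A minimal sketch of the command line being discussed (not verbatim from the thread), assuming a PETSc build configured with --with-cuda=1 --with-scalar-type=complex:

    mpiexec -n 1 ./ex32 -dm_mat_type aijcusparse -dm_vec_type cuda -ksp_view

Here -dm_mat_type aijcusparse stores the assembled operator in cuSPARSE format and -dm_vec_type cuda keeps the vectors on the GPU; as noted above, an aijcuda matrix type does not exist.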

[petsc-users] Question with filedsplit in PETSc

2019-02-21 Thread Zhu, Qiming via petsc-users
Dear all, Sorry to disturb you. I am a user of PETSc. I am trying to use Fieldsplit in PETSc to do preconditioning for a Navier-Stokes problem. I have some problems when I try to use the Fieldsplit function. I am now defining the nest matrix first, then I get the IS from the nested matrix. But I
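Not part of the quoted message, just a hedged sketch of how a MATNEST can be wired into PCFIELDSPLIT: the index sets are recovered from the nest matrix itself with MatNestGetISs and handed to the preconditioner. The blocks Auu, Aup, Apu, App and the split names "u"/"p" are assumptions for illustration only.

    #include <petscksp.h>

    /* Sketch: given four already-assembled blocks of a Navier-Stokes operator,
       build a MATNEST, recover its row index sets, and configure PCFIELDSPLIT
       with them.                                                              */
    static PetscErrorCode SetupFieldSplit(Mat Auu, Mat Aup, Mat Apu, Mat App, KSP *ksp)
    {
      Mat            subs[4] = {Auu, Aup, Apu, App};
      Mat            A;
      IS             isg[2];
      PC             pc;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, subs, &A);CHKERRQ(ierr);
      ierr = MatNestGetISs(A, isg, NULL);CHKERRQ(ierr);        /* row ISs of the nest */
      ierr = KSPCreate(PETSC_COMM_WORLD, ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(*ksp, A, A);CHKERRQ(ierr);
      ierr = KSPGetPC(*ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
      ierr = PCFieldSplitSetIS(pc, "u", isg[0]);CHKERRQ(ierr); /* velocity split      */
      ierr = PCFieldSplitSetIS(pc, "p", isg[1]);CHKERRQ(ierr); /* pressure split      */
      ierr = KSPSetFromOptions(*ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }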

Re: [petsc-users] About DMDA (and extracting its ordering)

2019-02-21 Thread Thibaut Appel via petsc-users
Hi Matthew, Is the first part of your answer (using DMDASetBlockFills) valid only in the case where I create a DMDA object? Yes, I think that is the kind of stencil I am using. I could know what the stencil looks like exactly, but I preallocate by looping, for each process, on all the elements of the
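Not from the message itself, but a hedged illustration of the DMDASetBlockFills call mentioned above, assuming a DMDA with 2 degrees of freedom per grid point and a star stencil; the 0/1 fill patterns and grid size are placeholders only.

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM             da;
      Mat            A;
      /* which components couple within a grid point (diagonal block) ...   */
      PetscInt       dfill[4] = {1, 1,
                                 0, 1};
      /* ... and which couple to stencil neighbours (off-diagonal blocks)   */
      PetscInt       ofill[4] = {1, 0,
                                 0, 1};
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                          DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                          2, 1, NULL, NULL, &da);CHKERRQ(ierr);
      ierr = DMSetUp(da);CHKERRQ(ierr);
      ierr = DMDASetBlockFills(da, dfill, ofill);CHKERRQ(ierr); /* before DMCreateMatrix */
      ierr = DMCreateMatrix(da, &A);CHKERRQ(ierr);              /* tighter preallocation */
      ierr = MatDestroy(&A);CHKERRQ(ierr);
      ierr = DMDestroy(&da);CHKERRQ(ierr);
      return PetscFinalize();
    }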

Re: [petsc-users] Using PETSc in Cray systems

2019-02-21 Thread Matthew Knepley via petsc-users
On Thu, Feb 21, 2019 at 10:46 AM Najib Alia via petsc-users <petsc-users@mcs.anl.gov> wrote: > Dear all, > > we are trying to compile our Finite Element code on a Cray system and > have a problem with PETSc and available packages: "unable to find > scotch64", the variable PETSC_SINGLE_LIBRARY is
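Not part of the quoted reply, just a hedged sketch of pointing a build at a PETSc installation explicitly through PETSC_DIR/PETSC_ARCH and the installed PETSc.pc pkg-config file; all paths, the arch name, and the test source file are placeholders:

    export PETSC_DIR=/path/to/petsc
    export PETSC_ARCH=arch-cray-opt
    export PKG_CONFIG_PATH=$PETSC_DIR/$PETSC_ARCH/lib/pkgconfig:$PKG_CONFIG_PATH
    cc $(pkg-config --cflags PETSc) test_petsc.c $(pkg-config --libs PETSc)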

[petsc-users] About DMDA (and extracting its ordering)

2019-02-21 Thread Thibaut Appel via petsc-users
Dear PETSc developers/users, I’m solving linear PDEs on a regular grid with high-order finite differences, assembling an MPIAIJ matrix to solve linear systems or eigenvalue problems. I’ve been using vertex major, natural ordering for the parallelism with PetscSplitOwnership (yielding
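An editorial sketch, not from the message: one way to extract the ordering of a DMDA is through its application ordering (AO), which maps natural (vertex-major) global indices to PETSc's distribution-dependent global indices. Grid size and the sample indices below are illustrative only.

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM             da;
      AO             ao;
      PetscInt       idx[3] = {0, 10, 100};   /* natural (vertex-major) global indices */
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                          DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                          1, 1, NULL, NULL, &da);CHKERRQ(ierr);
      ierr = DMSetUp(da);CHKERRQ(ierr);
      ierr = DMDAGetAO(da, &ao);CHKERRQ(ierr);                /* owned by the DM, do not destroy */
      ierr = AOApplicationToPetsc(ao, 3, idx);CHKERRQ(ierr);  /* idx now in PETSc ordering       */
      ierr = PetscPrintf(PETSC_COMM_WORLD, "%D %D %D\n", idx[0], idx[1], idx[2]);CHKERRQ(ierr);
      ierr = DMDestroy(&da);CHKERRQ(ierr);
      return PetscFinalize();
    }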

[petsc-users] Using PETSc in Cray systems

2019-02-21 Thread Najib Alia via petsc-users
Dear all, we are trying to compile our Finite Element code on a Cray system and have a problem with PETSc and available packages: "unable to find scotch64", the variable PETSC_SINGLE_LIBRARY is set to NOTFOUND, and the compiler tests fail. What we have done: 1) We loaded the corresponding
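An editorial sketch, not from the message: a typical way to sidestep a missing system scotch64 is to configure PETSc with the Cray compiler wrappers and let it download PT-Scotch itself; the module names, flags, and arch name here are assumptions, not the poster's actual setup.

    module load PrgEnv-gnu cray-mpich
    ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn \
                --download-ptscotch --download-metis --download-parmetis
    make PETSC_DIR=$PWD PETSC_ARCH=arch-cray-opt all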