Re: [petsc-users] dmplex face normals orientation

2017-04-19 Thread Ingo Gaertner
Thank you, Matt, this answers my question. Ingo 2017-04-18 22:21 GMT+02:00 Matthew Knepley: > On Tue, Apr 18, 2017 at 2:33 PM, Ingo Gaertner wrote: >> The part that does not make sense is: The code calculates that face 11 (or edge 11 ...
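The orientation question can also be answered programmatically rather than by inspecting the mesh by hand. Below is a C sketch using documented DMPlex calls (DMPlexComputeCellGeometryFVM, DMPlexGetSupport); the helper name CheckFaceNormals and the printing are illustrative only. It compares each face normal with the vector from the centroid of the face's first support cell to the face centroid, so the sign of the dot product says whether that normal points out of that cell, whatever ordering convention produced it.

  #include <petscdmplex.h>

  /* Illustrative helper (the name is made up): for every face, report whether
     the normal returned by DMPlexComputeCellGeometryFVM points out of the
     first cell in the face's support. */
  static PetscErrorCode CheckFaceNormals(DM dm)
  {
    PetscInt       dim, fStart, fEnd, f;
    PetscErrorCode ierr;

    ierr = DMGetDimension(dm, &dim);CHKERRQ(ierr);
    /* Faces (edges in 2D) are the height-1 points of the DAG. */
    ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr);
    for (f = fStart; f < fEnd; ++f) {
      const PetscInt *support;
      PetscInt        supportSize, d;
      PetscReal       fArea, fCentroid[3], fNormal[3];
      PetscReal       cVol, cCentroid[3], cNormal[3], dot = 0.0;

      ierr = DMPlexGetSupportSize(dm, f, &supportSize);CHKERRQ(ierr);
      if (!supportSize) continue;
      ierr = DMPlexGetSupport(dm, f, &support);CHKERRQ(ierr);
      ierr = DMPlexComputeCellGeometryFVM(dm, f, &fArea, fCentroid, fNormal);CHKERRQ(ierr);
      ierr = DMPlexComputeCellGeometryFVM(dm, support[0], &cVol, cCentroid, cNormal);CHKERRQ(ierr);
      /* Positive dot product: the face normal points away from support[0]. */
      for (d = 0; d < dim; ++d) dot += fNormal[d]*(fCentroid[d] - cCentroid[d]);
      ierr = PetscPrintf(PETSC_COMM_SELF, "face %D: normal is %s for cell %D\n",
                         f, (dot > 0.0) ? "outward" : "inward", support[0]);CHKERRQ(ierr);
    }
    return 0;
  }

Using the centroid-to-centroid direction as the reference avoids hard-coding any assumption about how DMPlex orders the support cells.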

Re: [petsc-users] Installation question

2017-04-19 Thread Satish Balay
Sorry - should have mentioned: do 'rm -rf arch-linux-cxx-opt' and rerun configure. The MPICH install from the previous build [currently in arch-linux-cxx-opt/] is conflicting with --with-mpi-dir=/app1/centos6.3/gnu/mvapich2-1.9/. Satish On Wed, 19 Apr 2017, Pham Pham wrote: > I ...
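For completeness, a minimal sketch of the clean rebuild being described, run from the PETSc source directory and reusing the arch name and MPI path quoted in this thread (the bracketed part stands for whatever other configure options were used originally):

  rm -rf arch-linux-cxx-opt
  ./configure --with-mpi-dir=/app1/centos6.3/gnu/mvapich2-1.9/ [other options as before]

then rebuild with the make command that configure prints when it finishes.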

Re: [petsc-users] Installation question

2017-04-19 Thread Satish Balay
Presumably your cluster already has a recommended MPI to use [which is already installed]. So you should use that instead of --download-mpich=1. Satish On Wed, 19 Apr 2017, Pham Pham wrote: > Hi, I just installed petsc-3.7.5 on my university cluster. When evaluating the computer ...
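To illustrate the difference: --download-mpich=1 builds a private MPICH inside the PETSc arch directory, whereas pointing configure at the cluster's own MPI keeps PETSc consistent with the MPI that the cluster's job launcher expects. Assuming the cluster publishes its MPI through environment modules (a common but not universal setup; the module name below is a placeholder), the flow is roughly:

  module avail                 # see which MPI builds the cluster provides
  module load mvapich2         # or whatever the recommended MPI module is called
  which mpicc                  # the directory above bin/ is what --with-mpi-dir should point to
  ./configure --with-mpi-dir=/path/to/that/mpi [other options]

Alternatively, --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 lets configure pick up the compiler wrappers directly.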

Re: [petsc-users] GAMG for the unsymmetrical matrix

2017-04-19 Thread Kong, Fande
Thanks, Mark. Now the total compute time using GAMG is competitive with ASM. It looks like I cannot use something like "-mg_level_1_ksp_type gmres" because this option makes the compute time much worse. Fande, On Thu, Apr 13, 2017 at 9:14 AM, Mark Adams wrote: > On ...
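For context, the option experiments being discussed are all driven from the command line. A sketch of a GAMG setup often suggested for unsymmetric operators (option names are taken from the PCGAMG/PCMG documentation; whether they help is entirely problem-dependent, as the timing comparison in this thread shows):

  -pc_type gamg -pc_gamg_agg_nsmooths 0 -pc_gamg_sym_graph true
  -mg_levels_ksp_type richardson -mg_levels_pc_type sor

A single level can still be overridden with a level-specific prefix (e.g. options beginning with -mg_levels_1_), which is the kind of per-level switch to GMRES that turned out to be slower here.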

Re: [petsc-users] VecAssembly gives segmentation fault with MPI

2017-04-19 Thread Karl Rupp
Hi Francesco, please don't drop petsc-users from the communication; keeping the list involved will likely get you better and faster answers. Since your current build has debugging turned off, please reconfigure with debugging turned on, as the error message says. Chances are good that you will get ...
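Turning debugging back on is a configure-time switch. A minimal sketch, assuming the rest of the configure options stay the same as in the original build:

  ./configure --with-debugging=1 [other options as before]

followed by the rebuild command configure prints. With a debug build, PETSc's error handler generally reports a usable stack trace instead of a bare segmentation fault.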

Re: [petsc-users] VecAssembly gives segmentation fault with MPI

2017-04-19 Thread Jed Brown
Please always use "reply-all" so that your messages go to the list. This is standard mailing list etiquette. It is important to preserve threading for people who find this discussion later and so that we do not waste our time re-answering the same questions that have already been answered in ...

Re: [petsc-users] VecAssembly gives segmentation fault with MPI

2017-04-19 Thread Karl Rupp
Hi Francesco, please consider the following: a) run your code through valgrind to locate the segmentation fault. Maybe there is already a memory access problem in the sequential version. b) send any error messages as well as the stack trace. c) what is your intent with "do in = nnod_loc"?
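Point (a) in practice: valgrind can be run under MPI by placing it between the launcher and the executable. A minimal sketch (the executable name and rank count are placeholders):

  mpiexec -n 2 valgrind --track-origins=yes --log-file=valgrind.%p.log ./my_app [app options]

Running the sequential case under valgrind first, as suggested, is usually the quickest way to see whether the memory error already exists before MPI enters the picture.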

[petsc-users] VecAssembly gives segmentation fault with MPI

2017-04-19 Thread Francesco Migliorini
Hello! I have an MPI code in which a linear system is created and solved with PETSc. It works in a sequential run, but when I use multiple cores, VecAssemblyBegin/End gives a segmentation fault. Here's a sample of my code: call PetscInitialize(PETSC_NULL_CHARACTER,perr) ind(1) = ...
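Since the quoted Fortran is cut off, here is a hedged C sketch of the standard parallel assembly pattern such a crash usually comes down to. It is not the code from this thread, only the shape the loop needs to have: values are set with global indices, each rank works from its ownership range, and VecAssemblyBegin/End are called collectively on all ranks.

  #include <petscvec.h>

  int main(int argc, char **argv)
  {
    Vec            v;
    PetscInt       rstart, rend, i, N = 100; /* global size; placeholder value */
    PetscScalar    val;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &v);CHKERRQ(ierr);
    ierr = VecGetOwnershipRange(v, &rstart, &rend);CHKERRQ(ierr);
    /* Set entries using GLOBAL indices; here each rank only touches rows it
       owns, although off-process entries are also legal and are communicated
       during assembly. */
    for (i = rstart; i < rend; ++i) {
      val  = (PetscScalar)i;
      ierr = VecSetValues(v, 1, &i, &val, INSERT_VALUES);CHKERRQ(ierr);
    }
    /* Assembly is collective: every rank must call both routines, even a rank
       that inserted nothing. */
    ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(v);CHKERRQ(ierr);
    ierr = VecDestroy(&v);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Common causes of a parallel-only crash in this pattern are indices that are valid only in the sequential numbering, or one rank skipping the assembly calls.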