Re: [petsc-users] makefile for compiling multiple sources

2016-03-19 Thread Matthew Overholt
On Fri, Mar 18, 2016 at 9:49 AM, Matthew Overholt <overh...@capesim.com> wrote: Hi, I'm just getting started with PETSc and have been able to configure and run the examples, but as I'm starting to put together a more substantial code with multiple source files I haven't been able to find or create a makefile which works and follows PETSc guidelines. ...

[petsc-users] makefile for compiling multiple sources

2016-03-19 Thread Matthew Overholt
Hi, I'm just getting started with PETSc and have been able to configure and run the examples, but as I'm starting to put together a more substantial code with multiple source files I haven't been able to find or create a makefile which works and follows PETSc guidelines. I've configured ...
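
A minimal sketch of a makefile following PETSc's conventions, assuming PETSC_DIR and PETSC_ARCH are set in the environment; the target name and source files (main.c, grid.c, solver.c) are hypothetical. PETSc's conf/variables and conf/rules supply the compile rules plus the CLINKER and PETSC_LIB variables (the recipe line must begin with a tab):

    OBJS = main.o grid.o solver.o

    include ${PETSC_DIR}/lib/petsc/conf/variables
    include ${PETSC_DIR}/lib/petsc/conf/rules

    myapp: ${OBJS}
    	-${CLINKER} -o myapp ${OBJS} ${PETSC_LIB}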

[petsc-users] Parallel to sequential matrix scatter for PARDISO

2016-05-02 Thread Matthew Overholt
Petsc-users, I want to use PARDISO for a KSPPREONLY solution in a parallel context. I understand that my FEA stiffness matrix for KSP (PARDISO) needs to be of type MATSEQAIJ (according to MATSOLVERMKL_PARDISO), but I would like to assemble this matrix in parallel (MATSBAIJ) and then collect ...
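
One way to get this behavior without hand-coding a scatter is PETSc's PCREDUNDANT, which gathers the parallel matrix onto subcommunicators and factors it there with a sequential solver. A hedged sketch of the runtime options (older releases spell the last option -redundant_pc_factor_mat_solver_package):

    -ksp_type preonly -pc_type redundant \
      -redundant_pc_type lu \
      -redundant_pc_factor_mat_solver_type mkl_pardiso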

[petsc-users] PC Direct Solution failure

2016-07-21 Thread Matthew Overholt
PETSc Users, I am doing a KSPPREONLY solution (of the heat transfer equation using FEA) and comparing several packages like PARDISO and MUMPS, and I am encountering a MatSolve() failure that I am having trouble diagnosing. The matrix inversion fails and I get "nan". The failure only happens ...
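
When a direct solve produces NaNs, the first step is usually to ask KSP why it failed; a minimal diagnostic sketch (ksp, b, x assumed to exist already), usefully combined with the runtime options -ksp_error_if_not_converged and -ksp_view:

    KSPConvergedReason reason;
    KSPSolve(ksp, b, x);
    KSPGetConvergedReason(ksp, &reason);
    if (reason < 0) {  /* negative reasons indicate failure */
      PetscPrintf(PETSC_COMM_WORLD, "Solve failed: %s\n", KSPConvergedReasons[reason]);
    }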

[petsc-users] large PetscCommDuplicate overhead

2016-10-05 Thread Matthew Overholt
Hi Petsc-Users, I am trying to understand an issue where PetscCommDuplicate() calls are taking an increasing percentage of time as I run a fixed-sized problem on more processes. I am using the FEM to solve the steady-state heat transfer equation (K.x = q) using a PC direct solver, like ...
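
For anyone profiling this kind of overhead, PETSc's built-in profiler is the first stop; a sketch of an invocation (the executable name and process count are placeholders), though a sampling profiler is needed to attribute time inside PetscCommDuplicate() itself:

    mpiexec -n 32 ./xheat -log_view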

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-06 Thread Matthew Overholt
> On Oct 5, 2016, at 2:30 PM, Matthew Overholt <overh...@capesim.com> wrote: > Hi Petsc-Users, > I am trying to understand an issue where PetscCommDuplicate() ...

Re: [petsc-users] MatMkl_CPardisoSetCntl

2016-08-22 Thread Matthew Overholt
>> On Aug 22, 2016, at 10:49 AM, Matthew Overholt <overh...@capesim.com> wrote: >> I am using the Intel MKL CPardiso library as a PC direct solver, and I am trying to figure out how to properly set options (the Pardiso and CPardiso "iparm" ...
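For reference, a sketch of how such controls can be set through the factored matrix, assuming a KSPPREONLY/PCLU setup; the icntl/ival values are placeholders (consult Intel's iparm documentation), and older PETSc releases spell the calls PCFactorSetMatSolverPackage()/PCFactorSetUpMatSolverPackage():

    PC  pc;
    Mat F;
    KSPGetPC(ksp, &pc);
    PCFactorSetMatSolverType(pc, MATSOLVERMKL_CPARDISO);
    PCFactorSetUpMatSolverType(pc);     /* creates the factor matrix F */
    PCFactorGetMatrix(pc, &F);
    MatMkl_CPardisoSetCntl(F, 2, 10);   /* placeholder index/value */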

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-11 Thread Matthew Overholt
... What exact machine are you running on? Please run module list so we can see exactly what modules you are using. Please tell us exactly what options you are passing to pat_build? Barry > On Oct 6, 2016, at 10:45 AM, Matthew Overholt <overh...@capesim.com> wrote: ...

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-12 Thread Matthew Overholt
Matthew Overholt <overh...@capesim.com> writes: > Barry, > Subsequent tests with the same code and a problem (input) having a much smaller vertex (equation) count (i.e. a much smaller matrix to invert ...)

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-17 Thread Matthew Overholt
... the calls to PetscCommDuplicate() are the same regardless of geometry, and the time of the call shouldn't depend on the geometry. Would you be able to do another set of tests where you track the time in MPI_Comm_get_attr() and MPI_Barrier() instead of PetscCommDuplicate()? It could be Cray did something "funny" in their implementation ...
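
A self-contained microbenchmark along the lines suggested here, using the built-in MPI_TAG_UB keyval as a stand-in for PETSc's internal communicator attribute (loop count and output format are arbitrary):

    #include <mpi.h>
    #include <stdio.h>
    int main(int argc, char **argv) {
      int    flag;
      void  *val;
      double t0, t1, t2;
      MPI_Init(&argc, &argv);
      t0 = MPI_Wtime();
      for (int i = 0; i < 100000; i++)           /* time repeated attribute lookups */
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_TAG_UB, &val, &flag);
      t1 = MPI_Wtime();
      MPI_Barrier(MPI_COMM_WORLD);               /* time one barrier for comparison */
      t2 = MPI_Wtime();
      printf("avg MPI_Comm_get_attr: %g s, MPI_Barrier: %g s\n",
             (t1 - t0) / 100000, t2 - t1);
      MPI_Finalize();
      return 0;
    }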

[petsc-users] MKL Pardiso Solver Execution Step control

2018-06-04 Thread Matthew Overholt
Hello, I am using KSP in KSPPREONLY mode to do a direct solve on an A*x = b system, with solver algorithms MUMPS, CPardiso and Pardiso. For Pardiso, is it possible to control the solver execution step (denoted "phase" in Intel's docs)? I would like to be able to control when it refactors, as one ...
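
PETSc does not expose Pardiso's phase argument directly, but coarser control is available: when matrix values change with an unchanged nonzero pattern, PETSc reuses the symbolic factorization and redoes only the numeric one, and KSPSetReusePreconditioner() lets you skip refactorization entirely until you choose otherwise. A hedged sketch:

    KSPSetReusePreconditioner(ksp, PETSC_TRUE);   /* keep the current factors */
    /* ... subsequent KSPSolve() calls reuse the old factorization ... */
    KSPSetReusePreconditioner(ksp, PETSC_FALSE);  /* next KSPSolve() refactors */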

Re: [petsc-users] MKL Pardiso Solver Execution Step control

2018-06-05 Thread Matthew Overholt
... the Analysis on output time steps when unsteady, and just for the first fixed-point iteration when steady. Matt... On Tue, Jun 5, 2018 at 5:03 AM, Smith, Barry F. wrote: > On Jun 4, 2018, at 10:32 PM, Matthew Overholt wrote: > Hello, > I am ...

Re: [petsc-users] MKL Pardiso Solver Execution Step control

2018-06-05 Thread Matthew Overholt
On Tue, Jun 5, 2018 at 10:40 AM, Matthew Knepley wrote: > On Tue, Jun 5, 2018 at 10:04 AM, Matthew Overholt wrote: >> Barry, >> What I should have said was that I wanted to control when it does the "Analysis ...

Re: [petsc-users] MKL Pardiso Solver Execution Step control

2018-06-05 Thread Matthew Overholt
On Tue, Jun 5, 2018 at 1:48 PM, Smith, Barry F. wrote: > On Jun 5, 2018, at 4:08 PM, Matthew Overholt wrote: > Yes to Matthew - not repeating Phase 1: Fill-reduction analysis and symbolic factorization. Numerical factoring is required because the ...

Re: [petsc-users] MKL Pardiso Solver Execution Step control

2018-06-06 Thread Matthew Overholt
On Wed, Jun 6, 2018 at 6:30 AM, Smith, Barry F. wrote: > On Jun 5, 2018, at 7:36 PM, Matthew Overholt wrote: > On Tue, Jun 5, 2018 at 1:48 PM, Smith, Barry F. wrote: > On Jun 5, 2018, at 4: ...

[petsc-users] PC_SUBPC_ERROR with -pc_factor_zeropivot

2018-07-06 Thread Matthew Overholt
I am working on handling very small pivot values for a very small percentage of my matrix (linear Ax = b solution), and I am getting an error that I don't understand when I run the KSPCG solver in parallel.

    KSPCreate(comm, &ksp)
    KSPSetTolerances(ksp, rtol, ...)
    KSPSetType(ksp, KSPCG)
    ...

Re: [petsc-users] PC_SUBPC_ERROR with -pc_factor_zeropivot

2018-07-09 Thread Matthew Overholt
Great idea, thanks again! Matt... On Mon, Jul 9, 2018 at 11:08 AM, Matthew Knepley wrote: > On Mon, Jul 9, 2018 at 11:05 AM Matthew Overholt wrote: >> The -sub_pc_factor_zeropivot option works, but since this is for a commercial code I don't want users to have to ...
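
One way to spare end users the flag is to set it inside the code before KSPSetFromOptions() runs; a sketch, with an assumed tolerance value:

    /* hard-wire the option so users need not pass it on the command line */
    PetscOptionsSetValue(NULL, "-sub_pc_factor_zeropivot", "1e-20");
    KSPSetFromOptions(ksp);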

Re: [petsc-users] PC_SUBPC_ERROR with -pc_factor_zeropivot

2018-07-09 Thread Matthew Overholt
Thanks, Matt, that option works perfectly, and explains the -ksp_view output. Much appreciated! Matt... On Fri, Jul 6, 2018 at 6:25 PM, Matthew Knepley wrote: > Default Case (zeropivot is 2.22045E-14): >> mpiexec -n 1 ... >> ==> ksp fails due to pcReason = PC_FACTOR_NUMERIC_ZEROPIVOT ...

Re: [petsc-users] PC_SUBPC_ERROR with -pc_factor_zeropivot

2018-07-09 Thread Matthew Overholt
... to call PCBJacobiGetSubKSP() and then loop over every block, calling KSPGetPC() and PCFactorSetZeroPivot() on each block? That seems a little tedious, but if that is the correct approach I'll do it. Thanks, Matt Overholt On Mon, Jul 9, 2018 at 10:17 AM, Matthew Overholt wrote: > Thanks, M ...
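
A sketch of that per-block loop, assuming the PC is PCBJACOBI and noting that the sub-KSPs exist only after KSPSetUp(); the pivot tolerance is an assumed value:

    KSP      *subksp;
    PetscInt  nlocal, first;
    KSPSetUp(ksp);                                   /* sub-KSPs exist only after setup */
    PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);
    for (PetscInt i = 0; i < nlocal; i++) {
      PC subpc;
      KSPGetPC(subksp[i], &subpc);
      PCFactorSetZeroPivot(subpc, 1e-20);            /* assumed tolerance */
    }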

Re: [petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Matthew Overholt
Here is a sample makefile, like the one I use with the Intel MPI compilers (used during PETSc configuration) and the MKL library. Matt Overholt
-----
    # makefile for Linux using the Intel C++ Compiler, MKL & MPI libraries + OpenMP
    # usage: make
    #    or: make ...

Re: [petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Matthew Overholt
Correction:

    OBJFILES = \
        class1.o \
        class2.o \
        myprogram.o

Matt Overholt On Tue, Oct 23, 2018 at 2:29 PM Matthew Overholt wrote: > Here is a sample makefile, like the one I use with the Intel MPI compilers (used during PETSc configuration) and the MKL library. ...
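
Putting the two messages together, a minimal sketch of the kind of makefile described; the compiler name, flags, and paths are assumptions, not the poster's exact file (recipe lines must begin with a tab):

    OBJFILES = \
        class1.o \
        class2.o \
        myprogram.o

    CXX      = mpiicpc
    CXXFLAGS = -O2 -qopenmp -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include

    myprogram: ${OBJFILES}
    	${CXX} -o $@ ${OBJFILES} -L${PETSC_DIR}/${PETSC_ARCH}/lib -lpetsc -mkl

    %.o: %.cpp
    	${CXX} ${CXXFLAGS} -c $<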

Re: [petsc-users] Compile petsc using intel mpi

2018-11-29 Thread Matthew Overholt via petsc-users
Hi Edoardo, I also have the Intel Parallel Studio XE compilers and MPI installed, and I use them to build PETSc as follows.

    # Either add these to your .bashrc or run them on the command line before beginning the PETSc installation
    source /opt/intel/parallel_studio_xe_2018/bin/psxevars.sh intel64
    ...
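
After sourcing the Intel environment, a configure line along these lines is typical; all options shown are assumptions to adapt, not the poster's exact command:

    ./configure PETSC_ARCH=intel-opt \
      --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
      --with-blaslapack-dir=${MKLROOT} \
      --with-debugging=0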

[petsc-users] METIS double precision

2019-01-10 Thread Matthew Overholt via petsc-users
Hello, How does one configure the PETSc installation to download METIS and have it use REALTYPEWIDTH 64 (as defined in metis.h)? I am using: --with-64-bit-indices --download-metis=yes to get IDXTYPEWIDTH 64. If I were installing METIS independently I would set the following near the top of ...

Re: [petsc-users] METIS double precision

2019-01-10 Thread Matthew Overholt via petsc-users
... The attached patch provides a --download-metis-use-doubleprecision option > Satish > On Thu, 10 Jan 2019, Matthew Overholt via petsc-users wrote: > Hello, > How does one configure the PETSc installation to download METIS and have it use REALTYPEWIDTH ...
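
With that patch applied, the configure line would become something like the following (hedged; the option name is as Satish gives it above):

    ./configure --with-64-bit-indices --download-metis \
      --download-metis-use-doubleprecision=1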

Re: [petsc-users] Slow linear solver via MUMPS

2019-01-28 Thread Matthew Overholt via petsc-users
Hi Mohammad, We tried the same thing for our finite element heat transfer code, experimenting with both MUMPS and MKL's Cluster PARDISO for about a year, and we were very disappointed with how they scaled. Give the full PETSc PCG solver with the ILU(0) preconditioner a try (pure MPI, no hybrid ...
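
For reference, PETSc has no globally coupled parallel ILU; the usual way to get CG with ILU(0) in parallel is block Jacobi with an ILU(0) solve on each subdomain, e.g. with runtime options along these lines (level 0 is also the default):

    -ksp_type cg -pc_type bjacobi -sub_pc_type ilu -sub_pc_factor_levels 0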