[petsc-dev] SuperLU_dist 6.0.0 ?

2019-01-09 Thread Victor Eijkhout via petsc-dev
I cannot find my previous correspondence with you guys about this topic. Has SLU_DIST 6.0.0 been incorporated in the latest petsc? %% There is a new release of the SuperLU_DIST package, version 6.0.0 (released on Sept 23rd), which improves strong scaling in the triangular solve stage. %%
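If it has been picked up, one quick check is to look at the headers a --download-superlu_dist build installs; a minimal sketch, assuming the version macros sit in superlu_defs.h as in recent SuperLU_DIST releases:

    # check which SuperLU_DIST version the PETSc build installed
    grep 'SUPERLU_DIST_.*_VERSION' $PETSC_DIR/$PETSC_ARCH/include/superlu_defs.h
    # expect SUPERLU_DIST_MAJOR_VERSION 6 for the 6.0.0 release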

Re: [petsc-dev] Elemental & 64-bit int

2019-04-02 Thread Victor Eijkhout via petsc-dev
That seems to have fixed it. Thanks. Victor. > On Apr 1, 2019, at 8:02 PM, Balay, Satish wrote: > > On Tue, 2 Apr 2019, Victor Eijkhout via petsc-dev wrote: > >> Configuring with elemental & 64-bit integers: >> >> Cannot use elemental with 64 bit BLAS/Lapac

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-27 Thread Victor Eijkhout via petsc-dev
On Mar 27, 2019, at 7:29 AM, Mark Adams <mfad...@lbl.gov> wrote: How should he configure for this? Remove "--download-fblaslapack=1" and add: 1. If using gcc, module load mkl; with either compiler: export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT} 2. We define MPICH_HOME
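Spelled out, the recipe above might look like the following sketch (option spelling per the quoted text; MKLROOT is set by the mkl module):

    module load mkl
    export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT}
    # replaces --download-fblaslapack=1 in the configure line
    ./configure ${BLAS_LAPACK_LOAD} ...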

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-27 Thread Victor Eijkhout via petsc-dev
instead of clang++ On Wed, Mar 27, 2019 at 9:30 AM Matthew Knepley <knep...@gmail.com> wrote: On Wed, Mar 27, 2019 at 8:55 AM Victor Eijkhout via petsc-dev <petsc-dev@mcs.anl.gov> wrote: On Mar 27, 2019, at 7:29 AM, Mark Adams <mfad...@lbl.gov> wrote: Ho

[petsc-dev] Elemental & 64-bit int

2019-04-01 Thread Victor Eijkhout via petsc-dev
Configuring with elemental & 64-bit integers: "Cannot use elemental with 64 bit BLAS/Lapack indices". This used to work in 3.10. What changed? Victor.
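For reference, the failing combination looks roughly like this; a sketch, assuming the 64-bit BLAS/LAPACK indices come from the BLAS side rather than from --with-64-bit-indices itself:

    # 64-bit PetscInt (--with-64-bit-indices) is separate from 64-bit
    # BLAS/LAPACK indices; the quoted error fires on the latter
    ./configure --with-64-bit-indices --download-elemental \
                --with-64-bit-blas-indices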

Re: [petsc-dev] Elemental & 64-bit int

2019-04-01 Thread Victor Eijkhout via petsc-dev
> On Apr 1, 2019, at 8:02 PM, Balay, Satish wrote: > > On Tue, 2 Apr 2019, Victor Eijkhout via petsc-dev wrote: > >> Configuring with elemental & 64-bit integers: >> >> Cannot use elemental with 64 bit BLAS/Lapack indices >> >> This used to w

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-28 Thread Victor Eijkhout via petsc-dev
On Mar 27, 2019, at 8:30 AM, Matthew Knepley <knep...@gmail.com> wrote: I think Satish now prefers --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx --with-fc=${MPICH_HOME}/mpif90 That still requires with-mpi:
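Assembled into one invocation, the preferred form is a short sketch (paths exactly as in the quoted text, with the MPI wrappers directly under ${MPICH_HOME}):

    ./configure --with-cc=${MPICH_HOME}/mpicc \
                --with-cxx=${MPICH_HOME}/mpicxx \
                --with-fc=${MPICH_HOME}/mpif90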

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Victor Eijkhout via petsc-dev
On Mar 26, 2019, at 6:25 PM, Mark Adams via petsc-dev <petsc-dev@mcs.anl.gov> wrote: /home1/04906/bonnheim/olympus-keaveny/Olympus/olympus.petsc-3.9.3.skx-cxx-O on a skx-cxx-O named c478-062.stampede2.tacc.utexas.edu with 4800 processors,

[petsc-dev] HYPRE_LinSysCore.h

2019-01-29 Thread Victor Eijkhout via petsc-dev
I’ve been happily freeloading on the petsc installation in the sense that I claim to install things like hypre on our clusters by pointing into the petsc installation. Until of course someone needs a bit that does not get installed by petsc. In this case: HYPRE_LinSysCore.h Does the petsc
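One way to see which hypre pieces a --download-hypre build does ship, a hedged sketch assuming the usual prefix-install layout:

    ls $PETSC_DIR/$PETSC_ARCH/include | grep -i -e hypre -e fei
    # HYPRE_LinSysCore.h belongs to hypre's FEI component, so it is missing
    # when hypre was configured --without-fei (see the reply below)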

Re: [petsc-dev] HYPRE_LinSysCore.h

2019-01-29 Thread Victor Eijkhout via petsc-dev
On Jan 29, 2019, at 3:58 PM, Balay, Satish <ba...@mcs.anl.gov> wrote: -args.append('--without-fei') The late-1990s Finite Element Interface? I’ll enable it and see if anyone complains about it breaking whatever. Victor.
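The quoted args.append line lives in PETSc's package description for hypre; a sketch for locating it before flipping the flag (path per PETSc's BuildSystem layout):

    grep -n 'without-fei' $PETSC_DIR/config/BuildSystem/config/packages/hypre.py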

[petsc-dev] Structure of Elemental changed?

2019-04-10 Thread Victor Eijkhout via petsc-dev
This rpm was made with 3.10.3:
root@build-BLDCHROOT:SPECS # rpm -qlp ../RPMS/x86_64/tacc-petsc-intel18-impi18_0-package-3.10-4.el7.x86_64.rpm | grep libElSuite
/home1/apps/intel18/impi18_0/petsc/3.10/skylake-debug/lib/libElSuiteSparse.so
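To compare against a newer build, the same query can be pointed at both rpms; a sketch, with <petsc-rpm> standing in for the actual package files:

    # list the Elemental shared objects each rpm ships, to see whether the
    # library layout changed between the 3.10 build and the newer one
    rpm -qlp <petsc-rpm> | grep libEl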