Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Marius Buerkle
Hi Hong and Stefano, I just checked with the latest PETSc commit (v3.13-145-gf227361); there C2 maintains the local row count of A. For commit v3.12.2-537-g5f77d1e, which I tried initially, I get a different number of local rows for A. While I did not check it thoroughly, it seems to work with

Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Zhang, Hong via petsc-users
Marius, I'll test it tomorrow. Hong From: Marius Buerkle Sent: Sunday, April 19, 2020 7:47 PM To: Zhang, Hong Cc: Stefano Zampini ; PETSc users list Subject: Aw: Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX Hi Hong, Yes exactly, I would like C2 to

Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Marius Buerkle
Hi Hong, Yes, exactly, I would like C2 to maintain the local row count of A. Do you think this is possible? I tried to allocate the matrix beforehand with the correct local row count and use MAT_REUSE, but this gave a segmentation fault. Best Marius      Marius, C1 = Ampidense*Bmpiaij inherits the
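A minimal sketch of the usual reuse pattern, assuming the matrices keep their layouts between calls (variable names are illustrative, error checking omitted); in the common case C is obtained from a first call with MAT_INITIAL_MATRIX and only then reused:

    Mat A, B, C;
    /* ... create and assemble A and B with the desired local sizes ... */
    /* the first product creates C and fixes its parallel layout */
    MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);
    /* later, after the entries of A or B change but the layouts stay the same */
    MatMatMult(A, B, MAT_REUSE_MATRIX, PETSC_DEFAULT, &C);
    MatDestroy(&C);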

Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Zhang, Hong via petsc-users
Marius, C1 = Ampidense*Bmpiaij inherits the number of local rows from A and the number of local columns from B. C2 = Ampidense*Bmpidense is computed via the external package Elemental; PETSc does not dictate the parallel layout of C2 in the current PETSc/Elemental interface. I am not sure if we

Re: [petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Stefano Zampini
Matrix C should always inherit the number of local rows from A and the number of local columns from B. Is that not the case for your code? If so, please provide an MWE to reproduce it. Also, which version of PETSc are you using? On Sun, 19 Apr 2020, 19:21 Marius Buerkle wrote: > > Hi, > > I have a
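A minimal check along these lines (a sketch only; A, B, C are assumed to be assembled matrices and error checking is omitted) would compare the local layout of C against A and B:

    PetscInt am, bn, cm, cn;
    MatGetLocalSize(A, &am, NULL);   /* local rows of A */
    MatGetLocalSize(B, NULL, &bn);   /* local columns of B */
    MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);
    MatGetLocalSize(C, &cm, &cn);
    /* expectation stated above: cm == am and cn == bn */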

[petsc-users] MatMatMult with MAT_INITIAL_MATRIX

2020-04-19 Thread Marius Buerkle
Hi, I have a question about the behavior of MAT_INITIAL_MATRIX for MatMatMult. I have a set of MPIDENSE and MPIAIJ matrices for which I have defined the number of local rows (and local columns) manually, i.e. not used PETSC_DECIDE for MatSetSizes. The number of local rows is different
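For reference, setting local sizes explicitly instead of PETSC_DECIDE looks roughly like this (a sketch; mlocal and nlocal are placeholders for the manually chosen local sizes):

    Mat A;
    MatCreate(PETSC_COMM_WORLD, &A);
    /* explicit local row/column counts; global sizes derived from them */
    MatSetSizes(A, mlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);
    MatSetType(A, MATMPIDENSE);
    MatSetUp(A);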

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Mark Adams
Ahhh, thanks. OK, now I am able to reproduce the error in the test. I can work on that. Thanks again. On Sun, Apr 19, 2020 at 11:45 AM Satish Balay wrote: > > *[0]PETSC ERROR: Could not locate solver package superlu for > factorization > > Here you are requesting 'superlu' - instead of

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Satish Balay via petsc-users
> *[0]PETSC ERROR: Could not locate solver package superlu for factorization Here you are requesting 'superlu' - instead of 'superlu_dist' - hence this error. Satish On Sun, 19 Apr 2020, Mark Adams wrote: > > > > > > > > > > --download-superlu --download-superlu_dist > > > > You are

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Fande Kong
Hi Mark, This should help: -pc_factor_mat_solver_type superlu_dist Thanks, Fande > On Apr 19, 2020, at 9:41 AM, Mark Adams wrote: > >  >> >> >> > > --download-superlu --download-superlu_dist >> >> You are installing with both superlu and superlu_dist. To verify - remove >> superlu
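The same choice can also be made in code rather than on the command line (a sketch, assuming ksp is an existing KSP and an LU factorization is wanted; error checking omitted):

    PC pc;
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    /* select SuperLU_DIST as the factorization backend */
    PCFactorSetMatSolverType(pc, MATSOLVERSUPERLU_DIST);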

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Mark Adams
> > > > > > --download-superlu --download-superlu_dist > > You are installing with both superlu and superlu_dist. To verify - remove > superlu - and keep only superlu_dist > I tried this earlier. Here is the error message: 0 SNES Function norm 1.511918966798e-02 [0]PETSC ERROR:

Re: [petsc-users] Mumps giving huge volume of output

2020-04-19 Thread Satish Balay via petsc-users
This is coming from MUMPS. The PETSc configure option --with-debugging=0 does not control it. You might have to check which MUMPS option controls it - and perhaps use the runtime option -mat_mumps_icntl_(x) Satish On Sun, 19 Apr 2020, san.tempo...@gmail.com wrote: > I have just successfully
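For example, the print level is believed to be controlled by ICNTL(4) in MUMPS, so a runtime option along these lines should quiet it down (treat the exact control number as an assumption and check the MUMPS manual):

    -mat_mumps_icntl_4 0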

Re: [petsc-users] Ignoring PETSC_ARCH for make check?

2020-04-19 Thread Satish Balay via petsc-users
PETSc supports both in-place multiple builds and prefix builds. Most packages don't support in-place multiple builds. For in-place multiple builds you need the PETSC_ARCH concept, but not for prefix builds. E.g., 2 in-place builds: ./configure PETSC_ARCH=arch-build1 --with-cc=gcc etc.. make
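Spelled out a bit more (arch names and the install path here are only placeholders):

    # two in-place builds in the same source tree, selected via PETSC_ARCH
    ./configure PETSC_ARCH=arch-build1 --with-cc=gcc ...
    make PETSC_ARCH=arch-build1
    ./configure PETSC_ARCH=arch-build2 --with-debugging=0 ...
    make PETSC_ARCH=arch-build2

    # a prefix build installs into --prefix; PETSC_ARCH is not needed afterwards
    ./configure --prefix=/home/user/petsc-install ...
    make && make install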

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Satish Balay via petsc-users
On Sun, 19 Apr 2020, Mark Adams wrote: > On Sat, Apr 18, 2020 at 9:04 PM Xiaoye S. Li wrote: > > > That works, but your previous email showed the following: > > > > Ah, so PETSc must switch internally. I don't think so > > Is there any reason why we should not use superlu_dist all of the

[petsc-users] Mumps giving huge volume of output

2020-04-19 Thread san . temporal
I have just successfully compiled PETSc with ./configure --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries --with-packages-download-dir=/home/santiago/Documents/installers/petsc --download-fblaslapack --download-mumps

Re: [petsc-users] Ignoring PETSC_ARCH for make check?

2020-04-19 Thread san . temporal
OK, then the second option applies... I had forgotten about this observation from the multiple times I installed PETSc in the past. Then, two questions come to mind: 1. Why is it set up like that? 2. What is the difference in behaviour? I see the same output from both options. Thanks again!

Re: [petsc-users] SuperLU + GPUs

2020-04-19 Thread Mark Adams
On Sat, Apr 18, 2020 at 9:04 PM Xiaoye S. Li wrote: > That works, but your previous email showed the following: > Ah, so PETSc must switch internally. Is there any reason why we should not use superlu_dist all of the time? > > SuperLU: > Version: 5.2.1 > Includes: