Re: [petsc-users] Slow linear solver via MUMPS

2019-01-28 Thread Matthew Overholt via petsc-users
Hi Mohammad,

We tried the same thing for our finite element heat transfer code, and
experimented with both MUMPS and MKL's Cluster PARDISO for about a year,
and were very disappointed with how they scaled.

Give the full PETSc PCG solver with the ILU(0) preconditioner a try (pure
MPI, no hybrid MPI-OpenMP).  We found that it scales very well over two or
more nodes, and even though it is slower than MKL PARDISO on a single node,
its speedup is so much better over multiple MPI ranks that it quickly
overtakes the speed of the direct solvers.

ierr = KSPSetType(ksp, KSPCG);   // and stick with the default ILU preconditioner
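
For reference, here is a minimal sketch of the setup I mean (ksp, b, and x are
assumed to already exist and to be set up for your problem; error checking is
abbreviated):

ierr = KSPSetType(ksp, KSPCG); CHKERRQ(ierr);   // conjugate gradients
// Leave the preconditioner at its default: ILU(0) in serial, block Jacobi
// with ILU(0) on each subdomain in parallel; no PCSetType() call is needed.
ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);   // honor any -ksp_* options
ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

Run it with pure MPI (one rank per core, OMP_NUM_THREADS=1).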

The interconnect we've been using is 25 Gbps Ethernet, which is standard on
the AWS EC2 cloud.

Matt Overholt

On Fri, Jan 25, 2019 at 10:44 AM Mohammad Gohardoust via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hi,
>
> I am trying to modify a "pure MPI" code, which employs KSP iterative
> solvers, for solving the water movement equation in soils. On the HPC
> system where I am testing it, the code gets really slow as I increase the
> number of compute nodes (each node has 28 cores), even going from 1 to 2.
> So I went for a hybrid "MPI-OpenMP" solution such as MUMPS, which I set up
> inside PETSc by:
>
> KSPSetType(ksp, KSPPREONLY);
> PCSetType(pc, PCLU);
> PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
> KSPSolve(ksp, ...
>
> and I run it through:
>
> export OMP_NUM_THREADS=16 && mpirun -n 2 ~/Programs/my_programs
>
> The code is working (on my own PC) but it is far too slow (maybe about 50
> times slower). Since I am not an expert, I would like to know: is this what
> I should expect from MUMPS!?
>
> Thanks,
> Mohammad
>
>


Re: [petsc-users] METIS double precision

2019-01-10 Thread Matthew Overholt via petsc-users
Thank you, Satish!  That edit was easy to make, and the library now configures,
builds, and works perfectly!

Just a note for new users: the patch to metis.py does not replace the
original --download-metis= argument, but supplements it (i.e. use
both).  I made this mistake on my first try and METIS was not downloaded.
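
For example, my working invocation looks roughly like this (a sketch only; the
option name comes from Satish's patch, and the trailing =1 may not be needed):

./configure --with-64-bit-indices --download-metis=yes \
    --download-metis-use-doubleprecision=1 [other options as usual]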

Matt Overholt



On Thu, Jan 10, 2019 at 10:35 AM Balay, Satish  wrote:

> Well - this was the previous default - and was removed as it was deemed
> unnecessary.
>
>
> https://bitbucket.org/petsc/petsc/commits/2d4f01c230fe350f0ab5a28d1f5ef05ceab7ea3d
>
> The attached patch provides a --download-metis-use-doubleprecision option
>
> Satish
>
> On Thu, 10 Jan 2019, Matthew Overholt via petsc-users wrote:
>
> > Hello,
> >
> > How does one configure the PETSc installation to download METIS and have
> > it use REALTYPEWIDTH 64 (as defined in metis.h)?  I am using:
> >
> > --with-64-bit-indices --download-metis=yes
> >
> > to get IDXTYPEWIDTH 64.  If I were installing METIS independently I would
> > set the following near the top of metis.h:
> > #define METIS_USE_DOUBLEPRECISION
> >
> > The reason for this is I want to call METIS routines outside of PETSc and
> > prefer double precision.
> >
> > Thanks,
> > Matt Overholt
> > CapeSym, Inc.
> >
>


[petsc-users] METIS double precision

2019-01-10 Thread Matthew Overholt via petsc-users
Hello,

How does one configure the PETSc installation to download METIS and have it
use REALTYPEWIDTH 64 (as defined in metis.h)?  I am using:

--with-64-bit-indices --download-metis=yes

to get IDXTYPEWIDTH 64.  If I were installing METIS independently I would
set the following near the top of metis.h:
#define METIS_USE_DOUBLEPRECISION

The reason for this is I want to call METIS routines outside of PETSc and
prefer double precision.
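
(As an aside, a tiny standalone check like the following, which is only an
illustration and not part of the question, prints the widths a given metis.h
was built with:)

#include <stdio.h>
#include <metis.h>

int main(void)
{
    /* IDXTYPEWIDTH, REALTYPEWIDTH, and real_t all come from metis.h */
    printf("IDXTYPEWIDTH=%d REALTYPEWIDTH=%d sizeof(real_t)=%zu\n",
           IDXTYPEWIDTH, REALTYPEWIDTH, sizeof(real_t));
    return 0;
}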

Thanks,
Matt Overholt
CapeSym, Inc.


Re: [petsc-users] Compile petsc using intel mpi

2018-11-29 Thread Matthew Overholt via petsc-users
Hi Edoardo,

I also have the Intel Parallel Studio XE compilers and MPI installed, and I
use them to build PETSc as follows.

# Either add these to your .bashrc or run them on the command line before
beginning the PETSc installation
source /opt/intel/parallel_studio_xe_2018/bin/psxevars.sh intel64
export PETSC_DIR=/opt/petsc/petsc-3.10.2
export PETSC_ARCH=arch-intel-opt
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib
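
# Before configuring, a quick sanity check (my suggestion, adjust to your setup)
# that the Intel MPI compiler wrappers are actually on the PATH:
which mpiicc mpiicpc mpiifort
mpiicc -v    # should report the underlying Intel C compiler version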

# Next I create a configure script file in the PETSc directory:
# /opt/petsc/petsc-3.10.2/config-3.10.2opt
echo
echo "Optimized Configure with Intel 2018 compilers, MKL PARDISO-CPARDISO, and Intel MPI"
echo "No need to edit .bashrc except for PETSC_DIR and PETSC_ARCH"
echo "Run psxevars first"
echo
./configure PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiicpc \
  --with-fc=mpiifort --with-clanguage=cxx --with-debugging=0 \
  COPTFLAGS='-ipo -O3 -xHost' CXXOPTFLAGS='-ipo -O3 -xHost' FOPTFLAGS='-ipo -O3 -xHost' \
  --download-scalapack=yes --with-blas-lapack-dir=/opt/intel/mkl \
  --with-mkl_pardiso-dir=/opt/intel/mkl --with-mkl_cpardiso-dir=/opt/intel/mkl

# Then I run this configure script as a regular user. I make and check it
# as a regular user as well.
./config-3.10.2opt
make all
make check

# My suggestion would be to completely delete your failed build, extract
# fresh source files from the downloaded tarfile, then try the above.
Good luck,
Matt Overholt
CapeSym, Inc.
(508) 653-7100 x204
overh...@capesim.com


On Thu, Nov 29, 2018 at 8:19 AM Matthew Knepley via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> On Thu, Nov 29, 2018 at 8:17 AM Edoardo alinovi 
> wrote:
>
>> Ok, this makes sense, but the reason to use sudo was this one:
>>
>
> sudo rm RDict.log
>
>   Matt
>
>
>> [Errno 13] Permission denied: 'RDict.log'
>>   File "./config/configure.py", line 391, in petsc_configure
>> framework =
>> config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:],
>> loadArgDB = 0)
>>   File
>> "/home/edo/software/petsc_intel/config/BuildSystem/config/framework.py",
>> line 80, in __init__
>> argDB = RDict.RDict(load = loadArgDB)
>>   File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line
>> 90, in __init__
>> self.setupLogFile()
>>   File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line
>> 145, in setupLogFile
>> self.logFile = file(filename, 'a')
>>
>> --
>>
>> Edoardo Alinovi, Ph.D.
>>
>> DICCA, Scuola Politecnica,
>> Universita' degli Studi di Genova,
>> 1, via Montallegro,
>> 16145 Genova, Italy
>>
>>
>> Il giorno gio 29 nov 2018 alle ore 14:10 Matthew Knepley <
>> knep...@gmail.com> ha scritto:
>>
>>> On Thu, Nov 29, 2018 at 4:23 AM Edoardo alinovi <
>>> edoardo.alin...@gmail.com> wrote:
>>>
 Hello guys,

 thank you very much for your suggestions, and sorry for getting back to you
 late. Unfortunately, my attempts to compile with the Intel compilers have not
 been successful so far.

 Here my command:

>>>
>>> Do NOT sudo the configure. This is really dangerous, and as you see
>>> 'root' has a different path than you do. Run configure normally
>>> and make, and only use 'sudo' for 'make install'.
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>>
 sudo ./configure --prefix=/home/edo/software/petsc-3.10.2
 PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp
 --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3'
 --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no
 --download-fblaslapack=1 --download-superlu_dist --download-mumps
 --download-hypre --download-metis --download-parmetis

 The log file states that:

 -- LOG --

 TEST checkCCompiler from
 config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587)
 TESTING: checkCCompiler from
 config.setCompilers(config/BuildSystem/config/setCompilers.py:587)
   Locate a functional C compiler
 Checking for program /usr/sbin/mpiicc...not found
 Checking for program /usr/bin/mpiicc...not found
 Checking for program /sbin/mpiicc...not found
 Checking for program /bin/mpiicc...not found
 Checking for program
 /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found

 ***
  UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log
 for details):

 ---
 C compiler you provided with -with-cc=mpiicc does not work.

 

 It seems that PETSc is not able to find mpiicc. However, the path to the
 Intel compilers is well defined in my .bashrc, and I can easily compile a
 test file with those compilers.

 if I put the full path in --with-cc= ...  then I get:

 -- LOG