Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

2019-03-19 Thread Yuyun Yang via petsc-users
Sounds good, thanks for the advice!

-Original Message-
From: Jed Brown  
Sent: Tuesday, March 19, 2019 1:23 PM
To: Yuyun Yang ; zakaryah ; 
petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

Yuyun Yang via petsc-users  writes:

> It's simply for visualization purposes. I wasn't sure if HDF5 would perform 
> better than binary, and what specific functions are needed to load the PETSc 
> vectors/matrices, so wanted to ask for some advice here. Since Matt mentioned 
> it's not likely to be much faster than binary, I guess there is no need to 
> make the change?

There is no speed benefit.  The advantage of HDF5 is that it supports better 
metadata, including the data types and sizes.  The PETSc data format is quick, 
dirty, and fast.
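
For anyone looking for the concrete calls, here is a minimal sketch (not from the original exchange; the file names and the dataset name "myvec" are arbitrary, and the HDF5 part assumes PETSc was configured with HDF5 support) of writing the same Vec with both viewers:

  #include <petscvec.h>
  #include <petscviewerhdf5.h>

  int main(int argc,char **argv)
  {
    Vec         v;
    PetscViewer bin,h5;

    PetscInitialize(&argc,&argv,NULL,NULL);
    VecCreate(PETSC_COMM_WORLD,&v);
    VecSetSizes(v,PETSC_DECIDE,100);
    VecSetFromOptions(v);
    VecSet(v,1.0);

    /* PETSc binary: load in MATLAB with the PetscBinaryRead.m script shipped with PETSc */
    PetscViewerBinaryOpen(PETSC_COMM_WORLD,"v.bin",FILE_MODE_WRITE,&bin);
    VecView(v,bin);
    PetscViewerDestroy(&bin);

    /* HDF5: the object name becomes the dataset name, so MATLAB's
       h5read('v.h5','/myvec') should return the values directly */
    PetscObjectSetName((PetscObject)v,"myvec");
    PetscViewerHDF5Open(PETSC_COMM_WORLD,"v.h5",FILE_MODE_WRITE,&h5);
    VecView(v,h5);
    PetscViewerDestroy(&h5);

    VecDestroy(&v);
    PetscFinalize();
    return 0;
  }

(Error checking with CHKERRQ is omitted for brevity.)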



Re: [petsc-users] MatCompositeMerge + MatCreateRedundantMatrix

2019-03-19 Thread Marius Buerkle via petsc-users
If it is not too complicated it would be nice if MatCreateSubMatricesMPI would be available for other matrices than MPIAIJ.

On Mon, Mar 18, 2019 at 10:30 PM Marius Buerkle  wrote:

I have another question regarding MatCreateRedundantMatrix and MatCreateSubMatricesMPI. The former works for MPIAIJ and MPIDENSE and the latter only for MPIAIJ. Would it be possible to use MatCreateRedundantMatrix with a factored matrix

We usually do not have direct access to the data for factored matrices since it lives in the factorization package.

and MatCreateSubMatricesMPI with dense and/or elemental matrices?

This would not be hard, it would just take time. The dense case is easy. I think Elemental already has such an operation, but we would have to find it and call it.

  Thanks,

    Matt

Indeed, it was very easy to add. Are you going to include the Fortran interface for MatCreateSubMatricesMPI in future releases of PETSc?

Regarding my initial problem, thanks a lot. It works very well with MatCreateSubMatricesMPI and the solution can be implemented in a few lines.

Thanks and Best,

Marius

On Tue, Mar 12, 2019 at 4:50 AM Marius Buerkle  wrote:

I tried to follow your suggestions but it seems there is no MatCreateSubMatricesMPI for Fortran. Is this correct?

We just have to write the binding. It's almost identical to MatCreateSubMatrices() in src/mat/interface/ftn-custom/zmatrixf.c

   Matt

On Wed, Feb 20, 2019 at 6:57 PM Marius Buerkle  wrote:

ok, I think I understand now. I will give it a try and if there is some trouble come back to you. thanks.

Cool.

   Matt

marius

On Tue, Feb 19, 2019 at 8:42 PM Marius Buerkle  wrote:

ok, so it seems there is no straightforward way to transfer data between PETSc matrices on different subcomms. Probably doing it by "hand" (extracting the matrices on the subcomms, creating an MPI intercommunicator, transferring the data to PETSC_COMM_WORLD and assembling it in a new PETSc matrix) would be possible, right?

That sounds too complicated. Why not just reverse MatCreateSubMatricesMPI()? Meaning make it collective on the whole big communicator, so that you can swap out all the subcommunicators for the aggregation call, just like we do in that function.

Then it's really just a matter of reversing the communication call.

   Matt

On Tue, Feb 19, 2019 at 7:12 PM Marius Buerkle  wrote:

I see. This would work if the matrices are on different subcommunicators? Is it possible to add this functionality?

Hmm, no. That is specialized to serial matrices. You need the inverse of MatCreateSubMatricesMPI().

  Thanks,

     Matt

marius

You basically need the inverse of MatCreateSubMatrices(). I do not think we have that right now, but it could probably be done without too much trouble by looking at that code.

  Thanks,

     Matt

On Tue, Feb 19, 2019 at 6:15 AM Marius Buerkle via petsc-users  wrote:

Hi !

Is there some way to combine MatCompositeMerge with MatCreateRedundantMatrix? I basically want to create copies of a matrix from PETSC_COMM_WORLD on subcommunicators, do some work on each subcommunicator and then gather the results back to PETSC_COMM_WORLD, namely I want to sum the individual matrices from the subcommunicators component-wise and get the resulting matrix on PETSC_COMM_WORLD. Is this somehow possible without going through all the hassle of using MPI directly?
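
(A minimal sketch of the "copy out to the subcommunicators" half of this, using MatCreateRedundantMatrix; the function and variable names here are placeholders, and MPI_COMM_NULL is passed so that PETSc splits the communicator itself. The reverse direction, summing the copies back onto PETSC_COMM_WORLD, is what the rest of this thread discusses.)

  #include <petscmat.h>

  /* A lives on PETSC_COMM_WORLD; make nsub redundant copies, each on its own
     subcommunicator, and work on the local copy. */
  PetscErrorCode work_on_redundant_copies(Mat A, PetscInt nsub)
  {
    Mat            Ared;
    PetscErrorCode ierr;

    ierr = MatCreateRedundantMatrix(A,nsub,MPI_COMM_NULL,MAT_INITIAL_MATRIX,&Ared);CHKERRQ(ierr);

    /* ... do the per-subcommunicator work with Ared here ... */

    ierr = MatDestroy(&Ared);CHKERRQ(ierr);
    return 0;
  }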

 

marius

--

What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

2019-03-19 Thread Jed Brown via petsc-users
Yuyun Yang via petsc-users  writes:

> It's simply for visualization purposes. I wasn't sure if HDF5 would perform 
> better than binary, and what specific functions are needed to load the PETSc 
> vectors/matrices, so wanted to ask for some advice here. Since Matt mentioned 
> it's not likely to be much faster than binary, I guess there is no need to 
> make the change?

There is no speed benefit.  The advantage of HDF5 is that it supports
better metadata, including the data types and sizes.  The PETSc data
format is quick, dirty, and fast.


Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

2019-03-19 Thread Yuyun Yang via petsc-users
It's simply for visualization purposes. I wasn't sure if HDF5 would perform 
better than binary, and what specific functions are needed to load the PETSc 
vectors/matrices, so wanted to ask for some advice here. Since Matt mentioned 
it's not likely to be much faster than binary, I guess there is no need to make 
the change?

So running h5read will load the vector from the hdf5 file directly to a Matlab 
vector? And similarly so for matrices?

Thanks,
Yuyun


From: petsc-users  on behalf of zakaryah via 
petsc-users 
Sent: Tuesday, March 19, 2019 11:54:02 AM
To: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

Hi Yuyun,

I'm not sure exactly what you want to do but I use Matlab to work with and 
visualize HDF5 files from PETSc all the time.  Matlab has h5info and h5read 
routines, then I visualize with my own routines.  Is there something specific 
you need from Matlab?

On Tue, Mar 19, 2019 at 1:18 PM Yuyun Yang via petsc-users 
<petsc-users@mcs.anl.gov> wrote:
Got it, thanks!

From: Matthew Knepley <knep...@gmail.com>
Sent: Tuesday, March 19, 2019 10:10 AM
To: Yuyun Yang <yyan...@stanford.edu>
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

On Tue, Mar 19, 2019 at 11:58 AM Yuyun Yang via petsc-users 
<petsc-users@mcs.anl.gov> wrote:
Hello team,

Currently we’re using the PETSc binary file format to save Vecs and Mats and 
visualize them in Matlab. It looks like HDF5 works more efficiently with large 
data sets (faster I/O), and we’re wondering if PETSc Vecs/Mats saved in HDF5 
viewer can be visualized in Matlab as well?

We do not have code for that. I am using Paraview to look at HDF5 since 
everything I do is on 2D and 3D meshes. Note that HDF5 is not likely to have 
faster I/O than the PETSc binary.

  Thanks,

Matt


Thanks for your help,
Yuyun


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

2019-03-19 Thread Yuyun Yang via petsc-users
Got it, thanks!

From: Matthew Knepley 
Sent: Tuesday, March 19, 2019 10:10 AM
To: Yuyun Yang 
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

On Tue, Mar 19, 2019 at 11:58 AM Yuyun Yang via petsc-users 
<petsc-users@mcs.anl.gov> wrote:
Hello team,

Currently we’re using the PETSc binary file format to save Vecs and Mats and 
visualize them in Matlab. It looks like HDF5 works more efficiently with large 
data sets (faster I/O), and we’re wondering if PETSc Vecs/Mats saved in HDF5 
viewer can be visualized in Matlab as well?

We do not have code for that. I am using Paraview to look at HDF5 since 
everything I do is on 2D and 3D meshes. Note that HDF5 is not likely to have 
faster I/O than the PETSc binary.

  Thanks,

Matt


Thanks for your help,
Yuyun


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


[petsc-users] Saving Vecs/Mats in HDF5 and visualizing in Matlab

2019-03-19 Thread Yuyun Yang via petsc-users
Hello team,

Currently we're using the PETSc binary file format to save Vecs and Mats and 
visualize them in Matlab. It looks like HDF5 works more efficiently with large 
data sets (faster I/O), and we're wondering if PETSc Vecs/Mats saved in HDF5 
viewer can be visualized in Matlab as well?

Thanks for your help,
Yuyun


Re: [petsc-users] PCFieldSplit gives different results for direct and iterative solver

2019-03-19 Thread Mark Adams via petsc-users
>
>
>
>   -fieldsplit_velocity_ksp_type preonly -fieldsplit_velocity_pc_type gamg
>   -fieldsplit_pressure_ksp_type minres  -fieldsplit_pressure_pc_type none
>

You should use cg for the ksp_type with gamg if you are symmetric and gmres
if not (you can try cg even if it is mildly asymmetric).

minres is for indefinite symmetric, but you probably are SPD and should use
cg.


Re: [petsc-users] [petsc4py] DMPlexCreateFromDAG and other missing functions

2019-03-19 Thread Chris Finn via petsc-users
I am sorry for the bad formatting of my createFromDAG function. I attach this 
function, and I hope it will be easier to read.

Mar 19, 2019, 12:46 PM by finnkochin...@keemail.me:

> Thanks Matt,
> I tried your suggestions to implement DMPlexCreateFromDAG in petsc4py:
>
>
> 1. In petsc4py/src/PETSc/petscdmplex.pxi I uncommented:
>
>     int DMPlexCreateFromDAG(PetscDM,PetscInt,const_PetscInt[],const_PetscInt[],const_PetscInt[],const_PetscInt[],const_PetscScalar[])
>
> 2. In petsc4py/src/PETSc/DMPlex.pyx I added:
>
>     def createFromDAG(self, depth, numPoints, coneSize, cones, coneOrientations, vertexCoords):
>         cdef PetscInt cdepth = asInt(depth)
>         cdef const PetscInt *cnumPoints = NULL
>         cdef const PetscInt *cconeSize = NULL
>         cdef const PetscInt *ccones = NULL
>         cdef const PetscInt *cconeOrientations = NULL
>         cdef const PetscScalar *cvertexCoords = NULL
>         cnumPoints = <const PetscInt *> PyArray_DATA(numPoints)
>         cconeSize = <const PetscInt *> PyArray_DATA(coneSize)
>         ccones = <const PetscInt *> PyArray_DATA(cones)
>         cconeOrientations = <const PetscInt *> PyArray_DATA(coneOrientations)
>         cvertexCoords = <const PetscScalar *> PyArray_DATA(vertexCoords)
>         CHKERR( DMPlexCreateFromDAG(self.dm, cdepth, cnumPoints, cconeSize, ccones, cconeOrientations, cvertexCoords) )
>         return self
>
> I am testing this function using this snippet (following
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexCreateFromDAG.html ):
>
> import petsc4py
> import numpy as np
> import sys
> petsc4py.init(sys.argv)
> from petsc4py import PETSc
>
> dm=PETSc.DMPlex().create()
> dm.setType(PETSc.DM.Type.PLEX)
>
> numPoints=np.array([4,2])
> coneSize=np.array([3,3,0,0,0,0])
> cones=np.array([2,3,4, 3,5,4])
> coneOrientations=np.array([0,0,0, 0,0,0])
> vertexCoords=np.array([-1,0, 0,-1, 0,1, 1,0])
> depth=1
> dm.createFromDAG(depth,numPoints,coneSize,cones,coneOrientations,vertexCoords)
>
> It fails with output:
>
> Traceback (most recent call last):
>   File "test.py", line 16, in <module>
>     
> dm.createFromDAG(depth,numPoints,coneSize,cones,coneOrientations,vertexCoords)
>   File "PETSc/DMPlex.pyx", line 64, in petsc4py.PETSc.DMPlex.createFromDAG
> petsc4py.PETSc.Error: error code 63
> [0] DMPlexCreateFromDAG() line 1669 in 
> /build/petsc-vurd6G/petsc-3.7.7+dfsg1/src/dm/impls/plex/plexcreate.c
> [0] DMPlexSetCone() line 1066 in 
> /build/petsc-vurd6G/petsc-3.7.7+dfsg1/src/dm/impls/plex/plex.c
> [0] Argument out of range
> [0] Cone point 4 is not in the valid range [0, 4)
>
> Can someone spot the problem in my python wrapping attempts?
> I assume my test.py snippet should be fine. At least, the equivalent 
> C-snippet runs without problems:
>
> #include <petscdmplex.h>
> int main(int argc,char **argv){
>   PetscInitialize(&argc, &argv, NULL, NULL);
>   DM  dm;
>   int dim=2;
>   DMPlexCreate(PETSC_COMM_WORLD,&dm);
>   DMSetType(dm, DMPLEX);
>   DMSetDimension(dm,dim);
>   int depth=1;
>   int numPoints[]={4,2};
>   int coneSize[]={3,3,0,0,0,0};
>   int cones[]={2, 3, 4,  3, 5, 4};
>   int coneOrientations[]={0,0,0, 0,0,0};
>   double vertexCoords[]={-1,0, 0,-1, 0,1, 1,0};
>   DMPlexCreateFromDAG(dm, depth, numPoints, coneSize, cones, 
> coneOrientations,vertexCoords);
>   PetscFinalize();
> }
>
> regards
> Chris
>
> Mar 8, 2019, 6:38 PM by knep...@gmail.com:
>
>> On Fri, Mar 8, 2019 at 11:02 AM Chris Finn via petsc-users 
>> <petsc-users@mcs.anl.gov> wrote:
>>
>>> Dear petsc4py experts,
>>> I'd like to ask why several PETSc functions are not wrapped in petsc4py. 
>>> I'd need to use DMPlexCreateFromDAG from python. Could you explain with 
>>> this function as an example why there is no python wrapper available? Do I 
>>> have to expect severe difficulties when I try this myself - impossible data 
>>> structures, memory management or something else?
>>>
>>
>> Lisandro is the expert, but I will try answering. The main problem is just 
>> time. There is no documentation for contributing, but what I do
>> is copy a function that is pretty much like the one I want. So I think 
>> DMPlexCreateFromCellList() is wrapped, and it looks almost the same.
>>
>>   Thanks,
>>
>>     Matt
>>  
>>
>>> Then, if it was just lack of time that prevented these functions from being 
>>> 

Re: [petsc-users] SLEPc Build Error

2019-03-19 Thread Jose E. Roman via petsc-users
What is the output of 'make check' in $PETSC_DIR ?

> On 19 Mar 2019, at 13:02, Eda Oktay  wrote:
> 
> Without SLEPc, I configured PETSc successfully. Then I installed SLEPc with the 
> following steps:
> 
> export PETSC_ARCH=arch-linux2-c-debug
> export PETSC_DIR=/home/slurm_local/e200781/petsc-3.10.4
> export SLEPC_DIR=/home/slurm_local/e200781/slepc-3.10.2
> cd slepc-3.10.2
> ./configure
> 
> However, I still get the error:
> 
> Checking environment... done
> Checking PETSc installation... 
> ERROR: Unable to link with PETSc
> ERROR: See "arch-linux2-c-debug/lib/slepc/conf/configure.log" file for details
> 
> Where the configure.log is:
> 
> 
> Starting Configure Run at Tue Mar 19 14:57:21 2019
> Configure Options: 
> Working directory: /home/slurm_local/e200781/slepc-3.10.2
> Python version:
> 2.7.9 (default, Sep 25 2018, 20:42:16) 
> [GCC 4.9.2]
> make: /usr/bin/make
> PETSc source directory: /home/slurm_local/e200781/petsc-3.10.4
> PETSc install directory: 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> PETSc version: 3.10.4
> PETSc architecture: arch-linux2-c-debug
> SLEPc source directory: /home/slurm_local/e200781/slepc-3.10.2
> SLEPc version: 3.10.2
> 
> Checking PETSc installation...
> #include "petscsnes.h"
> int main() {
> Vec v; Mat m; KSP k;
> PetscInitializeNoArguments();
> VecCreate(PETSC_COMM_WORLD,&v);
> MatCreate(PETSC_COMM_WORLD,&m);
> KSPCreate(PETSC_COMM_WORLD,&k);
> return 0;
> }
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/bin/mpicc -o 
> checklink.o -c -fPIC  -Wall -Wwrite-strings -Wno-strict-aliasing 
> -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g3   
> -I/home/slurm_local/e200781/petsc-3.10.4/include 
> -I/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/include
> `pwd`/checklink.c
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/bin/mpicc -fPIC  
> -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas 
> -fstack-protector -fvisibility=hidden -g3  -o checklink checklink.o  
> -Wl,-rpath,/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -L/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -Wl,-rpath,/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -L/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 
> -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu 
> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu 
> -L/lib/x86_64-linux-gnu -lpetsc -lopenblas -lparmetis -lmetis -lm -lX11 
> -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi 
> -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
> /usr/bin/ld: warning: libmpi.so.0, needed by 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libopenblas.so,
>  may conflict with libmpi.so.40
> /usr/bin/ld: warning: libmpi.so.0, needed by 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libopenblas.so,
>  may conflict with libmpi.so.40
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libpetsc.so: 
> undefined reference to `MatPartitioningParmetisSetRepartition'
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libpetsc.so: 
> undefined reference to `MatPartitioningCreate_Parmetis'
> collect2: error: ld returned 1 exit status
> makefile:2: recipe for target 'checklink' failed
> make: *** [checklink] Error 1
> 
> ERROR: Unable to link with PETSc
> 
> Jose E. Roman , Tue 19 Mar 2019, 14:08, wrote:
> There seems to be a link problem with PETSc.
> Suggest re-configuring without the option --download-slepc
> Then, after building PETSc, try 'make check' to make sure that PETSc is built 
> correctly. Then install SLEPc afterwards.
> Jose
> 
> 
> > On 19 Mar 2019, at 11:58, Eda Oktay  wrote:
> > 
> > 
> > Starting Configure Run at Tue Mar 19 11:53:05 2019
> > Configure Options: 
> > --prefix=/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> > Working directory: 
> > /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/externalpackages/git.slepc
> > Python version:
> > 2.7.9 (default, Sep 25 2018, 20:42:16) 
> > [GCC 4.9.2]
> > make: /usr/bin/make
> > PETSc source directory: /home/slurm_local/e200781/petsc-3.10.4
> > PETSc install directory: 
> > /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> > PETSc version: 3.10.4
> > PETSc architecture: arch-linux2-c-debug
> > SLEPc source directory: 
> > /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/externalpackages/git.slepc
> > SLEPc install directory: 
> > /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> > SLEPc version: 3.10.1
> > 

Re: [petsc-users] Confusing Schur preconditioner behaviour

2019-03-19 Thread Dave May via petsc-users
Hi Colin,

On Tue, 19 Mar 2019 at 09:33, Cotter, Colin J 
wrote:

> Hi Dave,
>
> >If you are doing that, then you need to tell fieldsplit to use the Amat
> to define the splits otherwise it will define the Schur complement as
> >S = B22 - B21 inv(B11) B12
> >preconditioned with B22, whereas what you want is
> >S = A22 - A21 inv(A11) A12
> >preconditioned with B22.
>
> >If your operators are set up this way and you didn't indicate to use Amat
> to define S this would definitely explain why preonly works but iterating
> on Schur does not.
>
> Yes, thanks - this solves it! I need pc_use_amat.
>

Okay great. But doesn't that option eradicate your custom Schur complement
object which you inserted into the Bmat in the (2,2) slot?

I thought you would use the option
-pc_fieldsplit_diag_use_amat

In general for fieldsplit (Schur) I found that the best way to manage user
defined Schur complement preconditioners is via PCFieldSplitSetSchurPre().

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre
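
(For reference, a minimal sketch of that call sequence; ksp and Sp are placeholders for your solver and your Schur-complement approximation.)

  #include <petscksp.h>

  static PetscErrorCode set_user_schur_pre(KSP ksp, Mat Sp)
  {
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
    ierr = PCFieldSplitSetType(pc,PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
    /* use Sp to precondition the Schur complement instead of placing it in the Bmat */
    ierr = PCFieldSplitSetSchurPre(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,Sp);CHKERRQ(ierr);
    return 0;
  }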

Also, for solver debugging purposes with fieldsplit and MatNest, I find it
incredibly useful to attach textual names to all the matrices going into
FieldSplit. You can use PetscObjectSetName() with each of your sub-matrices
in the Amat and the Bmat, and any Schur complement operators. The textual
names will be displayed in KSP view. In that way you have a better chance
of understanding which operators are being used where. (Note that this
trick is less useful when the Amat and Bmat are AIJ matrices.)
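
(A minimal sketch of the naming; Auu, Aup, Apu, App and Sp are placeholders for whatever block matrices you assembled.)

  #include <petscmat.h>

  static PetscErrorCode name_blocks(Mat Auu, Mat Aup, Mat Apu, Mat App, Mat Sp)
  {
    PetscErrorCode ierr;

    /* the given names show up in -ksp_view output, as in the example below */
    ierr = PetscObjectSetName((PetscObject)Auu,"Auu");CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)Aup,"Aup");CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)Apu,"Apu");CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)App,"App");CHKERRQ(ierr);
    ierr = PetscObjectSetName((PetscObject)Sp,"S*");CHKERRQ(ierr);
    return 0;
  }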

Below is an example KSPView associated with a 2x2 block system where I've
attached the names Auu, Aup, Apu, App, and S* to the Amat sub-matrices and the
Schur complement preconditioner.

PC Object:(dcy_) 1 MPI processes
  type: fieldsplit
FieldSplit with Schur preconditioner, factorization FULL
Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
Split info:
Split number 0 Defined by IS
Split number 1 Defined by IS
KSP solver for A00 block
  KSP Object:  (dcy_fieldsplit_u_)   1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using NONE norm type for convergence test
  PC Object:  (dcy_fieldsplit_u_)   1 MPI processes
type: lu
  LU: out-of-place factorization
  tolerance for zero pivot 2.22045e-14
  matrix ordering: nd
  factor fill ratio given 0., needed 0.
Factored matrix follows:
  Mat Object:   1 MPI processes
type: seqaij
rows=85728, cols=85728
package used to perform factorization: umfpack
total: nonzeros=0, allocated nonzeros=0
total number of mallocs used during MatSetValues calls =0
  not using I-node routines
  UMFPACK run parameters:
Control[UMFPACK_PRL]: 1.
Control[UMFPACK_STRATEGY]: 0.
Control[UMFPACK_DENSE_COL]: 0.2
Control[UMFPACK_DENSE_ROW]: 0.2
Control[UMFPACK_AMD_DENSE]: 10.
Control[UMFPACK_BLOCK_SIZE]: 32.
Control[UMFPACK_FIXQ]: 0.
Control[UMFPACK_AGGRESSIVE]: 1.
Control[UMFPACK_PIVOT_TOLERANCE]: 0.1
Control[UMFPACK_SYM_PIVOT_TOLERANCE]: 0.001
Control[UMFPACK_SCALE]: 1.
Control[UMFPACK_ALLOC_INIT]: 0.7
Control[UMFPACK_DROPTOL]: 0.
Control[UMFPACK_IRSTEP]: 0.
Control[UMFPACK_ORDERING]: AMD (not using the PETSc ordering)
linear system matrix = precond matrix:
Mat Object:Auu(dcy_fieldsplit_u_) 1 MPI processes
  type: seqaij
  rows=85728, cols=85728
  total: nonzeros=1028736, allocated nonzeros=1028736
  total number of mallocs used during MatSetValues calls =0
using I-node routines: found 21432 nodes, limit used is 5
KSP solver for S = A11 - A10 inv(A00) A01
  KSP Object:  (dcy_fieldsplit_p_)   1 MPI processes
type: fgmres
  GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
  GMRES: happy breakdown tolerance 1e-30
maximum iterations=300, initial guess is zero
tolerances:  relative=0.01, absolute=1e-50, divergence=1.
right preconditioning
using UNPRECONDITIONED norm type for convergence test
  PC Object:  (dcy_fieldsplit_p_)   1 MPI processes
type: lu
  LU: out-of-place 

Re: [petsc-users] SLEPc Build Error

2019-03-19 Thread Jose E. Roman via petsc-users
There seems to be a link problem with PETSc.
Suggest re-configuring without the option --download-slepc
Then, after building PETSc, try 'make check' to make sure that PETSc is built 
correctly. Then install SLEPc afterwards.
Jose


> On 19 Mar 2019, at 11:58, Eda Oktay  wrote:
> 
> 
> Starting Configure Run at Tue Mar 19 11:53:05 2019
> Configure Options: 
> --prefix=/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> Working directory: 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/externalpackages/git.slepc
> Python version:
> 2.7.9 (default, Sep 25 2018, 20:42:16) 
> [GCC 4.9.2]
> make: /usr/bin/make
> PETSc source directory: /home/slurm_local/e200781/petsc-3.10.4
> PETSc install directory: 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> PETSc version: 3.10.4
> PETSc architecture: arch-linux2-c-debug
> SLEPc source directory: 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/externalpackages/git.slepc
> SLEPc install directory: 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug
> SLEPc version: 3.10.1
> 
> Checking PETSc installation...
> #include "petscsnes.h"
> int main() {
> Vec v; Mat m; KSP k;
> PetscInitializeNoArguments();
> VecCreate(PETSC_COMM_WORLD,&v);
> MatCreate(PETSC_COMM_WORLD,&m);
> KSPCreate(PETSC_COMM_WORLD,&k);
> return 0;
> }
> make[2]: Entering directory '/tmp/slepc-2F1MtJ'
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/bin/mpicc -o 
> checklink.o -c -fPIC  -Wall -Wwrite-strings -Wno-strict-aliasing 
> -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g3   
> -I/home/slurm_local/e200781/petsc-3.10.4/include 
> -I/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/include
> `pwd`/checklink.c
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/bin/mpicc -fPIC  
> -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas 
> -fstack-protector -fvisibility=hidden -g3  -o checklink checklink.o  
> -Wl,-rpath,/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -L/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -Wl,-rpath,/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -L/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib 
> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 
> -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu 
> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu 
> -L/lib/x86_64-linux-gnu -lpetsc -lopenblas -lparmetis -lmetis -lm -lX11 
> -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi 
> -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
> /usr/bin/ld: warning: libmpi.so.0, needed by 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libopenblas.so,
>  may conflict with libmpi.so.40
> /usr/bin/ld: warning: libmpi.so.0, needed by 
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libopenblas.so,
>  may conflict with libmpi.so.40
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libpetsc.so: 
> undefined reference to `MatPartitioningParmetisSetRepartition'
> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/libpetsc.so: 
> undefined reference to `MatPartitioningCreate_Parmetis'
> collect2: error: ld returned 1 exit status
> makefile:2: recipe for target 'checklink' failed
> make[2]: *** [checklink] Error 1
> make[2]: Leaving directory '/tmp/slepc-2F1MtJ'
> 
> ERROR: Unable to link with PETSc
> 
> 
> Jose E. Roman , Tue 19 Mar 2019, 13:49, wrote:
> Correction: the correct path is 
> $PETSC_DIR/$PETSC_ARCH/externalpackages/git.slepc/$PETSC_ARCH/lib/slepc/conf/configure.log
> 
> 
> 
> > On 19 Mar 2019, at 11:46, Jose E. Roman via petsc-users 
> >  wrote:
> > 
> > And what is in $SLEPC_DIR/arch-linux2-c-debug/lib/slepc/conf/configure.log ?
> > Jose
> > 
> >> On 19 Mar 2019, at 11:41, Eda Oktay via petsc-users 
> >>  wrote:
> >> 
> >> This is slepc.log:
> >> 
> >> Checking environment... done
> >> Checking PETSc installation... 
> >> ERROR: Unable to link with PETSc
> >> ERROR: See "arch-linux2-c-debug/lib/slepc/conf/configure.log" file for 
> >> details
> >> 
> >> 
> >> 
> >> Matthew Knepley , Tue 19 Mar 2019, 13:36, wrote:
> >> On Tue, Mar 19, 2019 at 6:31 AM Eda Oktay via petsc-users 
> >>  wrote:
> >> Hello,
> >> 
> >> I am trying to install PETSc with following configure options:
> >> 
> >> ./configure --download-openmpi --download-openblas --download-slepc 
> >> --download-cmake --download-metis --download-parmetis
> >> 
> >> Compilation is done but after the following command, I got an error:
> >> 
> >> make PETSC_DIR=/home/slurm_local/e200781/petsc-3.10.4 
> >> PETSC_ARCH=arch-linux2-c-debug all
> >> 
> >> *** Building slepc ***
> >> 

Re: [petsc-users] PCFieldSplit gives different results for direct and iterative solver

2019-03-19 Thread Y. Shidi via petsc-users

Hello Barry,

Thank you for your reply.

I reduced the tolerances and got the desired solution.

I am solving multiphase incompressible N-S problems and currently
we are using an augmented Lagrangian technique with Uzawa iteration.
Because the problems are getting larger, we are also looking for some
other methods for solving the linear system.
I follow the PCFieldSplit tutorial from:
https://www.mcs.anl.gov/petsc/documentation/tutorials/MSITutorial.pdf

However, it takes about 10s to finish one iteration and overall
it requires about 150s to complete one time step with 100k unknowns,
which is a long time compared to our current solver (10s for one
time step).

I tried the following options:
1).
-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type lower
 -fieldsplit_velocity_ksp_type preonly -fieldsplit_velocity_pc_type gamg
 -fieldsplit_pressure_ksp_type minres  -fieldsplit_pressure_pc_type none
2).
-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type diag
 -fieldsplit_velocity_ksp_type preonly -fieldsplit_velocity_pc_type gamg
 -fieldsplit_pressure_ksp_type minres  -fieldsplit_pressure_pc_type none
3).
-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
 -fieldsplit_velocity_ksp_type preonly -fieldsplit_velocity_pc_type lu
 -fieldsplit_pressure_ksp_rtol 1e-10 -fieldsplit_pressure_pc_type jacobi

So I am wondering if there are any other options that can help improve
the PCFieldSplit performance.

Kind Regards,
Shidi


On 2019-03-17 00:05, Smith, Barry F. wrote:
On Mar 16, 2019, at 6:50 PM, Y. Shidi via petsc-users 
 wrote:


Hello,

I am trying to solve the incompressible n-s equations by
PCFieldSplit.

The large matrix and vectors are formed by MatCreateNest()
and VecCreateNest().
The system is solved directly by the following command:
   -ksp_type fgmres \
   -pc_type fieldsplit \
   -pc_fieldsplit_type schur \
   -pc_fieldsplit_schur_fact_type full \
   -ksp_converged_reason \
   -ksp_monitor_true_residual \
   -fieldsplit_0_ksp_type preonly \
   -fieldsplit_0_pc_type cholesky \
   -fieldsplit_0_pc_factor_mat_solver_package mumps \
   -mat_mumps_icntl_28 2 \
   -mat_mumps_icntl_29 2 \
   -fieldsplit_1_ksp_type preonly \
   -fieldsplit_1_pc_type jacobi \
Output:
 0 KSP unpreconditioned resid norm 1.214252932161e+04 true resid norm 
1.214252932161e+04 ||r(i)||/||b|| 1.e+00
 1 KSP unpreconditioned resid norm 1.642782495109e-02 true resid norm 
1.642782495109e-02 ||r(i)||/||b|| 1.352916226594e-06

Linear solve converged due to CONVERGED_RTOL iterations 1

The system is solved iteratively by the following command:
   -ksp_type fgmres \
   -pc_type fieldsplit \
   -pc_fieldsplit_type schur \
   -pc_fieldsplit_schur_factorization_type diag \
   -ksp_converged_reason \
   -ksp_monitor_true_residual \
   -fieldsplit_0_ksp_type preonly \
   -fieldsplit_0_pc_type gamg \
   -fieldsplit_1_ksp_type minres \
   -fieldsplit_1_pc_type none \
Output:
 0 KSP unpreconditioned resid norm 1.214252932161e+04 true resid norm 
1.214252932161e+04 ||r(i)||/||b|| 1.e+00
 1 KSP unpreconditioned resid norm 2.184037364915e+02 true resid norm 
2.184037364915e+02 ||r(i)||/||b|| 1.798667565109e-02
 2 KSP unpreconditioned resid norm 2.120097409539e+02 true resid norm 
2.120097409635e+02 ||r(i)||/||b|| 1.746009709742e-02
 3 KSP unpreconditioned resid norm 4.364091658268e+01 true resid norm 
4.364091658575e+01 ||r(i)||/||b|| 3.594054865332e-03
 4 KSP unpreconditioned resid norm 2.632671796885e+00 true resid norm 
2.632671797020e+00 ||r(i)||/||b|| 2.168141189773e-04
 5 KSP unpreconditioned resid norm 2.209213998004e+00 true resid norm 
2.209213980361e+00 ||r(i)||/||b|| 1.819401808180e-04
 6 KSP unpreconditioned resid norm 4.683775185840e-01 true resid norm 
4.683775085753e-01 ||r(i)||/||b|| 3.857330677735e-05
 7 KSP unpreconditioned resid norm 3.042503284736e-02 true resid norm 
3.042503349258e-02 ||r(i)||/||b|| 2.505658638883e-06



Both methods give answers, but they are different


   What do you mean the answers are different? Do you mean the
solution x from KSPSolve() is different? How are you calculating their
difference and how different are they?

Since the solutions are only approximate (the true residual norms are
around 1.642782495109e-02 and 3.042503349258e-02 for the two
different solvers), there will only be a certain number of identical
digits in the two solutions (which depends on the condition number of
the original matrix). You can run both solvers with -ksp_rtol 1.e-12
and then (assuming everything is working correctly) the two solutions
will be much closer to each other.
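
(If it helps, a small sketch of one way to quantify "how different"; x_direct and x_iter are placeholders for the two solution vectors.)

  #include <petscvec.h>

  static PetscErrorCode compare_solutions(Vec x_direct, Vec x_iter)
  {
    Vec            d;
    PetscReal      nd,nx;
    PetscErrorCode ierr;

    ierr = VecDuplicate(x_direct,&d);CHKERRQ(ierr);
    ierr = VecCopy(x_direct,d);CHKERRQ(ierr);
    ierr = VecAXPY(d,-1.0,x_iter);CHKERRQ(ierr);      /* d = x_direct - x_iter */
    ierr = VecNorm(d,NORM_2,&nd);CHKERRQ(ierr);
    ierr = VecNorm(x_direct,NORM_2,&nx);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"relative difference %g\n",(double)(nd/nx));CHKERRQ(ierr);
    ierr = VecDestroy(&d);CHKERRQ(ierr);
    return 0;
  }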

   Barry


so I am wondering
if it is possible that you can help me figure out which part I am
doing wrong.

Thank you for your time.

Kind Regards,
Shidi


Re: [petsc-users] SLEPc Build Error

2019-03-19 Thread Jose E. Roman via petsc-users
Correction: the correct path is 
$PETSC_DIR/$PETSC_ARCH/externalpackages/git.slepc/$PETSC_ARCH/lib/slepc/conf/configure.log



> On 19 Mar 2019, at 11:46, Jose E. Roman via petsc-users 
>  wrote:
> 
> And what is in $SLEPC_DIR/arch-linux2-c-debug/lib/slepc/conf/configure.log ?
> Jose
> 
>> On 19 Mar 2019, at 11:41, Eda Oktay via petsc-users 
>>  wrote:
>> 
>> This is slepc.log:
>> 
>> Checking environment... done
>> Checking PETSc installation... 
>> ERROR: Unable to link with PETSc
>> ERROR: See "arch-linux2-c-debug/lib/slepc/conf/configure.log" file for 
>> details
>> 
>> 
>> 
>> Matthew Knepley , Tue 19 Mar 2019, 13:36, wrote:
>> On Tue, Mar 19, 2019 at 6:31 AM Eda Oktay via petsc-users 
>>  wrote:
>> Hello,
>> 
>> I am trying to install PETSc with following configure options:
>> 
>> ./configure --download-openmpi --download-openblas --download-slepc 
>> --download-cmake --download-metis --download-parmetis
>> 
>> Compilation is done but after the following command, I got an error:
>> 
>> make PETSC_DIR=/home/slurm_local/e200781/petsc-3.10.4 
>> PETSC_ARCH=arch-linux2-c-debug all
>> 
>> *** Building slepc ***
>> **ERROR*
>> Error building slepc. Check arch-linux2-c-debug/lib/petsc/conf/slepc.log
>> 
>> We need slepc.log
>> 
>>  Thanks,
>> 
>> Matt
>> 
>> 
>> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/petsc/conf/petscrules:46:
>>  recipe for target 'slepcbuild' failed
>> make[1]: *** [slepcbuild] Error 1
>> make[1]: Leaving directory '/home/slurm_local/e200781/petsc-3.10.4'
>> **ERROR*
>>  Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
>>  Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to 
>> petsc-ma...@mcs.anl.gov
>> 
>> makefile:30: recipe for target 'all' failed
>> make: *** [all] Error 1
>> 
>> How can I fix the problem?
>> 
>> Thank you,
>> 
>> Eda
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments 
>> is infinitely more interesting than any results to which their experiments 
>> lead.
>> -- Norbert Wiener
>> 
>> https://www.cse.buffalo.edu/~knepley/
> 



Re: [petsc-users] SLEPc Build Error

2019-03-19 Thread Eda Oktay via petsc-users
This is slepc.log:

Checking environment... done
Checking PETSc installation...
ERROR: Unable to link with PETSc
ERROR: See "arch-linux2-c-debug/lib/slepc/conf/configure.log" file for
details



Matthew Knepley , Tue 19 Mar 2019, 13:36, wrote:

> On Tue, Mar 19, 2019 at 6:31 AM Eda Oktay via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hello,
>>
>> I am trying to install PETSc with following configure options:
>>
>> ./configure --download-openmpi --download-openblas --download-slepc
>> --download-cmake --download-metis --download-parmetis
>>
>> Compilation is done but after the following command, I got an error:
>>
>> make PETSC_DIR=/home/slurm_local/e200781/petsc-3.10.4
>> PETSC_ARCH=arch-linux2-c-debug all
>>
>> *** Building slepc ***
>> **ERROR*
>> Error building slepc. Check arch-linux2-c-debug/lib/petsc/conf/slepc.log
>>
>
> We need slepc.log
>
>   Thanks,
>
>  Matt
>
>
>> 
>> /home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/petsc/conf/petscrules:46:
>> recipe for target 'slepcbuild' failed
>> make[1]: *** [slepcbuild] Error 1
>> make[1]: Leaving directory '/home/slurm_local/e200781/petsc-3.10.4'
>> **ERROR*
>>   Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
>>   Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to
>> petsc-ma...@mcs.anl.gov
>> 
>> makefile:30: recipe for target 'all' failed
>> make: *** [all] Error 1
>>
>> How can I fix the problem?
>>
>> Thank you,
>>
>> Eda
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


[petsc-users] SLEPc Build Error

2019-03-19 Thread Eda Oktay via petsc-users
Hello,

I am trying to install PETSc with following configure options:

./configure --download-openmpi --download-openblas --download-slepc
--download-cmake --download-metis --download-parmetis

Compilation is done but after the following command, I got an error:

make PETSC_DIR=/home/slurm_local/e200781/petsc-3.10.4
PETSC_ARCH=arch-linux2-c-debug all

*** Building slepc ***
**ERROR*
Error building slepc. Check arch-linux2-c-debug/lib/petsc/conf/slepc.log

/home/slurm_local/e200781/petsc-3.10.4/arch-linux2-c-debug/lib/petsc/conf/petscrules:46:
recipe for target 'slepcbuild' failed
make[1]: *** [slepcbuild] Error 1
make[1]: Leaving directory '/home/slurm_local/e200781/petsc-3.10.4'
**ERROR*
  Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
  Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to
petsc-ma...@mcs.anl.gov

makefile:30: recipe for target 'all' failed
make: *** [all] Error 1

How can I fix the problem?

Thank you,

Eda


Re: [petsc-users] Confusing Schur preconditioner behaviour

2019-03-19 Thread Cotter, Colin J via petsc-users
Hi Dave,

>If you are doing that, then you need to tell fieldsplit to use the Amat to 
>define the splits otherwise it will define the Schur complement as
>S = B22 - B21 inv(B11) B12
>preconditioned with B22, whereas what you want is
>S = A22 - A21 inv(A11) A12
>preconditioned with B22.

>If your operators are set up this way and you didn't indicate to use Amat to 
>define S this would definitely explain why preonly works but iterating on 
>Schur does not.

Yes, thanks - this solves it! I need pc_use_amat.
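
(A minimal sketch of the setup being discussed; ksp, Amat and Pmat are placeholders. Amat carries the true blocks and Pmat carries the preconditioning blocks with the custom Schur approximation in its (2,2) slot; PCSetUseAmat is the API equivalent of -pc_use_amat.)

  #include <petscksp.h>

  static PetscErrorCode use_amat_for_splits(KSP ksp, Mat Amat, Mat Pmat)
  {
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPSetOperators(ksp,Amat,Pmat);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetUseAmat(pc,PETSC_TRUE);CHKERRQ(ierr);  /* same effect as -pc_use_amat */
    return 0;
  }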

all the best
--Colin