I originally solved that example problem using LU. But when I solve this one:

http://fenicsproject.org/documentation/dolfin/1.5.0/python/demo/documented/stokes-iterative/python/documentation.html

By simply running their code as-is for TH and adding the one line I
mentioned for MTH, I get the following outputs (the TH run first, then the
MTH run) when I pass in -ksp_monitor, -ksp_view, and -log_summary.
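
For reference, the spaces involved look roughly like this (a sketch of the
demo's Taylor-Hood setup from memory; the MTH line is my own one-line
modification, not part of the demo):

    from dolfin import *

    mesh = UnitCubeMesh(16, 16, 16)

    # Taylor-Hood (TH): P2 velocity, P1 pressure, as in the demo
    V = VectorFunctionSpace(mesh, "CG", 2)
    Q = FunctionSpace(mesh, "CG", 1)

    # Modified Taylor-Hood (MTH): enrich the pressure space with
    # elementwise constants (DG0) -- the one-line change I mentioned
    # Q = FunctionSpace(mesh, "CG", 1) + FunctionSpace(mesh, "DG", 0)

    W = V * Q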

The latter obviously takes far more time and many more iterations to
converge, and it was still using the solver and preconditioner options that
were originally designed for P2/P1. I haven't fully experimented with this
yet.
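
Both runs used the demo's solver configuration for P2/P1, which, per the
-ksp_view output below, is MINRES preconditioned with Hypre BoomerAMG
applied to a separate preconditioner matrix. Roughly (a sketch from memory
of the demo; A is the assembled Stokes system matrix and P is assembled
from a velocity-Laplacian plus pressure-mass form):

    # MINRES with AMG (BoomerAMG through PETSc's Hypre interface)
    solver = KrylovSolver("minres", "amg")
    # A: system matrix, P: preconditioner matrix (hence the two matrices
    # reported in the -ksp_view output below)
    solver.set_operators(A, P)
    solver.solve(U.vector(), bb)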

Thanks,
Justin

On Tue, Jun 2, 2015 at 1:19 PM, Jed Brown <[email protected]> wrote:

> Lawrence Mitchell <[email protected]> writes:
> > Maybe Justin can chime in here, I don't know, I just happened to know
> > how the fenics implementation produces the "basis", so proffered that.
>
> Thanks, Lawrence.  Unfortunately, my original questions remain
> unanswered and now I'm doubly curious why FEniCS appears not to fail due
> to the singular linear system.
>
Passing options to PETSc: -ksp_monitor -ksp_view -log_summary
Solving linear system of size 112724 x 112724 (PETSc Krylov solver).
  0 KSP Residual norm 5.017503570069e+02 
  1 KSP Residual norm 3.737157859305e+01 
  2 KSP Residual norm 2.979324808435e+01 
  3 KSP Residual norm 7.076390589707e+00 
  4 KSP Residual norm 6.651019208389e+00 
  5 KSP Residual norm 3.040640197522e+00 
  6 KSP Residual norm 2.936453097103e+00 
  7 KSP Residual norm 1.443287107783e+00 
  8 KSP Residual norm 1.417561465252e+00 
  9 KSP Residual norm 7.533915944325e-01 
 10 KSP Residual norm 6.623728338048e-01 
 11 KSP Residual norm 4.561495922812e-01 
 12 KSP Residual norm 3.555846865824e-01 
 13 KSP Residual norm 2.964732303563e-01 
 14 KSP Residual norm 1.930070052823e-01 
 15 KSP Residual norm 1.792585450624e-01 
 16 KSP Residual norm 1.143903220357e-01 
 17 KSP Residual norm 1.048967460021e-01 
 18 KSP Residual norm 6.301418914100e-02 
 19 KSP Residual norm 6.250520956617e-02 
 20 KSP Residual norm 4.106922874156e-02 
 21 KSP Residual norm 4.104646869635e-02 
 22 KSP Residual norm 2.664758594872e-02 
 23 KSP Residual norm 2.597907401397e-02 
 24 KSP Residual norm 1.655514236429e-02 
 25 KSP Residual norm 1.579419759534e-02 
 26 KSP Residual norm 1.043065949981e-02 
 27 KSP Residual norm 9.768856836029e-03 
 28 KSP Residual norm 7.006258580611e-03 
 29 KSP Residual norm 6.103060333635e-03 
 30 KSP Residual norm 4.620147218783e-03 
 31 KSP Residual norm 3.513987400913e-03 
 32 KSP Residual norm 2.772766351999e-03 
 33 KSP Residual norm 1.891637515779e-03 
 34 KSP Residual norm 1.565364643042e-03 
 35 KSP Residual norm 9.159139436359e-04 
 36 KSP Residual norm 8.825693602071e-04 
 37 KSP Residual norm 4.792409993813e-04 
KSP Object: 1 MPI processes
  type: minres
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-06, absolute=1e-15, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: hypre
    HYPRE BoomerAMG preconditioning
    HYPRE BoomerAMG: Cycle type V
    HYPRE BoomerAMG: Maximum number of levels 25
    HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
    HYPRE BoomerAMG: Convergence tolerance PER hypre call 0
    HYPRE BoomerAMG: Threshold for strong coupling 0.25
    HYPRE BoomerAMG: Interpolation truncation factor 0
    HYPRE BoomerAMG: Interpolation: max elements per row 0
    HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
    HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
    HYPRE BoomerAMG: Maximum row sums 0.9
    HYPRE BoomerAMG: Sweeps down         1
    HYPRE BoomerAMG: Sweeps up           1
    HYPRE BoomerAMG: Sweeps on coarse    1
    HYPRE BoomerAMG: Relax down          symmetric-SOR/Jacobi
    HYPRE BoomerAMG: Relax up            symmetric-SOR/Jacobi
    HYPRE BoomerAMG: Relax on coarse     Gaussian-elimination
    HYPRE BoomerAMG: Relax weight  (all)      1
    HYPRE BoomerAMG: Outer relax weight (all) 1
    HYPRE BoomerAMG: Using CF-relaxation
    HYPRE BoomerAMG: Measure type        local
    HYPRE BoomerAMG: Coarsen type        Falgout
    HYPRE BoomerAMG: Interpolation type  classical
  linear system matrix followed by preconditioner matrix:
  Matrix Object:   1 MPI processes
    type: seqaij
    rows=112724, cols=112724
    total: nonzeros=10553536, allocated nonzeros=10553536
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 77769 nodes, limit used is 5
  Matrix Object:   1 MPI processes
    type: seqaij
    rows=112724, cols=112724
    total: nonzeros=10553536, allocated nonzeros=10553536
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 77769 nodes, limit used is 5
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

Unknown Name on a linux-gnu-c-opt named pacotaco-xps with 1 processor, by justin Tue Jun  2 14:17:00 2015
Using Petsc Release Version 3.4.2, Jul, 02, 2013 

                         Max       Max/Min        Avg      Total 
Time (sec):           1.743e+01      1.00000   1.743e+01
Objects:              5.000e+01      1.00000   5.000e+01
Flops:                8.646e+08      1.00000   8.646e+08  8.646e+08
Flops/sec:            4.960e+07      1.00000   4.960e+07  4.960e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       1.360e+02      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.7433e+01 100.0%  8.6460e+08 100.0%  0.000e+00   0.0%  0.000e+00        0.0%  1.350e+02  99.3% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Viewer     1              0            0     0
           Index Set     6              6         4584     0
   IS L to G Mapping    10             10      3162152     0
              Vector    26             26     12664544     0
      Vector Scatter     3              3         1932     0
              Matrix     2              2    256897368     0
      Preconditioner     1              1         1072     0
       Krylov Solver     1              1         1160     0
========================================================================================================================
Average time to get PetscTime(): 0
#PETSc Option Table entries:
-ksp_monitor
-ksp_view
-log_summary
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Tue Dec 17 23:10:14 2013
Configure options: --with-shared-libraries --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1 --with-blacs-include=/usr/include --with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]" --with-scalapack=1 --with-scalapack-include=/usr/include --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1 --with-mumps-include=/usr/include --with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]" --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-cholmod=1 --with-cholmod-include=/usr/include/suitesparse --with-cholmod-lib=/usr/lib/libcholmod.so --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="[/usr/lib/libptesmumps.so,/usr/lib/libptscotch.so,/usr/lib/libptscotcherr.so]" --with-fftw=1 --with-fftw-include=/usr/include --with-fftw-lib="[/usr/lib/x86_64-linux-gnu/libfftw3.so,/usr/lib/x86_64-linux-gnu/libfftw3_mpi.so]" --CXX_LINKER_FLAGS=-Wl,--no-as-needed
-----------------------------------------
Libraries compiled on Tue Dec 17 23:10:14 2013 on lamiak 
Machine characteristics: Linux-3.2.0-37-generic-x86_64-with-Ubuntu-14.04-trusty
Using PETSc directory: /build/buildd/petsc-3.4.2.dfsg1
Using PETSc arch: linux-gnu-c-opt
-----------------------------------------

Using C compiler: mpicxx  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O   -fPIC   ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: mpif90  -fPIC -Wall -Wno-unused-variable -Wno-unused-dummy-argument -O   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/include -I/build/buildd/petsc-3.4.2.dfsg1/include -I/build/buildd/petsc-3.4.2.dfsg1/include -I/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/include -I/usr/include -I/usr/include/suitesparse -I/usr/include/scotch -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi
-----------------------------------------

Using C linker: mpicxx
Using Fortran linker: mpif90
Using libraries: -L/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib -L/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib -lpetsc -L/usr/lib -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lHYPRE_utilities -lHYPRE_struct_mv -lHYPRE_struct_ls -lHYPRE_sstruct_mv -lHYPRE_sstruct_ls -lHYPRE_IJ_mv -lHYPRE_parcsr_ls -lcholmod -lumfpack -lamd -llapack -lblas -lX11 -lpthread -lptesmumps -lptscotch -lptscotcherr -L/usr/lib/x86_64-linux-gnu -lfftw3 -lfftw3_mpi -lm -L/usr/lib/openmpi/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.8 -L/lib/x86_64-linux-gnu -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl 
-----------------------------------------

Passing options to PETSc: -ksp_monitor -ksp_view -log_summary
Solving linear system of size 137300 x 137300 (PETSc Krylov solver).
  0 KSP Residual norm 6.984814267994e+02 
  1 KSP Residual norm 5.493325830728e+01 
  2 KSP Residual norm 4.386928799903e+01 
  3 KSP Residual norm 1.031431631122e+01 
  4 KSP Residual norm 9.683970801727e+00 
  5 KSP Residual norm 4.492681679872e+00 
  6 KSP Residual norm 4.340306547098e+00 
  7 KSP Residual norm 2.135394287667e+00 
  8 KSP Residual norm 2.097412712810e+00 
  9 KSP Residual norm 1.096236093731e+00 
 10 KSP Residual norm 9.535512231328e-01 
 11 KSP Residual norm 6.917690133823e-01 
 12 KSP Residual norm 5.386662543399e-01 
 13 KSP Residual norm 4.616502798336e-01 
 14 KSP Residual norm 3.102893969880e-01 
 15 KSP Residual norm 2.917421220343e-01 
 16 KSP Residual norm 1.918400191865e-01 
 17 KSP Residual norm 1.812755188650e-01 
 18 KSP Residual norm 1.158182979594e-01 
 19 KSP Residual norm 1.151908164682e-01 
 20 KSP Residual norm 9.058449170786e-02 
 21 KSP Residual norm 9.055898020223e-02 
 22 KSP Residual norm 7.425649717505e-02 
 23 KSP Residual norm 7.351535447599e-02 
 24 KSP Residual norm 6.301950557241e-02 
 25 KSP Residual norm 6.261047067584e-02 
 26 KSP Residual norm 5.359165957152e-02 
 27 KSP Residual norm 5.308994456210e-02 
 28 KSP Residual norm 4.768264761089e-02 
 29 KSP Residual norm 4.721831467684e-02 
 30 KSP Residual norm 4.403681236290e-02 
 31 KSP Residual norm 4.318350122247e-02 
 32 KSP Residual norm 4.076488155961e-02 
 33 KSP Residual norm 3.982012042849e-02 
 34 KSP Residual norm 3.751275756065e-02 
 35 KSP Residual norm 3.658221022369e-02 
 36 KSP Residual norm 3.469321601254e-02 
 37 KSP Residual norm 3.379481252363e-02 
 38 KSP Residual norm 3.255925443205e-02 
 39 KSP Residual norm 3.162015121674e-02 
 40 KSP Residual norm 3.076771637763e-02 
 41 KSP Residual norm 2.965658595649e-02 
 42 KSP Residual norm 2.898320251436e-02 
 43 KSP Residual norm 2.780019547061e-02 
 44 KSP Residual norm 2.726851737782e-02 
 45 KSP Residual norm 2.594327781331e-02 
 46 KSP Residual norm 2.566217495496e-02 
 47 KSP Residual norm 2.437797968484e-02 
 48 KSP Residual norm 2.422341873735e-02 
 49 KSP Residual norm 2.290120676285e-02 
 50 KSP Residual norm 2.285352901537e-02 
 51 KSP Residual norm 2.164514606185e-02 
 52 KSP Residual norm 2.162907078765e-02 
 53 KSP Residual norm 2.051414555528e-02 
 54 KSP Residual norm 2.051095271222e-02 
 55 KSP Residual norm 1.944730228799e-02 
 56 KSP Residual norm 1.944668418773e-02 
 57 KSP Residual norm 1.847189038629e-02 
 58 KSP Residual norm 1.846732901693e-02 
 59 KSP Residual norm 1.753212431887e-02 
 60 KSP Residual norm 1.751835459746e-02 
 61 KSP Residual norm 1.668094765639e-02 
 62 KSP Residual norm 1.664561112552e-02 
 63 KSP Residual norm 1.595493992865e-02 
 64 KSP Residual norm 1.587138592544e-02 
 65 KSP Residual norm 1.526674252314e-02 
 66 KSP Residual norm 1.516389755723e-02 
 67 KSP Residual norm 1.456461445420e-02 
 68 KSP Residual norm 1.443066937800e-02 
 69 KSP Residual norm 1.396237329206e-02 
 70 KSP Residual norm 1.377290629743e-02 
 71 KSP Residual norm 1.343107220724e-02 
 72 KSP Residual norm 1.315264418425e-02 
 73 KSP Residual norm 1.289666837457e-02 
 74 KSP Residual norm 1.258663729938e-02 
 75 KSP Residual norm 1.236886249713e-02 
 76 KSP Residual norm 1.200793893597e-02 
 77 KSP Residual norm 1.185150980899e-02 
 78 KSP Residual norm 1.150879645690e-02 
 79 KSP Residual norm 1.138370996242e-02 
 80 KSP Residual norm 1.103680924490e-02 
 81 KSP Residual norm 1.094774068890e-02 
 82 KSP Residual norm 1.057788700323e-02 
 83 KSP Residual norm 1.052819244959e-02 
 84 KSP Residual norm 1.014274382922e-02 
 85 KSP Residual norm 1.012114645483e-02 
 86 KSP Residual norm 9.761031158883e-03 
 87 KSP Residual norm 9.747153655619e-03 
 88 KSP Residual norm 9.387960157131e-03 
 89 KSP Residual norm 9.381828538579e-03 
 90 KSP Residual norm 9.014277231204e-03 
 91 KSP Residual norm 9.014229856726e-03 
 92 KSP Residual norm 8.679417056348e-03 
 93 KSP Residual norm 8.672530935538e-03 
 94 KSP Residual norm 8.401096695481e-03 
 95 KSP Residual norm 8.386647501312e-03 
 96 KSP Residual norm 8.136805688624e-03 
 97 KSP Residual norm 8.120877067430e-03 
 98 KSP Residual norm 7.876344692055e-03 
 99 KSP Residual norm 7.847985478810e-03 
100 KSP Residual norm 7.628475446401e-03 
101 KSP Residual norm 7.592179030410e-03 
102 KSP Residual norm 7.404740000094e-03 
103 KSP Residual norm 7.357650097477e-03 
104 KSP Residual norm 7.195462011562e-03 
105 KSP Residual norm 7.134532459633e-03 
106 KSP Residual norm 6.995374858274e-03 
107 KSP Residual norm 6.898299033748e-03 
108 KSP Residual norm 6.788894145164e-03 
109 KSP Residual norm 6.663166054082e-03 
110 KSP Residual norm 6.576480180115e-03 
111 KSP Residual norm 6.436511247560e-03 
112 KSP Residual norm 6.371497085621e-03 
113 KSP Residual norm 6.212877009186e-03 
114 KSP Residual norm 6.162841840312e-03 
115 KSP Residual norm 6.008296261843e-03 
116 KSP Residual norm 5.970498088037e-03 
117 KSP Residual norm 5.803609398817e-03 
118 KSP Residual norm 5.782551823872e-03 
119 KSP Residual norm 5.614965018938e-03 
120 KSP Residual norm 5.600205150404e-03 
121 KSP Residual norm 5.433132880196e-03 
122 KSP Residual norm 5.425731162545e-03 
123 KSP Residual norm 5.256874096972e-03 
124 KSP Residual norm 5.254927993095e-03 
125 KSP Residual norm 5.089163105699e-03 
126 KSP Residual norm 5.089102966172e-03 
127 KSP Residual norm 4.929437575546e-03 
128 KSP Residual norm 4.929347790169e-03 
129 KSP Residual norm 4.775351896223e-03 
130 KSP Residual norm 4.775131087172e-03 
131 KSP Residual norm 4.623116520749e-03 
132 KSP Residual norm 4.621194521596e-03 
133 KSP Residual norm 4.474968977168e-03 
134 KSP Residual norm 4.467972810766e-03 
135 KSP Residual norm 4.335475722927e-03 
136 KSP Residual norm 4.321370236881e-03 
137 KSP Residual norm 4.192673391804e-03 
138 KSP Residual norm 4.169969918403e-03 
139 KSP Residual norm 4.056508102829e-03 
140 KSP Residual norm 4.033211133088e-03 
141 KSP Residual norm 3.926240479896e-03 
142 KSP Residual norm 3.898157230334e-03 
143 KSP Residual norm 3.798372393385e-03 
144 KSP Residual norm 3.764560363026e-03 
145 KSP Residual norm 3.679026290926e-03 
146 KSP Residual norm 3.636401035897e-03 
147 KSP Residual norm 3.564834809452e-03 
148 KSP Residual norm 3.509431881247e-03 
149 KSP Residual norm 3.449437869205e-03 
150 KSP Residual norm 3.391545980988e-03 
151 KSP Residual norm 3.342466836526e-03 
152 KSP Residual norm 3.282797985900e-03 
153 KSP Residual norm 3.236187780843e-03 
154 KSP Residual norm 3.176352807341e-03 
155 KSP Residual norm 3.137731081773e-03 
156 KSP Residual norm 3.064970523613e-03 
157 KSP Residual norm 3.043173685714e-03 
158 KSP Residual norm 2.956071465814e-03 
159 KSP Residual norm 2.944906845505e-03 
160 KSP Residual norm 2.856071804521e-03 
161 KSP Residual norm 2.849329676813e-03 
162 KSP Residual norm 2.760832406111e-03 
163 KSP Residual norm 2.756822446599e-03 
164 KSP Residual norm 2.674362715263e-03 
165 KSP Residual norm 2.672426700073e-03 
166 KSP Residual norm 2.592010161891e-03 
167 KSP Residual norm 2.591413647724e-03 
168 KSP Residual norm 2.508700856533e-03 
169 KSP Residual norm 2.508655178886e-03 
170 KSP Residual norm 2.427351730383e-03 
171 KSP Residual norm 2.426048977689e-03 
172 KSP Residual norm 2.352717573255e-03 
173 KSP Residual norm 2.350063898881e-03 
174 KSP Residual norm 2.279322998160e-03 
175 KSP Residual norm 2.274010421713e-03 
176 KSP Residual norm 2.210457546549e-03 
177 KSP Residual norm 2.202113341316e-03 
178 KSP Residual norm 2.144982008671e-03 
179 KSP Residual norm 2.133951020469e-03 
180 KSP Residual norm 2.081106686809e-03 
181 KSP Residual norm 2.064012313007e-03 
182 KSP Residual norm 2.020058317180e-03 
183 KSP Residual norm 1.998545518086e-03 
184 KSP Residual norm 1.961873939473e-03 
185 KSP Residual norm 1.935792219075e-03 
186 KSP Residual norm 1.906497948694e-03 
187 KSP Residual norm 1.875762773903e-03 
188 KSP Residual norm 1.851367084724e-03 
189 KSP Residual norm 1.816317110460e-03 
190 KSP Residual norm 1.795806462799e-03 
191 KSP Residual norm 1.756858290038e-03 
192 KSP Residual norm 1.741666680282e-03 
193 KSP Residual norm 1.698062986392e-03 
194 KSP Residual norm 1.689303898506e-03 
195 KSP Residual norm 1.642807981113e-03 
196 KSP Residual norm 1.638904306608e-03 
197 KSP Residual norm 1.593028881256e-03 
198 KSP Residual norm 1.591166714592e-03 
199 KSP Residual norm 1.546244741796e-03 
200 KSP Residual norm 1.544875627591e-03 
201 KSP Residual norm 1.502370789533e-03 
202 KSP Residual norm 1.500495175772e-03 
203 KSP Residual norm 1.461024931520e-03 
204 KSP Residual norm 1.458811768696e-03 
205 KSP Residual norm 1.423249804911e-03 
206 KSP Residual norm 1.421578904828e-03 
207 KSP Residual norm 1.386308667308e-03 
208 KSP Residual norm 1.385830483318e-03 
209 KSP Residual norm 1.350490466635e-03 
210 KSP Residual norm 1.350456088142e-03 
211 KSP Residual norm 1.318737063084e-03 
212 KSP Residual norm 1.318022566097e-03 
213 KSP Residual norm 1.288152493222e-03 
214 KSP Residual norm 1.285739719094e-03 
215 KSP Residual norm 1.258790266827e-03 
216 KSP Residual norm 1.253748922773e-03 
217 KSP Residual norm 1.232016874754e-03 
218 KSP Residual norm 1.225638505300e-03 
219 KSP Residual norm 1.207003584855e-03 
220 KSP Residual norm 1.199615759860e-03 
221 KSP Residual norm 1.181319315443e-03 
222 KSP Residual norm 1.171975374537e-03 
223 KSP Residual norm 1.155466566306e-03 
224 KSP Residual norm 1.143947780652e-03 
225 KSP Residual norm 1.130291551822e-03 
226 KSP Residual norm 1.116741479989e-03 
227 KSP Residual norm 1.106315210888e-03 
228 KSP Residual norm 1.089951820534e-03 
229 KSP Residual norm 1.082894133823e-03 
230 KSP Residual norm 1.063831639968e-03 
231 KSP Residual norm 1.060423051198e-03 
232 KSP Residual norm 1.040760147285e-03 
233 KSP Residual norm 1.038948669603e-03 
234 KSP Residual norm 1.018986703920e-03 
235 KSP Residual norm 1.018142432958e-03 
236 KSP Residual norm 9.980695764619e-04 
237 KSP Residual norm 9.979515968113e-04 
238 KSP Residual norm 9.793792227025e-04 
239 KSP Residual norm 9.793717135410e-04 
240 KSP Residual norm 9.618792461217e-04 
241 KSP Residual norm 9.615337767538e-04 
242 KSP Residual norm 9.447739551746e-04 
243 KSP Residual norm 9.436801422838e-04 
244 KSP Residual norm 9.284147767688e-04 
245 KSP Residual norm 9.267607907445e-04 
246 KSP Residual norm 9.133980843083e-04 
247 KSP Residual norm 9.117377646607e-04 
248 KSP Residual norm 8.994973201513e-04 
249 KSP Residual norm 8.978136832035e-04 
250 KSP Residual norm 8.862457521110e-04 
251 KSP Residual norm 8.830530626220e-04 
252 KSP Residual norm 8.728368040612e-04 
253 KSP Residual norm 8.679056613416e-04 
254 KSP Residual norm 8.587019562844e-04 
255 KSP Residual norm 8.519650508993e-04 
256 KSP Residual norm 8.441997185200e-04 
257 KSP Residual norm 8.364430350460e-04 
258 KSP Residual norm 8.299638968203e-04 
259 KSP Residual norm 8.216076970142e-04 
260 KSP Residual norm 8.163746492791e-04 
261 KSP Residual norm 8.066411406347e-04 
262 KSP Residual norm 8.028515211342e-04 
263 KSP Residual norm 7.928897005764e-04 
264 KSP Residual norm 7.900714115635e-04 
265 KSP Residual norm 7.797479647780e-04 
266 KSP Residual norm 7.775624496561e-04 
267 KSP Residual norm 7.652474269250e-04 
268 KSP Residual norm 7.634522920052e-04 
269 KSP Residual norm 7.490739584669e-04 
270 KSP Residual norm 7.478473965427e-04 
271 KSP Residual norm 7.326737060656e-04 
272 KSP Residual norm 7.319995713132e-04 
273 KSP Residual norm 7.157869349845e-04 
274 KSP Residual norm 7.155001681877e-04 
275 KSP Residual norm 6.981457453348e-04 
KSP Object: 1 MPI processes
  type: minres
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-06, absolute=1e-15, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: hypre
    HYPRE BoomerAMG preconditioning
    HYPRE BoomerAMG: Cycle type V
    HYPRE BoomerAMG: Maximum number of levels 25
    HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
    HYPRE BoomerAMG: Convergence tolerance PER hypre call 0
    HYPRE BoomerAMG: Threshold for strong coupling 0.25
    HYPRE BoomerAMG: Interpolation truncation factor 0
    HYPRE BoomerAMG: Interpolation: max elements per row 0
    HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
    HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
    HYPRE BoomerAMG: Maximum row sums 0.9
    HYPRE BoomerAMG: Sweeps down         1
    HYPRE BoomerAMG: Sweeps up           1
    HYPRE BoomerAMG: Sweeps on coarse    1
    HYPRE BoomerAMG: Relax down          symmetric-SOR/Jacobi
    HYPRE BoomerAMG: Relax up            symmetric-SOR/Jacobi
    HYPRE BoomerAMG: Relax on coarse     Gaussian-elimination
    HYPRE BoomerAMG: Relax weight  (all)      1
    HYPRE BoomerAMG: Outer relax weight (all) 1
    HYPRE BoomerAMG: Using CF-relaxation
    HYPRE BoomerAMG: Measure type        local
    HYPRE BoomerAMG: Coarsen type        Falgout
    HYPRE BoomerAMG: Interpolation type  classical
  linear system matrix followed by preconditioner matrix:
  Matrix Object:   1 MPI processes
    type: seqaij
    rows=137300, cols=137300
    total: nonzeros=12249280, allocated nonzeros=12249280
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 101325 nodes, limit used is 5
  Matrix Object:   1 MPI processes
    type: seqaij
    rows=137300, cols=137300
    total: nonzeros=12249280, allocated nonzeros=12249280
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 101325 nodes, limit used is 5
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

Unknown Name on a linux-gnu-c-opt named pacotaco-xps with 1 processor, by justin Tue Jun  2 14:14:26 2015
Using Petsc Release Version 3.4.2, Jul, 02, 2013 

                         Max       Max/Min        Avg      Total 
Time (sec):           8.743e+01      1.00000   8.743e+01
Objects:              5.000e+01      1.00000   5.000e+01
Flops:                7.493e+09      1.00000   7.493e+09  7.493e+09
Flops/sec:            8.570e+07      1.00000   8.570e+07  8.570e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       6.120e+02      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 8.7432e+01 100.0%  7.4925e+09 100.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.110e+02  99.8% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Viewer     1              0            0     0
           Index Set     6              6         4584     0
   IS L to G Mapping    10             10      3850280     0
              Vector    26             26     15417056     0
      Vector Scatter     3              3         1932     0
              Matrix     2              2    298381656     0
      Preconditioner     1              1         1072     0
       Krylov Solver     1              1         1160     0
========================================================================================================================
Average time to get PetscTime(): 0
#PETSc Option Table entries:
-ksp_monitor
-ksp_view
-log_summary
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Tue Dec 17 23:10:14 2013
Configure options: --with-shared-libraries --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1 --with-blacs-include=/usr/include --with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]" --with-scalapack=1 --with-scalapack-include=/usr/include --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1 --with-mumps-include=/usr/include --with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]" --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-cholmod=1 --with-cholmod-include=/usr/include/suitesparse --with-cholmod-lib=/usr/lib/libcholmod.so --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="[/usr/lib/libptesmumps.so,/usr/lib/libptscotch.so,/usr/lib/libptscotcherr.so]" --with-fftw=1 --with-fftw-include=/usr/include --with-fftw-lib="[/usr/lib/x86_64-linux-gnu/libfftw3.so,/usr/lib/x86_64-linux-gnu/libfftw3_mpi.so]" --CXX_LINKER_FLAGS=-Wl,--no-as-needed
-----------------------------------------
Libraries compiled on Tue Dec 17 23:10:14 2013 on lamiak 
Machine characteristics: Linux-3.2.0-37-generic-x86_64-with-Ubuntu-14.04-trusty
Using PETSc directory: /build/buildd/petsc-3.4.2.dfsg1
Using PETSc arch: linux-gnu-c-opt
-----------------------------------------

Using C compiler: mpicxx  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O   -fPIC   ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: mpif90  -fPIC -Wall -Wno-unused-variable -Wno-unused-dummy-argument -O   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/include -I/build/buildd/petsc-3.4.2.dfsg1/include -I/build/buildd/petsc-3.4.2.dfsg1/include -I/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/include -I/usr/include -I/usr/include/suitesparse -I/usr/include/scotch -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi
-----------------------------------------

Using C linker: mpicxx
Using Fortran linker: mpif90
Using libraries: -L/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib -L/build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib -lpetsc -L/usr/lib -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lHYPRE_utilities -lHYPRE_struct_mv -lHYPRE_struct_ls -lHYPRE_sstruct_mv -lHYPRE_sstruct_ls -lHYPRE_IJ_mv -lHYPRE_parcsr_ls -lcholmod -lumfpack -lamd -llapack -lblas -lX11 -lpthread -lptesmumps -lptscotch -lptscotcherr -L/usr/lib/x86_64-linux-gnu -lfftw3 -lfftw3_mpi -lm -L/usr/lib/openmpi/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.8 -L/lib/x86_64-linux-gnu -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl 
-----------------------------------------
