Re: [petsc-dev] testing on Titan

2018-06-29 Thread Richard Tran Mills
Hi Mark,

I've got an OLCF account but don't have access to Titan. Would it be
helpful for me to apply to get on whatever allocation you are using so I
can poke around at this stuff? (If so, send me a message off list with
details of what I should apply for.)

Hopefully I could get on there faster than it took them to enable me on
Summit. I had volunteered to help with issues on Summit some time ago, but
it took so long for them to enable my account that it no longer mattered.

--Richard

On Fri, Jun 29, 2018 at 7:12 AM, Mark Adams  wrote:

> Thanks all, I knew this, but it's been a while since I've run on this
> filesystem setup.
>
> On Fri, Jun 29, 2018 at 9:15 AM Seung-Hoe Ku  wrote:
>
>> Try copying files to the scratch file system.
>> Home is not available when executing a batch job. Probably the
>> interactive shell too.
>>
>> On Fri, Jun 29, 2018 at 7:47 AM Mark Adams  wrote:
>>
>>> I can not even run a job here. This is from an interactive shell:
>>>
>>> 64-pgi ex19-batch1:~/petsc/src/snes/examples/tutorials> make
>>> PETSC_DIR=/autofs/nccs-svm1_home1/adams/petsc PETSC_ARCH=arch-titan-dbg6
>>> cc -o ex19.o -c -fast -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
>>> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/include
>>> -I/ccs/proj/env003/petscv3.9-dbg64-pgi/include`pwd`/ex19.c
>>> cc -fast -mp  -o ex19  ex19.o -L/autofs/nccs-svm1_home1/
>>> adams/petsc/arch-titan-dbg64-pgi/lib 
>>> -Wl,-rpath,/ccs/proj/env003/petscv3.9-dbg64-pgi/lib
>>> -L/ccs/proj/env003/petscv3.9-dbg64-pgi/lib -lpetsc -lHYPRE -lflapack
>>> -lfblas -lparmetis -lmetis -lstdc++ -ldl
>>> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(dlimpl.o):
>>> In function `PetscDLOpen':
>>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:54: warning:
>>> Using 'dlopen' in statically linked applications requires at runtime the
>>> shared libraries from the glibc version used for linking
>>> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(send.o):
>>> In function `PetscOpenSocket':
>>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:106:
>>> warning: Using 'gethostbyname' in statically linked applications requires
>>> at runtime the shared libraries from the glibc version used for linking
>>> adams@titan-batch1:~/petsc/src/snes/examples/tutorials> aprun -n 1
>>> ./ex19
>>> [NID 02258] 2018-06-29 07:45:09 Exec ./ex19 failed: chdir
>>> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>>> file or directory
>>> adams@titan-batch1:~/petsc/src/snes/examples/tutorials>
>>>
>>> Seung-Hoe runs here, maybe he can give me advice on what is wrong here.
>>>
>>>
>>> On Thu, Jun 28, 2018 at 7:53 PM Smith, Barry F. 
>>> wrote:
>>>

>>>>    1) We need to get this PGI compiler into our regular testing. It
>>>> doesn't like some Fortran code the other compilers are ok with
>>>>
>>>>    2) Mark,  likely you just need to submit the examples manually to
>>>> run them. cd src/snes/examples/tutorials; make ex19  then use the batch
>>>> system to submit the example
>>>>
>>>>    Barry


 > On Jun 28, 2018, at 5:35 PM, Mark Adams  wrote:
 >
 > I am trying to run on Titan and I am having problems running the
 tests. I get this output. It says it can not find a directory, but I do an
 ls after the error and you can see that it is there.
 >
 > I've attached the logs.
 >
 > Any ideas?
 > Thanks,
 > Mark
 >
 > adams@titan-login6:~/petsc> make 
 > PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi
 PETSC_ARCH="" test
 > Running test examples to verify correct installation
 > Using PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi and PETSC_ARCH=
 > ***Error detected during compile or
 link!***
 > See http://www.mcs.anl.gov/petsc/documentation/faq.html
 > /ccs/home/adams/petsc/src/snes/examples/tutorials ex19
 > 
 *
 > cc -o ex19.o -c -fast -mp   
 > -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
 -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
 -I/ccs/proj/env003/petscv3.9-opt64-pgi/include`pwd`/ex19.c
 > cc -fast -mp  -o ex19  ex19.o -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib
 -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
 -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
 -lfblas -lparmetis -lmetis -lstdc++ -ldl
 > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(dlimpl.o): In
 function `PetscDLOpen':
 > /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:53:
 warning: Using 'dlopen' in statically linked applications requires at
 runtime the shared libraries from the glibc version used for linking
 > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(send.o): In
 function `PetscOpenSocket':
 > 

Re: [petsc-dev] GAMG and custom MatMults in smoothers

2018-06-29 Thread Smith, Barry F.


> On Jun 29, 2018, at 9:33 AM, Vaclav Hapla  wrote:
> 
> 
> 
>> 22. 6. 2018 v 17:47, Smith, Barry F. :
>> 
>> 
>> 
>>> On Jun 22, 2018, at 5:43 AM, Pierre Jolivet  
>>> wrote:
>>> 
>>> Hello,
>>> I’m solving a system using a MATSHELL and PCGAMG.
>>> The MPIAIJ Mat I’m giving to GAMG has a specific structure (inherited from 
>>> the MATSHELL) I’d like to exploit during the solution phase when the 
>>> smoother on the finest level is doing MatMults.
>>> 
>>> Is there some way to:
>>> 1) decouple in -log_view the time spent in the MATSHELL MatMult and in the 
>>> smoothers MatMult
>> 
>>  You can register a new event and then inside your MATSHELL MatMult() call 
>> PetscLogEventBegin/End on your new event.
>> 
>>   Note that the MatMult() line will still contain the time for your MatShell 
>> mult, so you will need to subtract it off to get the time for your non-shell 
>> matmults.
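
[A minimal sketch of what Barry describes, assuming the PETSc 3.9-era C API;
the event name "MyShellMult" and the function MyShellMult() are made-up
illustrations, not PETSc API:

   static PetscLogEvent MY_SHELL_MULT;

   static PetscErrorCode MyShellMult(Mat A,Vec x,Vec y)
   {
     PetscErrorCode ierr;

     PetscFunctionBeginUser;
     ierr = PetscLogEventBegin(MY_SHELL_MULT,A,x,y,0);CHKERRQ(ierr);
     /* ... the actual shell matrix-vector product ... */
     ierr = PetscLogEventEnd(MY_SHELL_MULT,A,x,y,0);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }

and, once during setup,

   ierr = PetscLogEventRegister("MyShellMult",MAT_CLASSID,&MY_SHELL_MULT);CHKERRQ(ierr);

The new event then gets its own line in -log_view, which can be subtracted
from the MatMult line as Barry says.]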
> 
> In PERMON, we sometimes have a quite complicated hierarchy of wrapped 
> matrices and want to measure MatMult{,Transpose,Add,TransposeAdd} separately 
> for particular ones. Think e.g. of having an additive MATCOMPOSITE wrapping 
> a multiplicative MATCOMPOSITE wrapping a MATTRANSPOSE wrapping a MATAIJ. You 
> want to measure this MATAIJ instance's MatMult separately, but you surely 
> don't want to rewrite the implementation of MatMult_Transpose or force 
> yourself to use MATSHELL just to hang the events on MatMult*.
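
[A hedged sketch of the nesting Vaclav describes, assuming the PETSc 3.9-era
C API; A, B, C, and comm are placeholders:

   Mat At,inner,outer,mats[2];

   ierr = MatCreateTranspose(A,&At);CHKERRQ(ierr);              /* MATTRANSPOSE wrapping MATAIJ A */
   mats[0] = At; mats[1] = B;
   ierr = MatCreateComposite(comm,2,mats,&inner);CHKERRQ(ierr); /* multiplicative MATCOMPOSITE */
   ierr = MatCompositeSetType(inner,MAT_COMPOSITE_MULTIPLICATIVE);CHKERRQ(ierr);
   mats[0] = inner; mats[1] = C;
   ierr = MatCreateComposite(comm,2,mats,&outer);CHKERRQ(ierr); /* additive MATCOMPOSITE (the default) */

A MatMult() on outer ends up calling MatMultTranspose() on A, and there is
currently no separate -log_view line for A alone.]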
> 
> We had a special wrapper type that just added some prefix to the events for 
> the given object, but this is not nice. What about adding functionality to 
> PetscLogEventBegin/End that would distinguish based on the first 
> PetscObject's name or options prefix? Of course this would be optional, so as 
> not to break anyone relying on the current behavior - e.g. enabled under 
> something like -log_view_by_name. To me it's quite an elegant solution, 
> working for any PetscObject and any event.

   This could get ugly real fast. For example, for vector operations there may 
be dozens of named vectors; does each one get its own logging? You'd have to 
make sure that only the objects you care about get named. Is that possible?

I don't know if there is a good solution within the PETSc logging 
infrastructure to get what you want but maybe what you propose is the best 
possible.

   Barry

> 
> I can do that if I get some upvotes.
> 
> Vaclav
> 
>> 
>>> 2) hardwire a specific MatMult implementation for the smoother on the 
>>> finest level
>> 
>>  In the latest release you can use MatSetOperation() to override the normal 
>> matrix-vector product with anything else you want. 
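
[A minimal sketch of using MatSetOperation() this way, assuming the PETSc
3.9-era C API and that PCSetUp() has already run on the PCGAMG pc;
MyMatMult_Fast is a made-up name for the structure-exploiting product:

   static PetscErrorCode MyMatMult_Fast(Mat A,Vec x,Vec y)
   {
     PetscFunctionBeginUser;
     /* exploit the known structure of A here instead of the generic product */
     PetscFunctionReturn(0);
   }

   PetscInt nlevels;
   KSP      smoother;
   Mat      Amat;

   ierr = PCMGGetLevels(pc,&nlevels);CHKERRQ(ierr);
   ierr = PCMGGetSmoother(pc,nlevels-1,&smoother);CHKERRQ(ierr); /* finest level */
   ierr = KSPGetOperators(smoother,&Amat,NULL);CHKERRQ(ierr);
   ierr = MatSetOperation(Amat,MATOP_MULT,(void (*)(void))MyMatMult_Fast);CHKERRQ(ierr);

Only the operator the finest-level smoother applies is affected.]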
>> 
>>> 
>>> Thanks in advance,
>>> Pierre
>>> 
>>> PS : here is what I have right now,
>>> MatMult  118 1.0 1.0740e+02 1.6 1.04e+13 1.6 1.7e+06 6.1e+05 
>>> 0.0e+00 47100 90 98  0  47100 90 98  0 81953703
>>> […]
>>> PCSetUp2 1.0 8.6513e+00 1.0 1.01e+09 1.7 2.6e+05 4.0e+05 
>>> 1.8e+02  5  0 14 10 66   5  0 14 10 68 94598
>>> PCApply   14 1.0 8.0373e+01 1.1 9.06e+12 1.6 1.3e+06 6.0e+05 
>>> 2.1e+01 45 87 72 78  8  45 87 72 78  8 95365211 // I’m guessing a lot of 
>>> time here is being wasted in doing inefficient MatMults on the finest level 
>>> but this is only speculation
>>> 
>>> Same code with -pc_type none -ksp_max_it 13,
>>> MatMult   14 1.0 1.2936e+01 1.7 1.35e+12 1.6 2.0e+05 6.1e+05 
>>> 0.0e+00 15100 78 93  0  15100 78 93  0 88202079
>>> 
>>> The grid itself is rather simple (two levels, extremely aggressive 
>>> coarsening),
>>>  type is MULTIPLICATIVE, levels=2 cycles=v
>>>  KSP Object: (mg_coarse_) 1024 MPI processes
>>> linear system matrix = precond matrix:
>>>Mat Object: 1024 MPI processes
>>>  type: mpiaij
>>>  rows=775, cols=775
>>>  total: nonzeros=1793, allocated nonzeros=1793
>>> 
>>> linear system matrix followed by preconditioner matrix:
>>> Mat Object: 1024 MPI processes
>>>  type: shell
>>>  rows=1369307136, cols=1369307136
>>> Mat Object: 1024 MPI processes
>>>  type: mpiaij
>>>  rows=1369307136, cols=1369307136
>>>  total: nonzeros=19896719360, allocated nonzeros=19896719360



Re: [petsc-dev] F90 code with PGI (Titan)

2018-06-29 Thread Smith, Barry F.



> On Jun 29, 2018, at 1:10 PM, Satish Balay  wrote:
> 
> https://bitbucket.org/petsc/petsc/commits/16d0e248c69a6e6dc72c61578459093d2bcb#Lsrc/snes/examples/tutorials/ex73f90t.F90T744
> 
> -!  requires: !single
> +!  requires: !single !libpgf90
> 
> This example is marked as incompatible with pgf90.

  True, but it does not explain all the warning/error messages below. Something 
is definitely funky about this version of the PGI Fortran compiler.

   Barry

> 
> However - if one invokes 'make ex73f90t' - this check is not enforced.
> 
> Satish
> 
> 
> On Fri, 29 Jun 2018, Mark Adams wrote:
> 
>> We are having problems compiling with PGI on Titan (pgf90 18.4-0 64-bit
>> target on x86-64 Linux -tp bulldozer-64). Any idea what is wrong here?
>> 
>> Thanks,
>> 
>> adams@titan-ext5:~/petsc/src/snes/examples/tutorials> make ex73f90t
>> ftn -c -fast  -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
>> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/include
>> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include-o ex73f90t.o ex73f90t.F90
>> PGF90-S-0155-Could not resolve generic procedure dmdacreate2d
>> (ex73f90t.F90: 155)
>> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 179)
>> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 186)
>> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 186)
>> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 190)
>> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 190)
>> PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 201)
>> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 201)
>> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 209)
>> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 209)
>> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 213)
>> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 213)
>> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 217)
>> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
>> 217)
>> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 221)
>> PGF90-S-0155-Could not resolve generic procedure matsetvalues
>> (ex73f90t.F90: 243)
>> PGF90-S-0155-Could not resolve generic procedure matsetvalues
>> (ex73f90t.F90: 246)
>> PGF90-S-0155-Could not resolve generic procedure matsetvalues
>> (ex73f90t.F90: 260)
>> PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 273)
>> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
>> 273)
>> PGF90-S-0450-Argument number 2 to snesgetiterationnumber: kind mismatch
>> (ex73f90t.F90: 340)
>>  0 inform,   0 warnings,  21 severes, 0 fatal for main
>> PGF90-S-0155-Could not resolve generic procedure vecsetvalues
>> (ex73f90t.F90: 473)
>>  0 inform,   0 warnings,   1 severes, 0 fatal for initialguesslocal
>> PGF90-S-0155-Could not resolve generic procedure matsetvalues
>> (ex73f90t.F90: 596)
>> PGF90-S-0155-Could not resolve generic procedure matsetvalues
>> (ex73f90t.F90: 615)
>>  0 inform,   0 warnings,   2 severes, 0 fatal for formjacobianlocal
>> PGF90-S-0285-Source line too long (ex73f90t.F90: 747)
>>  0 inform,   0 warnings,   1 severes, 0 fatal for formfunctionnlterm
>> make: [ex73f90t.o] Error 2 (ignored)
>> ftn -fast  -mp  -o ex73f90t ex73f90t.o
>> -L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/lib
>> -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
>> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
>> -lfblas -lparmetis -lmetis -lstdc++ -ldl
>> /usr/bin/ld: cannot find ex73f90t.o: No such file or directory
>> /usr/bin/sha1sum: ex73f90t: No such file or directory
>> make: [ex73f90t] Error 2 (ignored)
>> /bin/rm -f ex73f90t.o
>> adams@titan-ext5:~/petsc/src/snes/examples/tutorials> ftn -V
>> 
>> pgf90 18.4-0 64-bit target on x86-64 Linux -tp bulldozer-64
>> PGI Compilers and Tools
>> Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.
>> adams@titan-ext5:~/petsc/src/snes/examples/tutorials>
>> 
> 



Re: [petsc-dev] testing on Titan

2018-06-29 Thread Satish Balay
On Thu, 28 Jun 2018, Smith, Barry F. wrote:

> 
>   1) We need to get this PGI compiler into our regular testing. It doesn't 
> like some Fortran code the other compilers are ok with

We do have 2 variants of PGI compilers [linux, mac] in our test suite. [don't 
know if the PGI on Titan is any different]

Satish


Re: [petsc-dev] F90 code with PGI (Titan)

2018-06-29 Thread Satish Balay
https://bitbucket.org/petsc/petsc/commits/16d0e248c69a6e6dc72c61578459093d2bcb#Lsrc/snes/examples/tutorials/ex73f90t.F90T744

-!  requires: !single
+!  requires: !single !libpgf90

This example is marked as incompatible with pgf90.

However - if one invokes 'make ex73f90t' - this check is not enforced.

Satish


On Fri, 29 Jun 2018, Mark Adams wrote:

> We are having problems compiling with PGI on Titan (pgf90 18.4-0 64-bit
> target on x86-64 Linux -tp bulldozer-64). Any idea what is wrong here?
> 
> Thanks,
> 
> adams@titan-ext5:~/petsc/src/snes/examples/tutorials> make ex73f90t
> ftn -c -fast  -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/include
> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include-o ex73f90t.o ex73f90t.F90
> PGF90-S-0155-Could not resolve generic procedure dmdacreate2d
> (ex73f90t.F90: 155)
> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 179)
> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
> 186)
> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
> 186)
> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
> 190)
> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
> 190)
> PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 201)
> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 201)
> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
> 209)
> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
> 209)
> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
> 213)
> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
> 213)
> PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
> 217)
> PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
> 217)
> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 221)
> PGF90-S-0155-Could not resolve generic procedure matsetvalues
> (ex73f90t.F90: 243)
> PGF90-S-0155-Could not resolve generic procedure matsetvalues
> (ex73f90t.F90: 246)
> PGF90-S-0155-Could not resolve generic procedure matsetvalues
> (ex73f90t.F90: 260)
> PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 273)
> PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
> 273)
> PGF90-S-0450-Argument number 2 to snesgetiterationnumber: kind mismatch
> (ex73f90t.F90: 340)
>   0 inform,   0 warnings,  21 severes, 0 fatal for main
> PGF90-S-0155-Could not resolve generic procedure vecsetvalues
> (ex73f90t.F90: 473)
>   0 inform,   0 warnings,   1 severes, 0 fatal for initialguesslocal
> PGF90-S-0155-Could not resolve generic procedure matsetvalues
> (ex73f90t.F90: 596)
> PGF90-S-0155-Could not resolve generic procedure matsetvalues
> (ex73f90t.F90: 615)
>   0 inform,   0 warnings,   2 severes, 0 fatal for formjacobianlocal
> PGF90-S-0285-Source line too long (ex73f90t.F90: 747)
>   0 inform,   0 warnings,   1 severes, 0 fatal for formfunctionnlterm
> make: [ex73f90t.o] Error 2 (ignored)
> ftn -fast  -mp  -o ex73f90t ex73f90t.o
> -L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/lib
> -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
> -lfblas -lparmetis -lmetis -lstdc++ -ldl
> /usr/bin/ld: cannot find ex73f90t.o: No such file or directory
> /usr/bin/sha1sum: ex73f90t: No such file or directory
> make: [ex73f90t] Error 2 (ignored)
> /bin/rm -f ex73f90t.o
> adams@titan-ext5:~/petsc/src/snes/examples/tutorials> ftn -V
> 
> pgf90 18.4-0 64-bit target on x86-64 Linux -tp bulldozer-64
> PGI Compilers and Tools
> Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.
> adams@titan-ext5:~/petsc/src/snes/examples/tutorials>
> 



Re: [petsc-dev] VTK viewer design question

2018-06-29 Thread Jed Brown
"Smith, Barry F."  writes:

>> On Jun 29, 2018, at 10:15 AM, Jed Brown  wrote:
>> 
>> Stefano Zampini  writes:
>> 
>>> Vec and DM classes should not be visible from Sys. This is why they are
>>> passed as PetscObject.
>>> If they were visible, builds with --with-single-library=0 would be broken.
>>> 
>>> 2018-06-29 17:06 GMT+03:00 Patrick Sanan :
>>> 
 I'm looking at the VTK viewer implementation  and I notice that
 PetscViewerVTKAddField() [1]
 accepts a parameter which, despite being called "dm", is of type
 PetscObject.
>> 
>> I would not object to moving vtkv.c into src/dm -- it isn't actually
>> usable without DM.
>
>Yeah, move it. It is based on DM and Vec 
>
>But wait. If it is moved to DM and takes a Vec argument, which knows 
> about the DM, should it have a DM argument or should it be:
>
> PetscViewerVTKAddVec(PetscViewer viewer,Vec vec,PetscErrorCode 
> (*PetscViewerVTKWriteFunction)(Vec,PetscViewer),PetscViewerVTKFieldType 
> fieldtype)

I think that's fine.  In the current interface, I think I just wanted
the interface to emphasize to the caller that the DM was significant.


Re: [petsc-dev] PetscObjectViewFromOptions() pull request 1005 1006

2018-06-29 Thread Lisandro Dalcin
On Fri, 29 Jun 2018 at 17:20, Boyce Griffith  wrote:
>
>
> FWIW, we implemented some Vanka-type multigrid smoothers for the 
> incompressible Stokes equations, and we found that ~15% of total runtime was 
> spent dealing with options on real (or at least real-ish) test problems. The 
> configure-time flag turning off the viewers dropped that down to ~1%. I don't 
> know what PCPATCH is, but it sounds like it may be doing something similar.
>

That sounds like a more realistic number.


-- 
Lisandro Dalcin

Research Scientist
Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)
Extreme Computing Research Center (ECRC)
King Abdullah University of Science and Technology (KAUST)
http://ecrc.kaust.edu.sa/

4700 King Abdullah University of Science and Technology
al-Khawarizmi Bldg (Bldg 1), Office # 0109
Thuwal 23955-6900, Kingdom of Saudi Arabia
http://www.kaust.edu.sa

Office Phone: +966 12 808-0459


Re: [petsc-dev] VTK viewer design question

2018-06-29 Thread Smith, Barry F.



> On Jun 29, 2018, at 10:15 AM, Jed Brown  wrote:
> 
> Stefano Zampini  writes:
> 
>> Vec and DM classes should not be visible from Sys. This is why they are
>> passed as PetscObject.
>> If they were visible, builds with --with-single-library=0 would be broken.
>> 
>> 2018-06-29 17:06 GMT+03:00 Patrick Sanan :
>> 
>>> I'm looking at the VTK viewer implementation  and I notice that
>>> PetscViewerVTKAddField() [1]
>>> accepts a parameter which, despite being called "dm", is of type
>>> PetscObject.
> 
> I would not object to moving vtkv.c into src/dm -- it isn't actually
> usable without DM.

   Yeah, move it. It is based on DM and Vec 

   But wait. If it is moved to DM and takes a Vec argument, which knows 
about the DM, should it have a DM argument or should it be:

PetscViewerVTKAddVec(PetscViewer viewer,Vec vec,PetscErrorCode 
(*PetscViewerVTKWriteFunction)(Vec,PetscViewer),PetscViewerVTKFieldType 
fieldtype)

> 
>>> I think that this is used, amongst other things, to ensure that vectors
>>> being queued up to be written all come from the same DM.
>>> 
>>> I'd like to relax this to only require that the vectors all come from
>>> *compatible* DMDAs, but this would require the DM API in vtkv.c.
> 
> Why?  The function is developer level and there is VecGetDM() to give
> the correct DM.  I would rather that PetscViewerVTKAddField_VTK change
> this logic to merely check for compatibility:
> 
>  if (vtk->dm) {
>if (dm != vtk->dm) 
> SETERRQ(PetscObjectComm((PetscObject)viewer),PETSC_ERR_ARG_INCOMP,"Cannot 
> write a field from more than one grid to the same VTK file");
> 
>>> My questions: is this argument of type PetscObject for any reason other
>>> than not wanting to bother including petscdm.h ? Might this be something
>>> other than a DM in some case (and in which case, why is the argument called
>>> "dm")? Am I missing a reason that I'll get into trouble eventually if I
>>> change this?
>>> 
>>> (Similar question for the "vec" argument).
>>> 
>>> [1] http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>>> Viewer/PetscViewerVTKAddField.html
>>> 
>> 
>> 
>> 
>> -- 
>> Stefano



Re: [petsc-dev] VTK viewer design question

2018-06-29 Thread Patrick Sanan
2018-06-29 17:15 GMT+02:00 Jed Brown :

> Stefano Zampini  writes:
>
> > Vec and DM classes should not be visible from Sys. This is why they are
> > passed as PetscObject.
> > If they were visible, builds with --with-single-library=0 would be broken.
> >
> > 2018-06-29 17:06 GMT+03:00 Patrick Sanan :
> >
> >> I'm looking at the VTK viewer implementation  and I notice that
> >> PetscViewerVTKAddField() [1]
> >> accepts a parameter which, despite being called "dm", is of type
> >> PetscObject.
>
> I would not object to moving vtkv.c into src/dm -- it isn't actually
> usable without DM.
>

Yeah, it seems like moving this logic somehow into the DM package (either
by moving the entire thing, or by introducing another callback?) is the
natural thing to do.

>
> >> I think that this is used, amongst other things, to ensure that vectors
> >> being queued up to be written all come from the same DM.
> >>
> >> I'd like to relax this to only require that the vectors all come from
> >> *compatible* DMDAs, but this would require the DM API in vtkv.c.
>
> Why?  The function is developer level and there is VecGetDM() to give
> the correct DM.  I would rather that PetscViewerVTKAddField_VTK change
> this logic to merely check for compatibility:
>
>   if (vtk->dm) {
> if (dm != vtk->dm) SETERRQ(PetscObjectComm((
> PetscObject)viewer),PETSC_ERR_ARG_INCOMP,"Cannot write a field from more
> than one grid to the same VTK file");
>
This is what I actually do in my proof-of-concept hack. I hackily include
petscdm.h in vtkv.c, cast the arguments back to (DM), and use
DMGetCompatibility() instead of the check for identical DMs.

Later, I do use VecGetDM() to pull out the required dof/node, but this
happens in a callback defined in grvtk.c, which is in the DM package, so no
problem.


> >> My questions: is this argument of type PetscObject for any reason other
> >> than not wanting to bother including petscdm.h ? Might this be something
> >> other than a DM in some case (and in which case, why is the argument
> called
> >> "dm")? Am I missing a reason that I'll get into trouble eventually if I
> >> change this?
> >>
> >> (Similar question for the "vec" argument).
> >>
> >> [1] http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
> >> Viewer/PetscViewerVTKAddField.html
> >>
> >
> >
> >
> > --
> > Stefano
>


Re: [petsc-dev] GAMG and custom MatMults in smoothers

2018-06-29 Thread Vaclav Hapla



> 22. 6. 2018 v 17:47, Smith, Barry F. :
> 
> 
> 
>> On Jun 22, 2018, at 5:43 AM, Pierre Jolivet  
>> wrote:
>> 
>> Hello,
>> I’m solving a system using a MATSHELL and PCGAMG.
>> The MPIAIJ Mat I’m giving to GAMG has a specific structure (inherited from 
>> the MATSHELL) I’d like to exploit during the solution phase when the 
>> smoother on the finest level is doing MatMults.
>> 
>> Is there some way to:
>> 1) decouple in -log_view the time spent in the MATSHELL MatMult and in the 
>> smoothers MatMult
> 
>   You can register a new event and then inside your MATSHELL MatMult() call 
> PetscLogEventBegin/End on your new event.
> 
>Note that the MatMult() line will still contain the time for your MatShell 
> mult, so you will need to subtract it off to get the time for your non-shell 
> matmults.

In PERMON, we sometimes have a quite complicated hierarchy of wrapped matrices 
and want to measure MatMult{,Transpose,Add,TransposeAdd} separately for 
particular ones. Think e.g. of having an additive MATCOMPOSITE wrapping a 
multiplicative MATCOMPOSITE wrapping a MATTRANSPOSE wrapping a MATAIJ. You 
want to measure this MATAIJ instance's MatMult separately, but you surely 
don't want to rewrite the implementation of MatMult_Transpose or force 
yourself to use MATSHELL just to hang the events on MatMult*.

We had a special wrapper type that just added some prefix to the events for 
the given object, but this is not nice. What about adding functionality to 
PetscLogEventBegin/End that would distinguish based on the first PetscObject's 
name or options prefix? Of course this would be optional, so as not to break 
anyone relying on the current behavior - e.g. enabled under something like 
-log_view_by_name. To me it's quite an elegant solution, working for any 
PetscObject and any event.

I can do that if I get some upvotes.

Vaclav

> 
>> 2) hardwire a specific MatMult implementation for the smoother on the finest 
>> level
> 
>   In the latest release you can use MatSetOperation() to override the normal 
> matrix-vector product with anything else you want. 
> 
>> 
>> Thanks in advance,
>> Pierre
>> 
>> PS : here is what I have right now,
>> MatMult  118 1.0 1.0740e+02 1.6 1.04e+13 1.6 1.7e+06 6.1e+05 
>> 0.0e+00 47100 90 98  0  47100 90 98  0 81953703
>> […]
>> PCSetUp2 1.0 8.6513e+00 1.0 1.01e+09 1.7 2.6e+05 4.0e+05 
>> 1.8e+02  5  0 14 10 66   5  0 14 10 68 94598
>> PCApply   14 1.0 8.0373e+01 1.1 9.06e+12 1.6 1.3e+06 6.0e+05 
>> 2.1e+01 45 87 72 78  8  45 87 72 78  8 95365211 // I’m guessing a lot of 
>> time here is being wasted in doing inefficient MatMults on the finest level 
>> but this is only speculation
>> 
>> Same code with -pc_type none -ksp_max_it 13,
>> MatMult   14 1.0 1.2936e+01 1.7 1.35e+12 1.6 2.0e+05 6.1e+05 
>> 0.0e+00 15100 78 93  0  15100 78 93  0 88202079
>> 
>> The grid itself is rather simple (two levels, extremely aggressive 
>> coarsening),
>>   type is MULTIPLICATIVE, levels=2 cycles=v
>>   KSP Object: (mg_coarse_) 1024 MPI processes
>> linear system matrix = precond matrix:
>> Mat Object: 1024 MPI processes
>>   type: mpiaij
>>   rows=775, cols=775
>>   total: nonzeros=1793, allocated nonzeros=1793
>> 
>> linear system matrix followed by preconditioner matrix:
>> Mat Object: 1024 MPI processes
>>   type: shell
>>   rows=1369307136, cols=1369307136
>> Mat Object: 1024 MPI processes
>>   type: mpiaij
>>   rows=1369307136, cols=1369307136
>>   total: nonzeros=19896719360, allocated nonzeros=19896719360
> 



[petsc-dev] F90 code with PGI (Titan)

2018-06-29 Thread Mark Adams
We are having problems compiling with PGI on Titan (pgf90 18.4-0 64-bit
target on x86-64 Linux -tp bulldozer-64). Any idea what is wrong here?

Thanks,

adams@titan-ext5:~/petsc/src/snes/examples/tutorials> make ex73f90t
ftn -c -fast  -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
-I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/include
-I/ccs/proj/env003/petscv3.9-opt64-pgi/include-o ex73f90t.o ex73f90t.F90
PGF90-S-0155-Could not resolve generic procedure dmdacreate2d
(ex73f90t.F90: 155)
PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
179)
PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
186)
PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
186)
PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
190)
PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
190)
PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
201)
PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
201)
PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
209)
PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
209)
PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
213)
PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
213)
PGF90-S-0450-Argument number 4 to matsetsizes: kind mismatch (ex73f90t.F90:
217)
PGF90-S-0450-Argument number 5 to matsetsizes: kind mismatch (ex73f90t.F90:
217)
PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
221)
PGF90-S-0155-Could not resolve generic procedure matsetvalues
(ex73f90t.F90: 243)
PGF90-S-0155-Could not resolve generic procedure matsetvalues
(ex73f90t.F90: 246)
PGF90-S-0155-Could not resolve generic procedure matsetvalues
(ex73f90t.F90: 260)
PGF90-S-0450-Argument number 2 to vecsetsizes: kind mismatch (ex73f90t.F90:
273)
PGF90-S-0450-Argument number 3 to vecsetsizes: kind mismatch (ex73f90t.F90:
273)
PGF90-S-0450-Argument number 2 to snesgetiterationnumber: kind mismatch
(ex73f90t.F90: 340)
  0 inform,   0 warnings,  21 severes, 0 fatal for main
PGF90-S-0155-Could not resolve generic procedure vecsetvalues
(ex73f90t.F90: 473)
  0 inform,   0 warnings,   1 severes, 0 fatal for initialguesslocal
PGF90-S-0155-Could not resolve generic procedure matsetvalues
(ex73f90t.F90: 596)
PGF90-S-0155-Could not resolve generic procedure matsetvalues
(ex73f90t.F90: 615)
  0 inform,   0 warnings,   2 severes, 0 fatal for formjacobianlocal
PGF90-S-0285-Source line too long (ex73f90t.F90: 747)
  0 inform,   0 warnings,   1 severes, 0 fatal for formfunctionnlterm
make: [ex73f90t.o] Error 2 (ignored)
ftn -fast  -mp  -o ex73f90t ex73f90t.o
-L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-opt64-pgi/lib
-Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
-L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
-lfblas -lparmetis -lmetis -lstdc++ -ldl
/usr/bin/ld: cannot find ex73f90t.o: No such file or directory
/usr/bin/sha1sum: ex73f90t: No such file or directory
make: [ex73f90t] Error 2 (ignored)
/bin/rm -f ex73f90t.o
adams@titan-ext5:~/petsc/src/snes/examples/tutorials> ftn -V

pgf90 18.4-0 64-bit target on x86-64 Linux -tp bulldozer-64
PGI Compilers and Tools
Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.
adams@titan-ext5:~/petsc/src/snes/examples/tutorials>


Re: [petsc-dev] PetscObjectViewFromOptions() pull request 1005 1006

2018-06-29 Thread Boyce Griffith



> On Jun 29, 2018, at 9:58 AM, Lisandro Dalcin  wrote:
> 
> On Thu, 28 Jun 2018 at 20:17, Lawrence Mitchell  wrote:
>> 
>> 
>> OK, I've done some more benchmarking now, and cooked up a very simple test 
>> case.  I just solve a tiny problem 10 million times.
>> 
>> On master, this completes on my machine in:
>> 
>> If I leave the viewers on, this takes ages:
>> 
>> (master-viewers-on) $ time taskset -c 1 ./many-ksps
>> real 0m37.07s
>> 
>> When I turn the viewers off
>> (master-viewers-off) $ time taskset -c 1 ./many-ksps
>> real 0m17.979s
>> 
>> On knepley/feature-pc-patch both are faster, with no difference between 
>> turning viewers on and turning viewers off.
>> 
>> (patch) $ time taskset -c 1 ./many-ksps
>> real 0m12.512s
>> 
>> So a 25%-30% win.
>> 
> 
> Your example is hardly representative of any actual problem you will
> ever solve with PCPATCH. Your matrix is diagonal (identity!!), and the
> same KSP object (and trivial matrix, and ILU preconditioner) is used
> over and over again (all memory will reside in cache?), and all the
> solves are just one iteration. The fact that you get an improvement in
> Matt's branch with respect to turning the viewers off smells like just C
> function call overhead; otherwise, how do you explain the difference?


FWIW, we implemented some Vanka-type multigrid smoothers for the incompressible 
Stokes equations, and we found that ~15% of total runtime was spent dealing 
with options on real (or at least real-ish) test problems. The configure-time 
flag turning off the viewers dropped that down to ~1%. I don't know what 
PCPATCH is, but it sounds like it may be doing something similar.

-- Boyce

Re: [petsc-dev] VTK viewer design question

2018-06-29 Thread Stefano Zampini
Vec and DM classes should not be visible from Sys. This is why they are
passed as PetscObject.
If they were visible, builds with --with-single-library=0 would be broken.

2018-06-29 17:06 GMT+03:00 Patrick Sanan :

> I'm looking at the VTK viewer implementation  and I notice that
> PetscViewerVTKAddField() [1]
> accepts a parameter which, despite being called "dm", is of type
> PetscObject.
>
> I think that this is used, amongst other things, to ensure that vectors
> being queued up to be written all come from the same DM.
>
> I'd like to relax this to only require that the vectors all come from
> *compatible* DMDAs, but this would require the DM API in vtkv.c.
>
> My questions: is this argument of type PetscObject for any reason other
> than not wanting to bother including petscdm.h ? Might this be something
> other than a DM in some case (and in which case, why is the argument called
> "dm")? Am I missing a reason that I'll get into trouble eventually if I
> change this?
>
> (Similar question for the "vec" argument).
>
> [1] http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
> Viewer/PetscViewerVTKAddField.html
>



-- 
Stefano


Re: [petsc-dev] testing on Titan

2018-06-29 Thread Mark Adams
Thanks all, I knew this, but it's been a while since I've run on this
filesystem setup.

On Fri, Jun 29, 2018 at 9:15 AM Seung-Hoe Ku  wrote:

> Try copying files to the scratch file system.
> Home is not available when executing a batch job. Probably the
> interactive shell too.
>
> On Fri, Jun 29, 2018 at 7:47 AM Mark Adams  wrote:
>
>> I can not even run a job here. This is from an interactive shell:
>>
>> 64-pgi ex19-batch1:~/petsc/src/snes/examples/tutorials> make
>> PETSC_DIR=/autofs/nccs-svm1_home1/adams/petsc PETSC_ARCH=arch-titan-dbg6
>> cc -o ex19.o -c -fast -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
>> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/include
>> -I/ccs/proj/env003/petscv3.9-dbg64-pgi/include`pwd`/ex19.c
>> cc -fast -mp  -o ex19  ex19.o
>> -L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib
>> -Wl,-rpath,/ccs/proj/env003/petscv3.9-dbg64-pgi/lib
>> -L/ccs/proj/env003/petscv3.9-dbg64-pgi/lib -lpetsc -lHYPRE -lflapack
>> -lfblas -lparmetis -lmetis -lstdc++ -ldl
>> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(dlimpl.o):
>> In function `PetscDLOpen':
>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:54: warning:
>> Using 'dlopen' in statically linked applications requires at runtime the
>> shared libraries from the glibc version used for linking
>> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(send.o):
>> In function `PetscOpenSocket':
>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:106:
>> warning: Using 'gethostbyname' in statically linked applications requires
>> at runtime the shared libraries from the glibc version used for linking
>> adams@titan-batch1:~/petsc/src/snes/examples/tutorials> aprun -n 1 ./ex19
>> [NID 02258] 2018-06-29 07:45:09 Exec ./ex19 failed: chdir
>> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>> file or directory
>> adams@titan-batch1:~/petsc/src/snes/examples/tutorials>
>>
>> Seung-Hoe runs here, maybe he can give me advice on what is wrong here.
>>
>>
>> On Thu, Jun 28, 2018 at 7:53 PM Smith, Barry F. 
>> wrote:
>>
>>>
>>>   1) We need to get this PGI compiler into our regular testing. It
>>> doesn't like some Fortran code the other compilers are ok with
>>>
>>>2) Mark,  likely you just need to submit the examples manually to run
>>> them. cd src/snes/examples/tutorials; make ex19  then use the batch system
>>> to submit the example
>>>
>>>Barry
>>>
>>>
>>> > On Jun 28, 2018, at 5:35 PM, Mark Adams  wrote:
>>> >
>>> > I am trying to run on Titan and I am having problems running the
>>> tests. I get this output. It says it can not find a directory, but I do an
>>> ls after the error and you can see that it is there.
>>> >
>>> > I've attached the logs.
>>> >
>>> > Any ideas?
>>> > Thanks,
>>> > Mark
>>> >
>>> > adams@titan-login6:~/petsc> make
>>> PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi PETSC_ARCH="" test
>>> > Running test examples to verify correct installation
>>> > Using PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi and PETSC_ARCH=
>>> > ***Error detected during compile or
>>> link!***
>>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> > /ccs/home/adams/petsc/src/snes/examples/tutorials ex19
>>> >
>>> *
>>> > cc -o ex19.o -c -fast -mp
>>>  -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
>>> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
>>> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include`pwd`/ex19.c
>>> > cc -fast -mp  -o ex19  ex19.o
>>> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib
>>> -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
>>> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
>>> -lfblas -lparmetis -lmetis -lstdc++ -ldl
>>> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(dlimpl.o): In
>>> function `PetscDLOpen':
>>> > /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:53: warning:
>>> Using 'dlopen' in statically linked applications requires at runtime the
>>> shared libraries from the glibc version used for linking
>>> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(send.o): In
>>> function `PetscOpenSocket':
>>> >
>>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:101:
>>> warning: Using 'gethostbyname' in statically linked applications requires
>>> at runtime the shared libraries from the glibc version used for linking
>>> > /bin/rm -f ex19.o
>>> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1
>>> MPI process
>>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> > [NID 18525] 2018-06-28 18:25:39 Exec ./ex19 failed: chdir
>>> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>>> file or directory
>>> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2
>>> MPI processes
>>> > See 

[petsc-dev] VTK viewer design question

2018-06-29 Thread Patrick Sanan
I'm looking at the VTK viewer implementation  and I notice that
PetscViewerVTKAddField() [1]
accepts a parameter which, despite being called "dm", is of type
PetscObject.

I think that this is used, amongst other things, to ensure that vectors
being queued up to be written all come from the same DM.

I'd like to relax this to only require that the vectors all come from
*compatible* DMDAs, but this would require the DM API in vtkv.c.

My questions: is this argument of type PetscObject for any reason other
than not wanting to bother including petscdm.h ? Might this be something
other than a DM in some case (and in which case, why is the argument called
"dm")? Am I missing a reason that I'll get into trouble eventually if I
change this?

(Similar question for the "vec" argument).

[1] http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/
PetscViewerVTKAddField.html


Re: [petsc-dev] PetscObjectViewFromOptions() pull request 1005 1006

2018-06-29 Thread Lisandro Dalcin
On Thu, 28 Jun 2018 at 20:17, Lawrence Mitchell  wrote:
>
>
> OK, I've done some more benchmarking now, and cooked up a very simple test 
> case.  I just solve a tiny problem 10 million times.
>
> On master, this completes on my machine in:
>
> If I leave the viewers on, this takes ages:
>
> (master-viewers-on) $ time taskset -c 1 ./many-ksps
> real 0m37.07s
>
> When I turn the viewers off
> (master-viewers-off) $ time taskset -c 1 ./many-ksps
> real 0m17.979s
>
> On knepley/feature-pc-patch both are faster, with no difference between 
> turning viewers on and turning viewers off.
>
> (patch) $ time taskset -c 1 ./many-ksps
> real 0m12.512s
>
> So a 25%-30% win.
>

Your example is hardly representative of any actual problem you will
ever solve with PCPATCH. Your matrix is diagonal (identity!!), and the
same KSP object (and trivial matrix, and ILU preconditioner) is used
over and over again (all memory will reside in cache?), and all the
solves are just one iteration. The fact that you get an improvement in
Matt's branch with respect to turning the viewers off smells like just C
function call overhead; otherwise, how do you explain the difference?


-- 
Lisandro Dalcin

Research Scientist
Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)
Extreme Computing Research Center (ECRC)
King Abdullah University of Science and Technology (KAUST)
http://ecrc.kaust.edu.sa/

4700 King Abdullah University of Science and Technology
al-Khawarizmi Bldg (Bldg 1), Office # 0109
Thuwal 23955-6900, Kingdom of Saudi Arabia
http://www.kaust.edu.sa

Office Phone: +966 12 808-0459


Re: [petsc-dev] testing on Titan

2018-06-29 Thread Seung-Hoe Ku
Try copying files to the scratch file system.
Home is not available when executing a batch job. Probably the interactive
shell too.

On Fri, Jun 29, 2018 at 7:47 AM Mark Adams  wrote:

> I can not even run a job here. This is from an interactive shell:
>
> 64-pgi ex19-batch1:~/petsc/src/snes/examples/tutorials> make
> PETSC_DIR=/autofs/nccs-svm1_home1/adams/petsc PETSC_ARCH=arch-titan-dbg6
> cc -o ex19.o -c -fast -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/include
> -I/ccs/proj/env003/petscv3.9-dbg64-pgi/include`pwd`/ex19.c
> cc -fast -mp  -o ex19  ex19.o
> -L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib
> -Wl,-rpath,/ccs/proj/env003/petscv3.9-dbg64-pgi/lib
> -L/ccs/proj/env003/petscv3.9-dbg64-pgi/lib -lpetsc -lHYPRE -lflapack
> -lfblas -lparmetis -lmetis -lstdc++ -ldl
> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(dlimpl.o):
> In function `PetscDLOpen':
> /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:54: warning:
> Using 'dlopen' in statically linked applications requires at runtime the
> shared libraries from the glibc version used for linking
> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(send.o):
> In function `PetscOpenSocket':
> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:106:
> warning: Using 'gethostbyname' in statically linked applications requires
> at runtime the shared libraries from the glibc version used for linking
> adams@titan-batch1:~/petsc/src/snes/examples/tutorials> aprun -n 1 ./ex19
> [NID 02258] 2018-06-29 07:45:09 Exec ./ex19 failed: chdir
> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
> file or directory
> adams@titan-batch1:~/petsc/src/snes/examples/tutorials>
>
> Seung-Hoe runs here, maybe he can give me advice on what is wrong here.
>
>
> On Thu, Jun 28, 2018 at 7:53 PM Smith, Barry F. 
> wrote:
>
>>
>>   1) We need to get this PGI compiler into our regular testing. It
>> doesn't like some Fortran code the other compilers are ok with
>>
>>2) Mark,  likely you just need to submit the examples manually to run
>> them. cd src/snes/examples/tutorials; make ex19  then use the batch system
>> to submit the example
>>
>>Barry
>>
>>
>> > On Jun 28, 2018, at 5:35 PM, Mark Adams  wrote:
>> >
>> > I am trying to run on Titan and I am having problems running the tests.
>> I get this output. It says it can not find a directory, but I do an ls
>> after the error and you can see that it is there.
>> >
>> > I've attached the logs.
>> >
>> > Any ideas?
>> > Thanks,
>> > Mark
>> >
>> > adams@titan-login6:~/petsc> make
>> PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi PETSC_ARCH="" test
>> > Running test examples to verify correct installation
>> > Using PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi and PETSC_ARCH=
>> > ***Error detected during compile or
>> link!***
>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> > /ccs/home/adams/petsc/src/snes/examples/tutorials ex19
>> >
>> *
>> > cc -o ex19.o -c -fast -mp
>>  -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
>> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
>> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include`pwd`/ex19.c
>> > cc -fast -mp  -o ex19  ex19.o
>> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib
>> -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
>> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
>> -lfblas -lparmetis -lmetis -lstdc++ -ldl
>> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(dlimpl.o): In
>> function `PetscDLOpen':
>> > /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:53: warning:
>> Using 'dlopen' in statically linked applications requires at runtime the
>> shared libraries from the glibc version used for linking
>> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(send.o): In
>> function `PetscOpenSocket':
>> >
>> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:101:
>> warning: Using 'gethostbyname' in statically linked applications requires
>> at runtime the shared libraries from the glibc version used for linking
>> > /bin/rm -f ex19.o
>> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1
>> MPI process
>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> > [NID 18525] 2018-06-28 18:25:39 Exec ./ex19 failed: chdir
>> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>> file or directory
>> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2
>> MPI processes
>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> > [NID 18525] 2018-06-28 18:25:46 Exec ./ex19 failed: chdir
>> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>> file or directory
>> > 1,5c1
>> > < lid velocity = 0.0016, prandtl # = 1., grashof # 

Re: [petsc-dev] testing on Titan

2018-06-29 Thread Lawrence Mitchell



> On 29 Jun 2018, at 12:46, Mark Adams  wrote:
> 
> I can not even run a job here. This is from an interactive shell:
> 
> 64-pgi ex19-batch1:~/petsc/src/snes/examples/tutorials> make 
> PETSC_DIR=/autofs/nccs-svm1_home1/adams/petsc PETSC_ARCH=arch-titan-dbg6
> cc -o ex19.o -c -fast -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include 
> -I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/include 
> -I/ccs/proj/env003/petscv3.9-dbg64-pgi/include`pwd`/ex19.c
> cc -fast -mp  -o ex19  ex19.o 
> -L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib 
> -Wl,-rpath,/ccs/proj/env003/petscv3.9-dbg64-pgi/lib 
> -L/ccs/proj/env003/petscv3.9-dbg64-pgi/lib -lpetsc -lHYPRE -lflapack -lfblas 
> -lparmetis -lmetis -lstdc++ -ldl
> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(dlimpl.o):
>  In function `PetscDLOpen':
> /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:54: warning: Using 
> 'dlopen' in statically linked applications requires at runtime the shared 
> libraries from the glibc version used for linking
> /autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(send.o):
>  In function `PetscOpenSocket':
> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:106:
>  warning: Using 'gethostbyname' in statically linked applications requires at 
> runtime the shared libraries from the glibc version used for linking
> adams@titan-batch1:~/petsc/src/snes/examples/tutorials> aprun -n 1 ./ex19
> [NID 02258] 2018-06-29 07:45:09 Exec ./ex19 failed: chdir 
> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such file 
> or directory
> adams@titan-batch1:~/petsc/src/snes/examples/tutorials> 

Many Crays, and I think Titan is one (per 
https://www.olcf.ornl.gov/for-users/system-user-guides/titan/running-jobs/#filesystems-available-to-compute-nodes),
do not mount all the filesystems on the compute nodes.

So plausibly you need to run from a "work" directory.

Lawrence
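
[A hedged sketch of doing that on Titan, following Lawrence's and Seung-Hoe's
advice: copy the executable to the Lustre "work" area and run it from there.
The project ID ABC123 and the paths are placeholders:

   #!/bin/bash
   #PBS -A ABC123
   #PBS -l walltime=00:10:00,nodes=1

   cd $MEMBERWORK/abc123                             # Lustre scratch, mounted on compute nodes
   cp $HOME/petsc/src/snes/examples/tutorials/ex19 . # binary built in $HOME
   aprun -n 1 ./ex19

The same cd/cp/aprun sequence should also work from an interactive batch
session.]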

Re: [petsc-dev] testing on Titan

2018-06-29 Thread Mark Adams
I can not even run a job here. This is from an interactive shell:

64-pgi ex19-batch1:~/petsc/src/snes/examples/tutorials> make
PETSC_DIR=/autofs/nccs-svm1_home1/adams/petsc PETSC_ARCH=arch-titan-dbg6
cc -o ex19.o -c -fast -mp   -I/autofs/nccs-svm1_home1/adams/petsc/include
-I/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/include
-I/ccs/proj/env003/petscv3.9-dbg64-pgi/include`pwd`/ex19.c
cc -fast -mp  -o ex19  ex19.o
-L/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib
-Wl,-rpath,/ccs/proj/env003/petscv3.9-dbg64-pgi/lib
-L/ccs/proj/env003/petscv3.9-dbg64-pgi/lib -lpetsc -lHYPRE -lflapack
-lfblas -lparmetis -lmetis -lstdc++ -ldl
/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(dlimpl.o):
In function `PetscDLOpen':
/autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:54: warning: Using
'dlopen' in statically linked applications requires at runtime the shared
libraries from the glibc version used for linking
/autofs/nccs-svm1_home1/adams/petsc/arch-titan-dbg64-pgi/lib/libpetsc.a(send.o):
In function `PetscOpenSocket':
/autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:106:
warning: Using 'gethostbyname' in statically linked applications requires
at runtime the shared libraries from the glibc version used for linking
adams@titan-batch1:~/petsc/src/snes/examples/tutorials> aprun -n 1 ./ex19
[NID 02258] 2018-06-29 07:45:09 Exec ./ex19 failed: chdir
/autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
file or directory
adams@titan-batch1:~/petsc/src/snes/examples/tutorials>

Seung-Hoe runs here, maybe he can give me advice on what is wrong here.


On Thu, Jun 28, 2018 at 7:53 PM Smith, Barry F.  wrote:

>
>   1) We need to get this PGI compiler into our regular testing. It doesn't
> like some Fortran code the other compilers are ok with
>
>2) Mark,  likely you just need to submit the examples manually to run
> them. cd src/snes/examples/tutorials; make ex19  then use the batch system
> to submit the example
>
>Barry
>
>
> > On Jun 28, 2018, at 5:35 PM, Mark Adams  wrote:
> >
> > I am trying to run on Titan and I am having problems running the tests.
> I get this output. It says it can not find a directory, but I do an ls
> after the error and you can see that it is there.
> >
> > I've attached the logs.
> >
> > Any ideas?
> > Thanks,
> > Mark
> >
> > adams@titan-login6:~/petsc> make
> PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi PETSC_ARCH="" test
> > Running test examples to verify correct installation
> > Using PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi and PETSC_ARCH=
> > ***Error detected during compile or
> link!***
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > /ccs/home/adams/petsc/src/snes/examples/tutorials ex19
> >
> *
> > cc -o ex19.o -c -fast -mp
>  -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include
> -I/ccs/proj/env003/petscv3.9-opt64-pgi/include`pwd`/ex19.c
> > cc -fast -mp  -o ex19  ex19.o -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib
> -Wl,-rpath,/ccs/proj/env003/petscv3.9-opt64-pgi/lib
> -L/ccs/proj/env003/petscv3.9-opt64-pgi/lib -lpetsc -lHYPRE -lflapack
> -lfblas -lparmetis -lmetis -lstdc++ -ldl
> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(dlimpl.o): In
> function `PetscDLOpen':
> > /autofs/nccs-svm1_home1/adams/petsc/src/sys/dll/dlimpl.c:53: warning:
> Using 'dlopen' in statically linked applications requires at runtime the
> shared libraries from the glibc version used for linking
> > /ccs/proj/env003/petscv3.9-opt64-pgi/lib/libpetsc.a(send.o): In function
> `PetscOpenSocket':
> >
> /autofs/nccs-svm1_home1/adams/petsc/src/sys/classes/viewer/impls/socket/send.c:101:
> warning: Using 'gethostbyname' in statically linked applications requires
> at runtime the shared libraries from the glibc version used for linking
> > /bin/rm -f ex19.o
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI
> process
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > [NID 18525] 2018-06-28 18:25:39 Exec ./ex19 failed: chdir
> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
> file or directory
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI
> processes
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > [NID 18525] 2018-06-28 18:25:46 Exec ./ex19 failed: chdir
> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
> file or directory
> > 1,5c1
> > < lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> > <   0 SNES Function norm 0.0406612
> > <   1 SNES Function norm 4.12227e-06
> > <   2 SNES Function norm 6.098e-11
> > < Number of SNES iterations = 2
> > ---
> > > [NID 18525] 2018-06-28 18:25:55 Exec ./ex19 failed: chdir
> /autofs/nccs-svm1_home1/adams/petsc/src/snes/examples/tutorials No such
>