Re: [petsc-users] PETSc with modern C++

2017-04-03 Thread Jed Brown
Matthew Knepley writes: >> BLAS. (Here an interesting point opens: I assume an efficient BLAS >> implementation, but I am not so sure about how the different BLAS do things >> internally. I work from the assumption that we have a very well tuned BLAS

Re: [petsc-users] examples of DMPlex*FVM methods

2017-04-03 Thread Matthew Knepley
On Mon, Apr 3, 2017 at 2:50 PM, Ingo Gaertner wrote: > Dear all, > as part of my studies I would like to implement a simple finite volume CFD > solver (following the textbook by Ferziger) on an unstructured, distributed > mesh. It seems like the DMPlex class with its

Re: [petsc-users] PETSc with modern C++

2017-04-03 Thread Matthew Knepley
On Mon, Apr 3, 2017 at 11:45 AM, Filippo Leonardi wrote: > On Monday, 3 April 2017 02:00:53 CEST you wrote: > > > On Sun, Apr 2, 2017 at 2:15 PM, Filippo Leonardi > > > > > > > wrote: > > > > Hello, > > > > > > > > I have a project in mind and

Re: [petsc-users] -snes_mf_operator yields "No support for this operation for this object type" in TS codes?

2017-04-03 Thread Barry Smith
> On Apr 3, 2017, at 10:05 AM, Jed Brown wrote: > > Barry Smith writes: > >> >> SNESGetUsingInternalMatMFFD(snes,); Then you can get rid of the >> horrible >> >> PetscBool flg; >> ierr = >>

Re: [petsc-users] Question about DMDA BOUNDARY_CONDITION set

2017-04-03 Thread Barry Smith
> On Apr 3, 2017, at 1:10 PM, Wenbo Zhao wrote: > > Barry, > Hi. I am sorry for replying to you so late. > I read the code you sent me, which creates a VecScatter for ghost points on > the rotation boundary. > But I am still not clear on how to use it to assemble the

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Richard Mills
Yes, one should rely on MKL (or Cray LibSci, if using the Cray toolchain) on Cori. But I'm guessing that this will make no noticeable difference for what Justin is doing. --Richard On Mon, Apr 3, 2017 at 12:57 PM, murat keçeli wrote: > How about replacing

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread murat keçeli
How about replacing --download-fblaslapack with a vendor-specific BLAS/LAPACK? Murat On Mon, Apr 3, 2017 at 2:45 PM, Richard Mills wrote: > On Mon, Apr 3, 2017 at 12:24 PM, Zhang, Hong wrote: > >> >> On Apr 3, 2017, at 1:44 PM, Justin Chang
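For example, on a Cray system with Intel MKL available, the vendor BLAS/LAPACK can be selected at configure time along these lines. This is a sketch, not a verified recipe: the `MKLROOT` variable is assumed to be set by the site's module system, and the exact compiler-wrapper names depend on the toolchain.

```shell
# Hypothetical sketch: point PETSc's configure at MKL instead of the
# reference Fortran BLAS/LAPACK downloaded by --download-fblaslapack.
# MKLROOT is assumed to come from the site modules (e.g. `module load intel`).
./configure \
  --with-cc=cc --with-cxx=CC --with-fc=ftn \
  --with-blaslapack-dir=$MKLROOT \
  --with-debugging=0
```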

[petsc-users] examples of DMPlex*FVM methods

2017-04-03 Thread Ingo Gaertner
Dear all, as part of my studies I would like to implement a simple finite volume CFD solver (following the textbook by Ferziger) on an unstructured, distributed mesh. It seems like the DMPlex class with its DMPlex*FVM methods has prepared much of what is needed for such a CFD solver. Unfortunately

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Richard Mills
On Mon, Apr 3, 2017 at 12:24 PM, Zhang, Hong wrote: > > On Apr 3, 2017, at 1:44 PM, Justin Chang wrote: > > Richard, > > This is what my job script looks like: > > #!/bin/bash > #SBATCH -N 16 > #SBATCH -C knl,quad,flat > #SBATCH -p regular > #SBATCH -J

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Zhang, Hong
On Apr 3, 2017, at 1:44 PM, Justin Chang > wrote: Richard, This is what my job script looks like: #!/bin/bash #SBATCH -N 16 #SBATCH -C knl,quad,flat #SBATCH -p regular #SBATCH -J knlflat1024 #SBATCH -L SCRATCH #SBATCH -o knlflat1024.o%j #SBATCH

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Justin Chang
Richard, This is what my job script looks like: #!/bin/bash #SBATCH -N 16 #SBATCH -C knl,quad,flat #SBATCH -p regular #SBATCH -J knlflat1024 #SBATCH -L SCRATCH #SBATCH -o knlflat1024.o%j #SBATCH --mail-type=ALL #SBATCH --mail-user=jychan...@gmail.com #SBATCH -t 00:20:00 #run the application: cd

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Richard Mills
Fixing typo: Meant to say "Keep in mind that individual KNL cores are much less powerful than an individual Haswell *core*." --Richard On Mon, Apr 3, 2017 at 11:36 AM, Richard Mills wrote: > Hi Justin, > > How is the MCDRAM (on-package "high-bandwidth memory")

Re: [petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Richard Mills
Hi Justin, How is the MCDRAM (on-package "high-bandwidth memory") configured for your KNL runs? And if it is in "flat" mode, what are you doing to ensure that you use the MCDRAM? Doing this wrong seems to be one of the most common reasons for unexpected poor performance on KNL. I'm not that
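In flat mode the MCDRAM appears as a separate NUMA node, so the application must be bound to it explicitly or it will silently run out of slow DDR. A common way to do this is with `numactl`; the sketch below uses a hypothetical executable name and process count, and the NUMA node number is machine-specific (on Cori's KNL nodes the MCDRAM is node 1).

```shell
# Flat-mode MCDRAM usage via numactl. --preferred falls back to DDR
# when the 16 GB of MCDRAM fills up; --membind would fail allocations
# instead. Node numbers and counts here are placeholders.
srun -n 64 numactl --preferred=1 ./my_petsc_app -options_file opts
```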

Re: [petsc-users] Slepc JD and GD converge to wrong eigenpair

2017-04-03 Thread Jose E. Roman
> On 1 Apr 2017, at 0:01, Toon Weyens wrote: > > Dear Jose, > > I have saved the matrices in Matlab format and am sending them to you using > pCloud. If you want another format, please tell me. Please also note that > they are about 1.4GB each. > > I also attach

[petsc-users] Configuring PETSc for KNL

2017-04-03 Thread Justin Chang
Hi all, On NERSC's Cori I have the following configure options for PETSc: ./configure --download-fblaslapack --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-mpiexec=srun --with-64-bit-indices=1

Re: [petsc-users] Correlation between da_refine and pg_mg_levels

2017-04-03 Thread Justin Chang
So my makefile/script is slightly different from the tutorial directory. Basically I have a shell for loop that runs the 'make runex48' four times where -da_refine is increased each time. It showed Levels 1 0 then 2 1 0 because the job was in the middle of the loop, and I cancelled it halfway when
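The loop described above can be sketched as a small shell wrapper. The `runex48` target name and the increasing `-da_refine` values are from the message; the `EXTRA_OPTIONS` variable name is a hypothetical placeholder for however the makefile passes options through.

```shell
# Generate the command lines for an increasing -da_refine sweep, one
# `make runex48` invocation per refinement level, as in the loop
# described above.
run_refine_sweep() {
  for r in 1 2 3 4; do
    echo "make runex48 EXTRA_OPTIONS='-da_refine $r'"
  done
}

# To actually execute the sweep, pipe the generated commands to a shell:
#   run_refine_sweep | sh
run_refine_sweep
```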

Re: [petsc-users] PETSc with modern C++

2017-04-03 Thread Michael Povolotskyi
Hi Filippo, to recompile Petsc twice is easy. The difficulty is that in both libraries there will be the same symbols for double and double complex functions. If they were a part of a C++ namespaces, then it would be easier. Michael. On 04/03/2017 12:45 PM, Filippo Leonardi wrote: On Monday,

Re: [petsc-users] Question about DMDA BOUNDARY_CONDITION set

2017-04-03 Thread Wenbo Zhao
Barry, Hi. I am sorry for replying to you so late. I read the code you sent me, which creates a VecScatter for ghost points on the rotation boundary. But I am still not clear on how to use it to assemble the matrix. I studied the example "$SLEPC_DIR/src/eps/examples/tutorials/ex19.c", which is a 3D

Re: [petsc-users] PETSc with modern C++

2017-04-03 Thread Filippo Leonardi
On Monday, 3 April 2017 02:00:53 CEST you wrote: > On Sun, Apr 2, 2017 at 2:15 PM, Filippo Leonardi > > wrote: > > Hello, > > > > I have a project in mind and seek feedback. > > > > Disclaimer: I hope I am not abusing of this mailing list with this idea. > > If

Re: [petsc-users] -snes_mf_operator yields "No support for this operation for this object type" in TS codes?

2017-04-03 Thread Jed Brown
Barry Smith writes: >> On Apr 3, 2017, at 8:51 AM, Jed Brown wrote: >> >> Barry Smith writes: >> >>> Jed, >>> >>>Here is the problem. >>> >>> https://bitbucket.org/petsc/petsc/branch/barry/fix/even-huger-flaw-in-ts >> >>

Re: [petsc-users] -snes_mf_operator yields "No support for this operation for this object type" in TS codes?

2017-04-03 Thread Barry Smith
> On Apr 3, 2017, at 8:51 AM, Jed Brown wrote: > > Barry Smith writes: > >> Jed, >> >>Here is the problem. >> >> https://bitbucket.org/petsc/petsc/branch/barry/fix/even-huger-flaw-in-ts > > Hmm, when someone uses -snes_mf_operator, we really

Re: [petsc-users] -snes_mf_operator yields "No support for this operation for this object type" in TS codes?

2017-04-03 Thread Jed Brown
Barry Smith writes: >Jed, > > Here is the problem. > > https://bitbucket.org/petsc/petsc/branch/barry/fix/even-huger-flaw-in-ts Hmm, when someone uses -snes_mf_operator, we really just need SNESTSFormJacobian to ignore the Amat. However, the user is allowed to

Re: [petsc-users] Correlation between da_refine and pg_mg_levels

2017-04-03 Thread Jed Brown
Matthew Knepley writes: > I can't think why it would fail there, but DMDA really likes odd numbers of > vertices, because it wants > to take every other point; 129 seems good. I will see if I can reproduce > once I get a chance. This problem uses periodic boundary conditions

Re: [petsc-users] Correlation between da_refine and pg_mg_levels

2017-04-03 Thread Matthew Knepley
On Mon, Apr 3, 2017 at 6:11 AM, Jed Brown wrote: > Justin Chang writes: > > > So if I begin with a 128x128x8 grid on 1032 procs, it works fine for the > > first two levels of da_refine. However, on the third level I get this > error: > > > > Level 3

Re: [petsc-users] Correlation between da_refine and pg_mg_levels

2017-04-03 Thread Jed Brown
Justin Chang writes: > So if I begin with a 128x128x8 grid on 1032 procs, it works fine for the > first two levels of da_refine. However, on the third level I get this error: > > Level 3 domain size (m)1e+04 x1e+04 x1e+03, num elements 1024 x > 1024 x 57
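The arithmetic behind those grid sizes can be sketched as follows. With a refinement factor of 2, a periodic DMDA dimension of M points refines to 2M, while a non-periodic one refines to 2M-1 (interior vertices are shared). Assuming x and y are the periodic dimensions here, which is consistent with the sizes quoted in the error message, three refinements of 128 x 128 x 8 give exactly the 1024 x 1024 x 57 reported above; the helper name below is hypothetical.

```python
# Sketch of DMDA grid-size arithmetic under -da_refine (factor 2):
# periodic dimensions go M -> 2*M, non-periodic ones go M -> 2*M - 1.

def refine(m: int, periodic: bool) -> int:
    """Grid points after one uniform refinement of a DMDA dimension."""
    return 2 * m if periodic else 2 * m - 1

# The 128 x 128 x 8 case from the thread: x and y periodic, z not.
dims = (128, 128, 8)
periodic = (True, True, False)
for level in range(1, 4):
    dims = tuple(refine(m, p) for m, p in zip(dims, periodic))
    print(f"Level {level}: {dims[0]} x {dims[1]} x {dims[2]}")
# Level 3 prints 1024 x 1024 x 57, matching the error message above.
```

This also illustrates Matt's point about odd vertex counts: a non-periodic size of the form 2^k + 1, such as 129, coarsens cleanly (129, 65, 33, 17, ...) because taking every other point always lands on the endpoints.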

Re: [petsc-users] Correlation between da_refine and pg_mg_levels

2017-04-03 Thread Justin Chang
I fixed the KNL issue - apparently "larger" jobs need to have the executable copied into the /tmp directory to speed up loading/startup time, so I did that instead of executing the program via the makefile. On Mon, Apr 3, 2017 at 12:45 AM, Justin Chang wrote: > So if I begin with
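On Slurm systems like Cori, the usual tool for this kind of staging is `sbcast`, which broadcasts a file to node-local storage on every allocated node. A hypothetical job-script fragment (the executable name, path, and rank count are placeholders):

```shell
# Stage the executable to node-local /tmp before srun, so every rank
# loads it from local storage instead of hammering the shared
# filesystem at startup.
sbcast ./my_petsc_app /tmp/my_petsc_app
srun -n 1024 /tmp/my_petsc_app -options_file opts
```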

Re: [petsc-users] -snes_mf_operator yields "No support for this operation for this object type" in TS codes?

2017-04-03 Thread Ed Bueler
Works for my code and ts/../ex2.c ... as you probably know. Ed On Sun, Apr 2, 2017 at 9:54 PM, Barry Smith wrote: > >Jed, > > Here is the problem. > > https://bitbucket.org/petsc/petsc/branch/barry/fix/even-huger-flaw-in-ts > > > > On Apr 2, 2017, at 10:39 PM, Ed