Re: [petsc-users] [EXTERNAL] Re: Problem with PCFIELDSPLIT

2021-07-07 Thread Matthew Knepley
On Wed, Jul 7, 2021 at 2:33 PM Jorti, Zakariae  wrote:

> Hi Matt,
>
>
> Thanks for your quick reply.
>
> I have not completely understood your suggestion, could you please
> elaborate a bit more?
>
> For your convenience, here is how I am proceeding for the moment in my
> code:
>
>
> TSGetKSP(ts,&ksp);
>
> KSPGetPC(ksp,&pc);
>
> PCSetType(pc,PCFIELDSPLIT);
>
> PCFieldSplitSetDetectSaddlePoint(pc,PETSC_TRUE);
>
> PCSetUp(pc);
>
> PCFieldSplitGetSubKSP(pc, &n, &subksp);
>
> KSPGetPC(subksp[1], &(subpc[1]));
>
I do not like the two lines above. We should not have to do this.

> KSPSetOperators(subksp[1],T,T);
>
 In the line above, I want you to use a separate preconditioning matrix M
instead of T. That way, M will provide the preconditioning matrix for your
Schur complement problem.
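
A hedged sketch of that in C (J, M, T, FormIJacobian, and user are
illustrative names, not from your code):

  /* Sketch only: build a preconditioning matrix M with the same block
     layout as J, but with your approximation T in the (1,1) block in
     place of A11, and pass M as the Pmat of the outer solve (here via
     TSSetIJacobian; the RHS variant works the same way). */
  Mat J, M;
  /* ... assemble J as usual; assemble M with T in its (1,1) block ... */
  TSSetIJacobian(ts, J, M, FormIJacobian, user);  /* Amat = J, Pmat = M */
  /* PCFIELDSPLIT pulls the (1,1) block out of M for the Schur
     preconditioner, so the PCFieldSplitGetSubKSP calls go away. */

Alternatively, PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_USER, T)
names the Schur preconditioning matrix explicitly, if rebuilding M is
inconvenient.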

  Thanks,

  Matt

> KSPSetUp(subksp[1]);
>
> PetscFree(subksp);
>
> TSSolve(ts,X);
>
>
> Thank you.
>
> Best,
>
>
> Zakariae
> --
> *From:* Matthew Knepley 
> *Sent:* Wednesday, July 7, 2021 12:11:10 PM
> *To:* Jorti, Zakariae
> *Cc:* petsc-users@mcs.anl.gov; Tang, Qi; Tang, Xianzhu
> *Subject:* [EXTERNAL] Re: [petsc-users] Problem with PCFIELDSPLIT
>
> On Wed, Jul 7, 2021 at 1:51 PM Jorti, Zakariae via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hi,
>>
>>
>> I am trying to build a PCFIELDSPLIT preconditioner for a matrix
>>
>> J = [A00  A01]
>>     [A10  A11]
>>
>> that has the following shape:
>>
>>
>> M_{user}^{-1} = [I  -ksp(A00) A01] [ksp(A00)  0     ] [I              0]
>>                 [0   I           ] [0         ksp(T)] [-A10 ksp(A00)  I]
>>
>>
>> where T is a user-defined Schur complement approximation that replaces
>> the true Schur complement S:= A11 - A10 ksp(A00) A01.
>>
>>
>> I am trying to do something similar to this example (lines 41--45 and
>> 116--121):
>> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html
>>
>>
>> The problem I have is that I manage to replace S with T on a
>> separate single linear system but not for the linear systems generated by
>> my time-dependent PDE. Even if I set the preconditioner M_{user}^{-1}
>> correctly, the T matrix gets replaced by S in the preconditioner once I
>> call TSSolve.
>>
>> Do you have any suggestions how to fix this knowing that the matrix J
>> does not change over time?
>>
>> I don't like how it is done in that example for this very reason.
>
> When I want to use a custom preconditioning matrix for the Schur
> complement, I always give a preconditioning matrix M to the outer solve.
> Then PCFIELDSPLIT automatically pulls the correct block from M, (1,1) for
> the Schur complement, for that preconditioning matrix without
> extra code. Can you do this?
>
>   Thanks,
>
> Matt
>
>> Many thanks.
>>
>>
>> Best regards,
>>
>>
>> Zakariae
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] download zlib error

2021-07-07 Thread Matthew Knepley
On Wed, Jul 7, 2021 at 3:45 PM Mark Adams  wrote:

> No diffs. I added this:
>
> diff --git a/config/BuildSystem/config/packages/zlib.py
> b/config/BuildSystem/config/packages/zlib.py
> index fbf9bdf4a0..b76d362536 100644
> --- a/config/BuildSystem/config/packages/zlib.py
> +++ b/config/BuildSystem/config/packages/zlib.py
> @@ -25,6 +25,7 @@ class Configure(config.package.Package):
>  self.pushLanguage('C')
>  args.append('CC="'+self.getCompiler()+'"')
>
>  args.append('CFLAGS="'+self.updatePackageCFlags(self.getCompilerFlags())+'"')
> +args.append('LDFLAGS="'+self.getLinkerFlags()+'"')
>  args.append('prefix="'+self.installDir+'"')
>  self.popLanguage()
>  args=' '.join(args)
> lines 1-12/12 (END)
>
> but it still fails.
>

It is hard to understand how that was added. You can see that 'CC',
'CFLAGS', and 'prefix' all appear in the configure command below, but
'LDFLAGS' does not. That is difficult to reconcile.

  Thanks,

 Matt


> 15:20 jczhang/fix-kokkos-includes *=
> /gpfs/alpine/csc314/scratch/adams/petsc$ cd
> /gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
> && CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
> -I${ROCM_PATH}/include"
> prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
> ./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
> Checking for shared library support...
> Building shared library libz.so.1.2.11 with cc.
> Checking for size_t... Yes.
> Checking for off64_t... Yes.
> Checking for fseeko... Yes.
> Checking for strerror... No.
> Checking for unistd.h... Yes.
> Checking for stdarg.h... Yes.
> Checking whether to use vs[n]printf() or s[n]printf()... using
> vs[n]printf().
> Checking for vsnprintf() in stdio.h... No.
>   WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
>   can build but will be open to possible buffer-overflow security
>   vulnerabilities.
> Checking for return value of vsprintf()... Yes.
> Checking for attribute(visibility) support... Yes.
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
> test/example.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -D

Re: [petsc-users] Scaling of the Petsc Binary Viewer

2021-07-07 Thread Matthew Knepley
On Wed, Jul 7, 2021 at 3:49 PM Thibault Bridel-Bertomeu <
thibault.bridelberto...@gmail.com> wrote:

> Hi Dave,
>
> Thank you for your fast answer.
>
> To postprocess the files in python, I use the PetscBinaryIO package that
> is provided with PETSc, yes.
>
> I load the file like this :
>
> import numpy as np
> import meshio
> import PetscBinaryIO as pio
> import matplotlib as mpl
> import matplotlib.pyplot as plt
> import matplotlib.cm as cm
> mpl.use('Agg')
>
> restartname = "restart_1001.bin"
> print("Reading {} ...".format(restartname))
> io = pio.PetscBinaryIO()
> fh = open(restartname)
> objecttype = io.readObjectType(fh)
> data = None
> if objecttype == 'Vec':
>     data = io.readVec(fh)
> print("Size of data = ", data.size)
> print("Size of a single variable (4 variables) = ", data.size / 4)
> assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))
>
> Then I load the mesh (it's from Gmsh so I use the meshio package) :
>
> meshname = "ForwardFacing.msh"
> print("Reading {} ...".format(meshname))
> mesh = meshio.read(meshname)
> print("Number of vertices = ", mesh.points.shape[0])
> print("Number of cells = ", mesh.cells_dict['quad'].shape[0])
>
> From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot
> the figure.
>
> I removed the call to ...SetUseMPIIO... and it gives the same kind of data
> yes (I attached a figure of the data obtained with the binary viewer
> without MPI I/O).
>
> Maybe it's just a connectivity issue ? Maybe the way the Vec is written by
> the PETSc viewer somehow does not match the connectivity from the original
> Gmsh file but some other connectivity of the partitioned DMPlex ?
>

Yes, when you distribute the mesh, it gets permuted so that each piece is
contiguous. This happens on all meshes (DMDA, DMStag, DMPlex, DMForest).
When it is written out, it just concatenates that ordering, or what we
usually call the "global order" since it is the order of a global vector.


> If so, is there a way to get the latter ?
>

If you call


https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetUseNatural.html

before distribution, then a mapping back to the original ordering will be
saved. You can use
that mapping with a global vector and an original vector


https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html

to get a vector in the original ordering. However, you would also need to
understand how you want that output.
The ExodusII viewer uses this by default since people who use it (Blaise)
generally want that. Most people using
HDF5 (me) want the new order since it is faster. Plex ex15 and ex26 show
some manipulations using this mapping.
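
A minimal sketch of that path in C (untested here; dm, gvec, and natural
are illustrative names, and error checking is omitted):

  DM  dmDist = NULL;
  Vec natural;

  DMSetUseNatural(dm, PETSC_TRUE);            /* must precede distribution */
  DMPlexDistribute(dm, 0, NULL, &dmDist);     /* mapping is recorded here  */
  if (dmDist) {DMDestroy(&dm); dm = dmDist;}
  /* ... solve, producing a global vector gvec on the distributed dm ... */
  DMCreateGlobalVector(dm, &natural);         /* holds the reordered values */
  DMPlexGlobalToNaturalBegin(dm, gvec, natural);
  DMPlexGlobalToNaturalEnd(dm, gvec, natural);
  /* natural is now in the original (pre-distribution) ordering */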


> I know the binary viewer does not work on DMPlex,
>

You can output the Vec in native format, which will use the GlobalToNatural
reordering. It will not output the Plex,
but you will have the values in the order you expect.
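
A rough sketch of that in C (the file name is illustrative; this assumes
DMSetUseNatural() was called before distribution, as above):

  PetscViewer viewer;

  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "sol.bin", FILE_MODE_WRITE, &viewer);
  PetscViewerPushFormat(viewer, PETSC_VIEWER_NATIVE); /* natural ordering */
  VecView(X, viewer);
  PetscViewerPopFormat(viewer);
  PetscViewerDestroy(&viewer);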


> the VTK viewer yields a corrupted dataset
>

VTK is not supported. We support Paraview through the Xdmf extension to
HDF5.


> and I have issues with HDF5 viewer with MPI (see another recent thread of
> mine) ...
>

I have not been able to reproduce this yet.

  Thanks,

  Matt


> Thanks again for your help !!
>
> Thibault
>
> Le mer. 7 juil. 2021 à 20:54, Dave May  a écrit :
>
>>
>>
>> On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
>> thibault.bridelberto...@gmail.com> wrote:
>>
>>> Dear all,
>>>
>>> I have been having issues with large Vec (based on DMPLex) and massive
>>> MPI I/O  ... it looks like the data that is written by the Petsc Binary
>>> Viewer is gibberish for large meshes split on a high number of processes.
>>> For instance, I am using a mesh that has around 50 million cells, split on
>>> 1024 processors.
>>> The computation seems to run fine, the timestep computed from the data
>>> makes sense so I think internally everything is fine. But when I look at
>>> the solution (one example attached) it's noise - at this point it should
>>> show a bow shock developing on the left near the step.
>>> The piece of code I use is below for the output :
>>>
>>> call DMGetOutputSequenceNumber(dm, save_seqnum,
>>> save_seqval, ierr); CHKERRA(ierr)
>>> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr);
>>> CHKERRA(ierr)
>>> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
>>> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer,
>>> ierr); CHKERRA(ierr)
>>> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY,
>>> ierr); CHKERRA(ierr)
>>> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE,
>>> ierr); CHKERRA(ierr);
>>> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE,
>>> ierr); CHKERRA(ierr);
>>>
>>>
>>
>> Do you get the correct output if you don’t call the function above (or
>> equivalently use PETSC_FALSE)
>>
>>
>> call PetscViewerFileSetName(binViewer, trim(filen

Re: [petsc-users] Scaling of the Petsc Binary Viewer

2021-07-07 Thread Thibault Bridel-Bertomeu
Hi Dave,

Thank you for your fast answer.

To postprocess the files in python, I use the PetscBinaryIO package that is
provided with PETSc, yes.

I load the file like this :

import numpy as np
import meshio
import PetscBinaryIO as pio
import matplotlib as mpl
import matplotlib.pyplot as plt
import matplotlib.cm as cm
mpl.use('Agg')

restartname = "restart_1001.bin"
print("Reading {} ...".format(restartname))
io = pio.PetscBinaryIO()
fh = open(restartname)
objecttype = io.readObjectType(fh)
data = None
if objecttype == 'Vec':
    data = io.readVec(fh)
print("Size of data = ", data.size)
print("Size of a single variable (4 variables) = ", data.size / 4)
assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))

Then I load the mesh (it's from Gmsh so I use the meshio package) :

meshname = "ForwardFacing.msh"
print("Reading {} ...".format(meshname))
mesh = meshio.read(meshname)
print("Number of vertices = ", mesh.points.shape[0])
print("Number of cells = ", mesh.cells_dict['quad'].shape[0])

From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot
the figure.

I removed the call to ...SetUseMPIIO... and it gives the same kind of data
yes (I attached a figure of the data obtained with the binary viewer
without MPI I/O).

Maybe it's just a connectivity issue ? Maybe the way the Vec is written by
the PETSc viewer somehow does not match the connectivity from the original
Gmsh file but some other connectivity of the partitioned DMPlex ? If so, is
there a way to get the latter ? I know the binary viewer does not work on
DMPlex, the VTK viewer yields a corrupted dataset and I have issues with
HDF5 viewer with MPI (see another recent thread of mine) ...

Thanks again for your help !!

Thibault

Le mer. 7 juil. 2021 à 20:54, Dave May  a écrit :

>
>
> On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
> thibault.bridelberto...@gmail.com> wrote:
>
>> Dear all,
>>
>> I have been having issues with large Vec (based on DMPLex) and massive
>> MPI I/O  ... it looks like the data that is written by the Petsc Binary
>> Viewer is gibberish for large meshes split on a high number of processes.
>> For instance, I am using a mesh that has around 50 million cells, split on
>> 1024 processors.
>> The computation seems to run fine, the timestep computed from the data
>> makes sense so I think internally everything is fine. But when I look at
>> the solution (one example attached) it's noise - at this point it should
>> show a bow shock developing on the left near the step.
>> The piece of code I use is below for the output :
>>
>> call DMGetOutputSequenceNumber(dm, save_seqnum,
>> save_seqval, ierr); CHKERRA(ierr)
>> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr);
>> CHKERRA(ierr)
>> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
>> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer,
>> ierr); CHKERRA(ierr)
>> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY,
>> ierr); CHKERRA(ierr)
>> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE,
>> ierr); CHKERRA(ierr);
>> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE,
>> ierr); CHKERRA(ierr);
>>
>>
>
> Do you get the correct output if you don’t call the function above (or
> equivalently use PETSC_FALSE)
>
>
> call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
>> call VecView(X, binViewer, ierr); CHKERRA(ierr)
>> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
>> call DMSetOutputSequenceNumber(dm, save_seqnum,
>> save_seqval, ierr); CHKERRA(ierr)
>>
>> I do not think there is anything wrong with it but of course I would be
>> happy to hear your feedback.
>> Nonetheless my question was : how far have you tested the binary mpi i/o
>> of a Vec ? Does it make some sense that for a 50 million cell mesh split on
>> 1024 processes, it could somehow fail ?
>> Or is it my python drawing method that is completely incapable of
>> handling this dataset ? (paraview displays the same thing though so I'm not
>> sure ...)
>>
>
> Are you using the python provided tools within petsc to load the Vec from
> file?
>
>
> Thanks,
> Dave
>
>
>
>> Thank you very much for your advice and help !!!
>>
>> Thibault
>>
>


Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
I do see this:

15:43  /sw/spock/spack-envs/views/rocm-4.1.0/lib$ nm libhsa-runtime64.so |
grep -n hsa_signal_load_scacquir
349:00074de0 T hsa_signal_load_scacquire
1491:0004bef0 t
_ZN4rocr3HSA25hsa_signal_load_scacquireE12hsa_signal_s
15:43  /sw/spock/spack-envs/views/rocm-4.1.0/lib$


On Wed, Jul 7, 2021 at 3:45 PM Mark Adams  wrote:

> No diffs. I added this:
>
> diff --git a/config/BuildSystem/config/packages/zlib.py
> b/config/BuildSystem/config/packages/zlib.py
> index fbf9bdf4a0..b76d362536 100644
> --- a/config/BuildSystem/config/packages/zlib.py
> +++ b/config/BuildSystem/config/packages/zlib.py
> @@ -25,6 +25,7 @@ class Configure(config.package.Package):
>  self.pushLanguage('C')
>  args.append('CC="'+self.getCompiler()+'"')
>
>  args.append('CFLAGS="'+self.updatePackageCFlags(self.getCompilerFlags())+'"')
> +args.append('LDFLAGS="'+self.getLinkerFlags()+'"')
>  args.append('prefix="'+self.installDir+'"')
>  self.popLanguage()
>  args=' '.join(args)
> lines 1-12/12 (END)
>
> but it still fails.
>
> 15:20 jczhang/fix-kokkos-includes *=
> /gpfs/alpine/csc314/scratch/adams/petsc$ cd
> /gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
> && CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
> -I${ROCM_PATH}/include"
> prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
> ./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
> Checking for shared library support...
> Building shared library libz.so.1.2.11 with cc.
> Checking for size_t... Yes.
> Checking for off64_t... Yes.
> Checking for fseeko... Yes.
> Checking for strerror... No.
> Checking for unistd.h... Yes.
> Checking for stdarg.h... Yes.
> Checking whether to use vs[n]printf() or s[n]printf()... using
> vs[n]printf().
> Checking for vsnprintf() in stdio.h... No.
>   WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
>   can build but will be open to possible buffer-overflow security
>   vulnerabilities.
> Checking for return value of vsprintf()... Yes.
> Checking for attribute(visibility) support... Yes.
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
> test/example.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
> cc -fPIC -fstack-protector -

Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
No diffs. I added this:

diff --git a/config/BuildSystem/config/packages/zlib.py
b/config/BuildSystem/config/packages/zlib.py
index fbf9bdf4a0..b76d362536 100644
--- a/config/BuildSystem/config/packages/zlib.py
+++ b/config/BuildSystem/config/packages/zlib.py
@@ -25,6 +25,7 @@ class Configure(config.package.Package):
 self.pushLanguage('C')
 args.append('CC="'+self.getCompiler()+'"')

 args.append('CFLAGS="'+self.updatePackageCFlags(self.getCompilerFlags())+'"')
+args.append('LDFLAGS="'+self.getLinkerFlags()+'"')
 args.append('prefix="'+self.installDir+'"')
 self.popLanguage()
 args=' '.join(args)
lines 1-12/12 (END)

but it still fails.

15:20 jczhang/fix-kokkos-includes *=
/gpfs/alpine/csc314/scratch/adams/petsc$ cd
/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
&& CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
-I${ROCM_PATH}/include"
prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
Checking for shared library support...
Building shared library libz.so.1.2.11 with cc.
Checking for size_t... Yes.
Checking for off64_t... Yes.
Checking for fseeko... Yes.
Checking for strerror... No.
Checking for unistd.h... Yes.
Checking for stdarg.h... Yes.
Checking whether to use vs[n]printf() or s[n]printf()... using
vs[n]printf().
Checking for vsnprintf() in stdio.h... No.
  WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
  can build but will be open to possible buffer-overflow security
  vulnerabilities.
Checking for return value of vsprintf()... Yes.
Checking for attribute(visibility) support... Yes.
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
test/example.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzread.o gzread.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzwrite.o gzwrite.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDD

Re: [petsc-users] Scaling of the Petsc Binary Viewer

2021-07-07 Thread Dave May
On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
thibault.bridelberto...@gmail.com> wrote:

> Dear all,
>
> I have been having issues with large Vec (based on DMPLex) and massive MPI
> I/O  ... it looks like the data that is written by the Petsc Binary Viewer
> is gibberish for large meshes split on a high number of processes. For
> instance, I am using a mesh that has around 50 million cells, split on 1024
> processors.
> The computation seems to run fine, the timestep computed from the data
> makes sense so I think internally everything is fine. But when I look at
> the solution (one example attached) it's noise - at this point it should
> show a bow shock developing on the left near the step.
> The piece of code I use is below for the output :
>
> call DMGetOutputSequenceNumber(dm, save_seqnum,
> save_seqval, ierr); CHKERRA(ierr)
> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr);
> CHKERRA(ierr)
> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr);
> CHKERRA(ierr)
> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY,
> ierr); CHKERRA(ierr)
> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE,
> ierr); CHKERRA(ierr);
> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE,
> ierr); CHKERRA(ierr);
>
>

Do you get the correct output if you don’t call the function above (or
equivalently use PETSC_FALSE)


call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
> call VecView(X, binViewer, ierr); CHKERRA(ierr)
> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
> call DMSetOutputSequenceNumber(dm, save_seqnum,
> save_seqval, ierr); CHKERRA(ierr)
>
> I do not think there is anything wrong with it but of course I would be
> happy to hear your feedback.
> Nonetheless my question was : how far have you tested the binary mpi i/o
> of a Vec ? Does it make some sense that for a 50 million cell mesh split on
> 1024 processes, it could somehow fail ?
> Or is it my python drawing method that is completely incapable of handling
> this dataset ? (paraview displays the same thing though so I'm not sure ...)
>

Are you using the python provided tools within petsc to load the Vec from
file?


Thanks,
Dave



> Thank you very much for your advice and help !!!
>
> Thibault
>


[petsc-users] Scaling of the Petsc Binary Viewer

2021-07-07 Thread Thibault Bridel-Bertomeu
Dear all,

I have been having issues with large Vec (based on DMPLex) and massive MPI
I/O  ... it looks like the data that is written by the Petsc Binary Viewer
is gibberish for large meshes split on a high number of processes. For
instance, I am using a mesh that has around 50 million cells, split on 1024
processors.
The computation seems to run fine, the timestep computed from the data
makes sense so I think internally everything is fine. But when I look at
the solution (one example attached) it's noise - at this point it should
show a bow shock developing on the left near the step.
The piece of code I use is below for the output :

call DMGetOutputSequenceNumber(dm, save_seqnum,
save_seqval, ierr); CHKERRA(ierr)
call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr);
CHKERRA(ierr)
write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr);
CHKERRA(ierr)
call PetscViewerSetType(binViewer, PETSCVIEWERBINARY,
ierr); CHKERRA(ierr)
call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE,
ierr); CHKERRA(ierr);
call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE,
ierr); CHKERRA(ierr);
call PetscViewerFileSetName(binViewer, trim(filename),
ierr); CHKERRA(ierr)
call VecView(X, binViewer, ierr); CHKERRA(ierr)
call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
call DMSetOutputSequenceNumber(dm, save_seqnum,
save_seqval, ierr); CHKERRA(ierr)

I do not think there is anything wrong with it but of course I would be
happy to hear your feedback.
Nonetheless my question was : how far have you tested the binary mpi i/o of
a Vec ? Does it make some sense that for a 50 million cell mesh split on
1024 processes, it could somehow fail ?
Or is it my python drawing method that is completely incapable of handling
this dataset ? (paraview displays the same thing though so I'm not sure ...)

Thank you very much for your advice and help !!!

Thibault


Re: [petsc-users] [EXTERNAL] Re: Problem with PCFIELDSPLIT

2021-07-07 Thread Jorti, Zakariae via petsc-users
Hi Matt,


Thanks for your quick reply.

I have not completely understood your suggestion, could you please elaborate a 
bit more?

For your convenience, here is how I am proceeding for the moment in my code:


TSGetKSP(ts,&ksp);

KSPGetPC(ksp,&pc);

PCSetType(pc,PCFIELDSPLIT);

PCFieldSplitSetDetectSaddlePoint(pc,PETSC_TRUE);

PCSetUp(pc);

PCFieldSplitGetSubKSP(pc, &n, &subksp);

KSPGetPC(subksp[1], &(subpc[1]));

KSPSetOperators(subksp[1],T,T);

KSPSetUp(subksp[1]);

PetscFree(subksp);

TSSolve(ts,X);


Thank you.

Best,


Zakariae


From: Matthew Knepley 
Sent: Wednesday, July 7, 2021 12:11:10 PM
To: Jorti, Zakariae
Cc: petsc-users@mcs.anl.gov; Tang, Qi; Tang, Xianzhu
Subject: [EXTERNAL] Re: [petsc-users] Problem with PCFIELDSPLIT

On Wed, Jul 7, 2021 at 1:51 PM Jorti, Zakariae via petsc-users
<petsc-users@mcs.anl.gov> wrote:

Hi,


I am trying to build a PCFIELDSPLIT preconditioner for a matrix

J = [A00  A01]
    [A10  A11]

that has the following shape:


M_{user}^{-1} = [I  -ksp(A00) A01] [ksp(A00)  0     ] [I              0]
                [0   I           ] [0         ksp(T)] [-A10 ksp(A00)  I]


where T is a user-defined Schur complement approximation that replaces the true 
Schur complement S:= A11 - A10 ksp(A00) A01.


I am trying to do something similar to this example (lines 41--45 and 
116--121): 
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html


The problem I have is that I manage to replace S with T on a separate single 
linear system but not for the linear systems generated by my time-dependent 
PDE. Even if I set the preconditioner M_{user}^{-1} correctly, the T matrix 
gets replaced by S in the preconditioner once I call TSSolve.

Do you have any suggestions how to fix this knowing that the matrix J does not 
change over time?

I don't like how it is done in that example for this very reason.

When I want to use a custom preconditioning matrix for the Schur complement, I 
always give a preconditioning matrix M to the outer solve.
Then PCFIELDSPLIT automatically pulls the correct block from M, (1,1) for the 
Schur complement, for that preconditioning matrix without
extra code. Can you do this?

  Thanks,

Matt

Many thanks.


Best regards,


Zakariae


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] [Ext] Re: [SLEPc] Computing Smallest Eigenvalue+Eigenvector of Many Small Matrices

2021-07-07 Thread Adam Denchfield via petsc-users
syevjBatched from cuSolver is quite good once it's configured fine. It's a
direct solve for all eigenpairs, works for batches of small matrices with
sizes up to (I believe) 32x32. The default CUDA example using it works
except if you have "too many" small matrices, in which case you'll overload
the GPU memory and need to further batch the calls. I found it to be fast
enough for my needs.
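
For reference, a rough C sketch of the batched call (untested; names are
illustrative, and the cuSolver documentation is authoritative for the
exact signatures):

#include <cuda_runtime.h>
#include <cusolverDn.h>

/* d_A: nbatch column-major n x n SPD matrices, contiguous on the device;
   d_W: nbatch*n eigenvalues.  With the default sortEig setting the
   eigenvalues come back ascending per matrix, so W[i*n] and column 0 of
   the overwritten A_i form the smallest eigenpair of matrix i. */
void smallest_eigenpairs(double *d_A, double *d_W, int n, int nbatch)
{
  cusolverDnHandle_t handle;
  syevjInfo_t        params;
  double            *d_work;
  int               *d_info, lwork;

  cusolverDnCreate(&handle);
  cusolverDnCreateSyevjInfo(&params);
  cudaMalloc((void **)&d_info, nbatch * sizeof(int));
  cusolverDnDsyevjBatched_bufferSize(handle, CUSOLVER_EIG_MODE_VECTOR,
      CUBLAS_FILL_MODE_LOWER, n, d_A, n, d_W, &lwork, params, nbatch);
  cudaMalloc((void **)&d_work, lwork * sizeof(double));
  cusolverDnDsyevjBatched(handle, CUSOLVER_EIG_MODE_VECTOR,
      CUBLAS_FILL_MODE_LOWER, n, d_A, n, d_W, d_work, lwork, d_info,
      params, nbatch);
  cudaFree(d_work);
  cudaFree(d_info);
  cusolverDnDestroySyevjInfo(params);
  cusolverDnDestroy(handle);
}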

Regards,
*Adam Denchfield*

*Ph.D Student, Physics*
University of Illinois in Chicago
B.S.  Applied Physics (2018)
Illinois Institute of Technology
Email: adenc...@hawk.iit.edu


On Wed, Jul 7, 2021 at 2:31 AM Jose E. Roman  wrote:

> cuSolver has syevjBatched, which seems to fit your purpose. But I have
> never used it.
>
> Lanczos is not competitive for such small matrices.
>
> Jose
>
>
> > El 6 jul 2021, a las 21:56, Jed Brown  escribió:
> >
> > Have you tried just calling LAPACK directly? (You could try dsyevx to
> see if there's something to gain by computing less than all the
> eigenvalues.) I'm not aware of a batched interface at this time, but that's
> what you'd want for performance.
> >
> > Jacob Faibussowitsch  writes:
> >
> >> Hello PETSc/SLEPc users,
> >>
> >> Similar to a recent question I am looking for an algorithm to compute
> the smallest eigenvalue and eigenvector for a bunch of matrices however I
> have a few extra “restrictions”. All matrices have the following properties:
> >>
> >> - All matrices are the same size
> >> - All matrices are small (perhaps no larger than 12x12)
> >> - All matrices are SPD
> >> - I only need the smallest eigenpair
> >>
> >> So far my best bet seems to be Lanczos but I’m wondering if there is
> some wunder method I’ve overlooked.
> >>
> >> Best regards,
> >>
> >> Jacob Faibussowitsch
> >> (Jacob Fai - booss - oh - vitch)
>
>
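
On the LAPACK suggestion quoted above, a hedged sketch of restricting
dsyevx to the smallest eigenpair via its index range (LAPACKE interface
assumed; untested):

#include <stdlib.h>
#include <lapacke.h>

/* Smallest eigenpair of one n x n SPD matrix A (column-major, lda = n).
   lambda receives the eigenvalue, v (length n) the eigenvector;
   returns LAPACK's info. */
int smallest_eigenpair(double *A, int n, double *lambda, double *v)
{
  int  m;
  int *ifail = malloc(n * sizeof(int));
  int  info  = LAPACKE_dsyevx(LAPACK_COL_MAJOR, 'V', 'I', 'L', n, A, n,
                              0.0, 0.0,  /* vl, vu unused for range 'I' */
                              1, 1,      /* il = iu = 1: smallest only  */
                              0.0,       /* abstol <= 0: default tol    */
                              &m, lambda, v, n, ifail);
  free(ifail);
  return info;
}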


Re: [petsc-users] Problem with PCFIELDSPLIT

2021-07-07 Thread Matthew Knepley
On Wed, Jul 7, 2021 at 1:51 PM Jorti, Zakariae via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hi,
>
>
> I am trying to build a PCFIELDSPLIT preconditioner for a matrix
>
> J = [A00  A01]
>     [A10  A11]
>
> that has the following shape:
>
>
> M_{user}^{-1} = [I  -ksp(A00) A01] [ksp(A00)  0     ] [I              0]
>                 [0   I           ] [0         ksp(T)] [-A10 ksp(A00)  I]
>
>
> where T is a user-defined Schur complement approximation that replaces the
> true Schur complement S:= A11 - A10 ksp(A00) A01.
>
>
> I am trying to do something similar to this example (lines 41--45 and
> 116--121):
> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html
>
>
> The problem I have is that I manage to replace S with T on a
> separate single linear system but not for the linear systems generated by
> my time-dependent PDE. Even if I set the preconditioner M_{user}^{-1}
> correctly, the T matrix gets replaced by S in the preconditioner once I
> call TSSolve.
>
> Do you have any suggestions how to fix this knowing that the matrix J does
> not change over time?
>
> I don't like how it is done in that example for this very reason.

When I want to use a custom preconditioning matrix for the Schur
complement, I always give a preconditioning matrix M to the outer solve.
Then PCFIELDSPLIT automatically pulls the correct block from M, (1,1) for
the Schur complement, for that preconditioning matrix without
extra code. Can you do this?

  Thanks,

Matt

> Many thanks.
>
>
> Best regards,
>
>
> Zakariae
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] download zlib error

2021-07-07 Thread Matthew Knepley
On Wed, Jul 7, 2021 at 1:40 PM Mark Adams  wrote:

>
>
> On Wed, Jul 7, 2021 at 1:26 PM Barry Smith  wrote:
>
>>
>>   You will need to pass the -L arguments appropriately to zlib's
>> ./configure so it can link its shared library appropriately. That is, the
>> zlib configure requires the value obtained with
>> L'+os.environ['ROCM_PATH'],+'/lib -lhsa-runtime64',
>>
>
> It's not clear to me how to do that.  I added the -L to my configure
> script. It is not clear to me how to modify Matt's command.
>

Can you try this?

knepley/feature-orientation-rethink *$:/PETSc3/petsc/petsc-dev$ git diff
config/BuildSystem/config/packages/zlib.py
diff --git a/config/BuildSystem/config/packages/zlib.py
b/config/BuildSystem/config/packages/zlib.py
index fbf9bdf4a0a..b76d3625364 100644
--- a/config/BuildSystem/config/packages/zlib.py
+++ b/config/BuildSystem/config/packages/zlib.py
@@ -25,6 +25,7 @@ class Configure(config.package.Package):

 self.pushLanguage('C')
 args.append('CC="'+self.getCompiler()+'"')

 args.append('CFLAGS="'+self.updatePackageCFlags(self.getCompilerFlags())+'"')
+args.append('LDFLAGS="'+self.getLinkerFlags()+'"')
 args.append('prefix="'+self.installDir+'"')
 self.popLanguage()
 args=' '.join(args)

  Matt


>
>> On Jul 7, 2021, at 12:18 PM, Mark Adams  wrote:
>>
>> Well, still getting these hsa errors:
>>
>> 13:07 jczhang/fix-kokkos-includes=
>> /gpfs/alpine/csc314/scratch/adams/petsc$ !136
>> cd
>> /gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
>> && CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I${ROCM_PATH}/include"
>> prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
>> ./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
>> Checking for shared library support...
>> Building shared library libz.so.1.2.11 with cc.
>> Checking for size_t... Yes.
>> Checking for off64_t... Yes.
>> Checking for fseeko... Yes.
>> Checking for strerror... No.
>> Checking for unistd.h... Yes.
>> Checking for stdarg.h... Yes.
>> Checking whether to use vs[n]printf() or s[n]printf()... using
>> vs[n]printf().
>> Checking for vsnprintf() in stdio.h... No.
>>   WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
>>   can build but will be open to possible buffer-overflow security
>>   vulnerabilities.
>> Checking for return value of vsprintf()... Yes.
>> Checking for attribute(visibility) support... Yes.
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
>> test/example.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
>> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
>> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
>> -I/sw/sp

[petsc-users] Problem with PCFIELDSPLIT

2021-07-07 Thread Jorti, Zakariae via petsc-users
Hi,


I am trying to build a PCFIELDSPLIT preconditioner for a matrix

J = [A00  A01]
    [A10  A11]

that has the following shape:


M_{user}^{-1} = [I  -ksp(A00) A01] [ksp(A00)  0     ] [I              0]
                [0   I           ] [0         ksp(T)] [-A10 ksp(A00)  I]


where T is a user-defined Schur complement approximation that replaces the true 
Schur complement S:= A11 - A10 ksp(A00) A01.


I am trying to do something similar to this example (lines 41--45 and 
116--121): 
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html


The problem I have is that I manage to replace S with T on a separate single 
linear system but not for the linear systems generated by my time-dependent 
PDE. Even if I set the preconditioner M_{user}^{-1} correctly, the T matrix 
gets replaced by S in the preconditioner once I call TSSolve.

Do you have any suggestions how to fix this knowing that the matrix J does not 
change over time?

Many thanks.


Best regards,


Zakariae


Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
On Wed, Jul 7, 2021 at 1:26 PM Barry Smith  wrote:

>
>   You will need to pass the -L arguments appropriately to zlib's
> ./configure so it can link its shared library appropriately. That is, the
> zlib configure requires the value obtained with
> L'+os.environ['ROCM_PATH'],+'/lib -lhsa-runtime64',
>

It's not clear to me how to do that.  I added the -L to my configure
script. It is not clear to me how to modify Matt's command.


> On Jul 7, 2021, at 12:18 PM, Mark Adams  wrote:
>
> Well, still getting these hsa errors:
>
> 13:07 jczhang/fix-kokkos-includes=
> /gpfs/alpine/csc314/scratch/adams/petsc$ !136
> cd
> /gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
> && CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
> -I${ROCM_PATH}/include"
> prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
> ./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
> Checking for shared library support...
> Building shared library libz.so.1.2.11 with cc.
> Checking for size_t... Yes.
> Checking for off64_t... Yes.
> Checking for fseeko... Yes.
> Checking for strerror... No.
> Checking for unistd.h... Yes.
> Checking for stdarg.h... Yes.
> Checking whether to use vs[n]printf() or s[n]printf()... using
> vs[n]printf().
> Checking for vsnprintf() in stdio.h... No.
>   WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
>   can build but will be open to possible buffer-overflow security
>   vulnerabilities.
> Checking for return value of vsprintf()... Yes.
> Checking for attribute(visibility) support... Yes.
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
> test/example.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzread.o gzread.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzwrite.o gzwrite.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
> -DNO_STRERROR -DNO_vsnprint

Re: [petsc-users] download zlib error

2021-07-07 Thread Barry Smith

  You will need to pass the -L arguments appropriately to zlib's ./configure so 
it can link its shared library appropriately. That is, the zlib configure 
requires the value obtained with L'+os.environ['ROCM_PATH'],+'/lib 
-lhsa-runtime64', 

> On Jul 7, 2021, at 12:18 PM, Mark Adams  wrote:
> 
> Well, still getting these hsa errors:
> 
> 13:07 jczhang/fix-kokkos-includes= /gpfs/alpine/csc314/scratch/adams/petsc$ 
> !136
> cd 
> /gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
>  && CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I${ROCM_PATH}/include" 
> prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos" 
> ./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
> Checking for shared library support...
> Building shared library libz.so.1.2.11 with cc.
> Checking for size_t... Yes.
> Checking for off64_t... Yes.
> Checking for fseeko... Yes.
> Checking for strerror... No.
> Checking for unistd.h... Yes.
> Checking for stdarg.h... Yes.
> Checking whether to use vs[n]printf() or s[n]printf()... using vs[n]printf().
> Checking for vsnprintf() in stdio.h... No.
>   WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
>   can build but will be open to possible buffer-overflow security
>   vulnerabilities.
> Checking for return value of vsprintf()... Yes.
> Checking for attribute(visibility) support... Yes.
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o test/example.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzread.o gzread.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzwrite.o gzwrite.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1 
> -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o minigzip.o 
> test/minigzip.c
> cc -fPIC -fstack-protector -Qunused-arguments -g -O0 
> -I/sw/spock/spack-envs/views/rocm-4.1.

Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
Thanks, it was missing the /

'--LDFLAGS=-L'+os.environ['ROCM_PATH']+'/lib -lhsa-runtime64',

On Wed, Jul 7, 2021 at 12:48 PM Matthew Knepley  wrote:

> Did you look in /sw/spock/spack-envs/views/rocm-4.1.0lib ?
>
>   Matt
>
> On Wed, Jul 7, 2021 at 12:29 PM Mark Adams  wrote:
>
>> Ok, I tried that but now I get this error.
>>
>> On Wed, Jul 7, 2021 at 12:13 PM Stefano Zampini <
>> stefano.zamp...@gmail.com> wrote:
>>
>>> There's an extra comma
>>>
>>> Il Mer 7 Lug 2021, 18:08 Mark Adams  ha scritto:
>>>
 Humm, I get this error (I just copied your whole file into here):

 12:06 jczhang/fix-kokkos-includes=
 /gpfs/alpine/csc314/scratch/adams/petsc$ ~/arch-spock-dbg-cray-kokkos.py
 Traceback (most recent call last):
   File "/ccs/home/adams/arch-spock-dbg-cray-kokkos.py", line 27, in
 
 '--LDFLAGS=-L'+os.environ['ROCM_PATH'],+'lib -lhsa-runtime64',
 TypeError: bad operand type for unary +: 'str'

 On Wed, Jul 7, 2021 at 11:08 AM Stefano Zampini <
 stefano.zamp...@gmail.com> wrote:

> Mark
>
> On Spock, you can use
> https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py
>  as
> a template for your configuration. You need to add libraries as LDFLAGS to
> resolve the hsa symbols
>
> On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:
>
> Thanks,
>

Re: [petsc-users] download zlib error

2021-07-07 Thread Matthew Knepley
Did you look in /sw/spock/spack-envs/views/rocm-4.1.0lib ?
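
(A minimal illustration of what the question is pointing at: without the '/'
in the concatenation, the ROCm root and 'lib' fuse into a directory that does
not exist. The path is the one from this thread:)

    rocm = '/sw/spock/spack-envs/views/rocm-4.1.0'
    print('-L' + rocm + 'lib')    # -L/sw/spock/spack-envs/views/rocm-4.1.0lib  (no such dir)
    print('-L' + rocm + '/lib')   # -L/sw/spock/spack-envs/views/rocm-4.1.0/lib (intended)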

  Matt

On Wed, Jul 7, 2021 at 12:29 PM Mark Adams  wrote:

> Ok, I tried that but now I get this error.
>
> On Wed, Jul 7, 2021 at 12:13 PM Stefano Zampini 
> wrote:
>
>> There's an extra comma
>>
>> Il Mer 7 Lug 2021, 18:08 Mark Adams  ha scritto:
>>
>>> Humm, I get this error (I just copied your whole file into here):
>>>
>>> 12:06 jczhang/fix-kokkos-includes=
>>> /gpfs/alpine/csc314/scratch/adams/petsc$ ~/arch-spock-dbg-cray-kokkos.py
>>> Traceback (most recent call last):
>>>   File "/ccs/home/adams/arch-spock-dbg-cray-kokkos.py", line 27, in
>>> 
>>> '--LDFLAGS=-L'+os.environ['ROCM_PATH'],+'lib -lhsa-runtime64',
>>> TypeError: bad operand type for unary +: 'str'
>>>
>>> On Wed, Jul 7, 2021 at 11:08 AM Stefano Zampini <
>>> stefano.zamp...@gmail.com> wrote:
>>>
 Mark

 On Spock, you can use
 https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py
  as
 a template for your configuration. You need to add libraries as LDFLAGS to
 resolve the hsa symbols

 On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:

 Thanks,


Re: [petsc-users] download zlib error

2021-07-07 Thread Stefano Zampini
There's an extra comma
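
(A minimal reproduction of the traceback below: the stray comma splits the
intended single string into two expressions, and the second one starts with a
unary '+', which Python strings do not support. The path is a placeholder:)

    # intended: one string, '-L/opt/rocm/lib -lhsa-runtime64'
    ok  = '-L' + '/opt/rocm' + '/lib -lhsa-runtime64'    # fine
    bad = '-L' + '/opt/rocm', +'/lib -lhsa-runtime64'    # TypeError: bad operand type for unary +: 'str'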

Il Mer 7 Lug 2021, 18:08 Mark Adams  ha scritto:

> Humm, I get this error (I just copied your whole file into here):
>
> 12:06 jczhang/fix-kokkos-includes=
> /gpfs/alpine/csc314/scratch/adams/petsc$ ~/arch-spock-dbg-cray-kokkos.py
> Traceback (most recent call last):
>   File "/ccs/home/adams/arch-spock-dbg-cray-kokkos.py", line 27, in
> 
> '--LDFLAGS=-L'+os.environ['ROCM_PATH'],+'lib -lhsa-runtime64',
> TypeError: bad operand type for unary +: 'str'
>
> On Wed, Jul 7, 2021 at 11:08 AM Stefano Zampini 
> wrote:
>
>> Mark
>>
>> On Spock, you can use
>> https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py
>>  as
>> a template for your configuration. You need to add libraries as LDFLAGS to
>> resolve the hsa symbols
>>
>> On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:
>>
>> Thanks,
>>

Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
Humm, I get this error (I just copied your whole file into here):

12:06 jczhang/fix-kokkos-includes= /gpfs/alpine/csc314/scratch/adams/petsc$
~/arch-spock-dbg-cray-kokkos.py
Traceback (most recent call last):
  File "/ccs/home/adams/arch-spock-dbg-cray-kokkos.py", line 27, in 
'--LDFLAGS=-L'+os.environ['ROCM_PATH'],+'lib -lhsa-runtime64',
TypeError: bad operand type for unary +: 'str'

On Wed, Jul 7, 2021 at 11:08 AM Stefano Zampini 
wrote:

> Mark
>
> On Spock, you can use
> https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py 
> as
> a template for your configuration. You need to add libraries as LDFLAGS to
> resolve the hsa symbols
>
> On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:
>
> Thanks,
>

Re: [petsc-users] download zlib error

2021-07-07 Thread Matthew Knepley
Cray is idiotic! We have to add MPI and who knows what other libraries just
to compile anything?

Matt

On Wed, Jul 7, 2021 at 11:08 AM Stefano Zampini 
wrote:

> Mark
>
> On Spock, you can use
> https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py 
> as
> a template for your configuration. You need to add libraries as LDFLAGS to
> resolve the hsa symbols
>
> On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:
>
> Thanks,
>

Re: [petsc-users] download zlib error

2021-07-07 Thread Stefano Zampini
Mark

On Spock, you can use
https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-spock.py
as a template for your configuration. You need to add libraries as LDFLAGS to
resolve the hsa symbols
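
(A quick pre-flight check before re-running configure -- a sketch that assumes
the ROCm module exports ROCM_PATH and ships libhsa-runtime64.so under its lib
directory; the fallback path is only illustrative:)

    import os

    rocm = os.environ.get('ROCM_PATH', '/opt/rocm')
    lib = os.path.join(rocm, 'lib', 'libhsa-runtime64.so')  # resolves the hsa_* symbols
    print(lib, '->', 'found' if os.path.exists(lib) else 'MISSING')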

> On Jul 7, 2021, at 5:04 PM, Mark Adams  wrote:
> 
> Thanks,
> 

Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
Thanks,

08:30 jczhang/fix-kokkos-includes= /gpfs/alpine/csc314/scratch/adams/petsc$
cd
/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
&& CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
-I${ROCM_PATH}/include"
prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install
Checking for shared library support...
Building shared library libz.so.1.2.11 with cc.
Checking for size_t... Yes.
Checking for off64_t... Yes.
Checking for fseeko... Yes.
Checking for strerror... No.
Checking for unistd.h... Yes.
Checking for stdarg.h... Yes.
Checking whether to use vs[n]printf() or s[n]printf()... using
vs[n]printf().
Checking for vsnprintf() in stdio.h... No.
  WARNING: vsnprintf() not found, falling back to vsprintf(). zlib
  can build but will be open to possible buffer-overflow security
  vulnerabilities.
Checking for return value of vsprintf()... Yes.
Checking for attribute(visibility) support... Yes.
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o example.o
test/example.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o adler32.o adler32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o crc32.o crc32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o deflate.o deflate.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o infback.o infback.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inffast.o inffast.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inflate.o inflate.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o inftrees.o inftrees.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o trees.o trees.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o zutil.o zutil.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o compress.o compress.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o uncompr.o uncompr.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzclose.o gzclose.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzlib.o gzlib.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzread.o gzread.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -c -o gzwrite.o gzwrite.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -D_LARGEFILE64_SOURCE=1
-DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN -I. -c -o minigzip.o
test/minigzip.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -fPIC
-D_LARGEFILE64_SOURCE=1 -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -DPIC
-c -o objs/adler32.o adler32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -fPIC
-D_LARGEFILE64_SOURCE=1 -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -DPIC
-c -o objs/crc32.o crc32.c
cc -fPIC -fstack-protector -Qunused-arguments -g -O0
-I/sw/spock/spack-envs/views/rocm-4.1.0/include -fPIC
-D_LARGEFILE64_SOURCE=1 -DNO_STRERROR -DNO_vsnprintf -DHAVE_HIDDEN  -DPIC
-c -o objs/deflate.o de

Re: [petsc-users] CUDA running out of memory in PtAP

2021-07-07 Thread Mark Adams
I think that is a good idea. I am trying to do it myself but it is getting
messy.
Thanks,

On Wed, Jul 7, 2021 at 9:50 AM Stefano Zampini 
wrote:

> Do you want me to open an MR to handle the sequential case?
>
> On Jul 7, 2021, at 3:39 PM, Mark Adams  wrote:
>
> OK, I found where it's not protected in sequential.
>
> On Wed, Jul 7, 2021 at 9:25 AM Mark Adams  wrote:
>
>> Thanks, but that did not work.
>>
>> It looks like this is just in MPIAIJ, but I am using SeqAIJ. ex2 (below)
>> uses PETSC_COMM_SELF everywhere.
>>

Re: [petsc-users] CUDA running out of memory in PtAP

2021-07-07 Thread Stefano Zampini
Do you want me to open an MR to handle the sequential case?

> On Jul 7, 2021, at 3:39 PM, Mark Adams  wrote:
> 
> OK, I found where it's not protected in sequential.
> 
> On Wed, Jul 7, 2021 at 9:25 AM Mark Adams  wrote:
> Thanks, but that did not work. 
> 
> It looks like this is just in MPIAIJ, but I am using SeqAIJ. ex2 (below) uses 
> PETSC_COMM_SELF everywhere.
> 

Re: [petsc-users] CUDA running out of memory in PtAP

2021-07-07 Thread Mark Adams
OK, I found where it's not protected in sequential.

On Wed, Jul 7, 2021 at 9:25 AM Mark Adams  wrote:

> Thanks, but that did not work.
>
> It looks like this is just in MPIAIJ, but I am using SeqAIJ. ex2 (below)
> uses PETSC_COMM_SELF everywhere.
>

Re: [petsc-users] CUDA running out of memory in PtAP

2021-07-07 Thread Mark Adams
Thanks, but that did not work.

It looks like this is just in MPIAIJ, but I am using SeqAIJ. ex2 (below)
uses PETSC_COMM_SELF everywhere.

+ srun -G 1 -n 16 -c 1 --cpu-bind=cores --ntasks-per-core=2
/global/homes/m/madams/mps-wrapper.sh ../ex2 -dm_landau_device_type cuda
-dm_mat_type aijcusparse -dm_vec_type cuda -log_view -pc_type gamg
-ksp_type gmres -pc_gamg_reuse_interpolation -matmatmult_backend_cpu
-matptap_backend_cpu -dm_landau_ion_masses .0005,1,1,1,1,1,1,1,1
-dm_landau_ion_charges 1,2,3,4,5,6,7,8,9 -dm_landau_thermal_temps
1,1,1,1,1,1,1,1,1,1 -dm_landau_n
1.03,.5,1e-7,1e-7,1e-7,1e-7,1e-7,1e-7,1e-7,1e-7
0 starting nvidia-cuda-mps-control on cgpu17
mps ready: 2021-07-07T06:17:36-07:00
masses:e= 9.109e-31; ions in proton mass units:5.000e-04
 1.000e+00 ...
charges:   e=-1.602e-19; charges in elementary units:  1.000e+00
 2.000e+00
thermal T (K): e= 1.160e+07 i= 1.160e+07 imp= 1.160e+07. v_0= 1.326e+07
n_0= 1.000e+20 t_0= 5.787e-06 domain= 5.000e+00
CalculateE j0=0. Ec = 0.050991
0 TS dt 1. time 0.
  0) species-0: charge density= -1.6054532569865e+01 z-momentum=
-1.9059929215360e-19 energy=  2.4178543516210e+04
  0) species-1: charge density=  8.0258396545108e+00 z-momentum=
 7.0660527288120e-20 energy=  1.2082380663859e+04
  0) species-2: charge density=  6.3912608577597e-05 z-momentum=
-1.1513901010709e-24 energy=  3.5799558195524e-01
  0) species-3: charge density=  9.5868912866395e-05 z-momentum=
-1.1513901010709e-24 energy=  3.5799558195524e-01
  0) species-4: charge density=  1.2782521715519e-04 z-momentum=
-1.1513901010709e-24 energy=  3.5799558195524e-01
[7]PETSC ERROR: - Error Message
--
[7]PETSC ERROR: GPU resources unavailable
[7]PETSC ERROR: CUDA error 2 (cudaErrorMemoryAllocation) : out of memory.
Reports alloc failed; this indicates the GPU has run out resources
[7]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[7]PETSC ERROR: Petsc Development GIT revision: v3.15.1-569-g270a066c1e
 GIT Date: 2021-07-06 03:22:54 -0700
[7]PETSC ERROR: ../ex2 on a arch-cori-gpu-opt-gcc named cgpu17 by madams
Wed Jul  7 06:17:38 2021
[7]PETSC ERROR: Configure options
--with-mpi-dir=/usr/common/software/sles15_cgpu/openmpi/4.0.3/gcc
--with-cuda-dir=/usr/common/software/sles15_cgpu/cuda/11.1.1 --CFLAGS="
-g -DLANDAU_DIM=2 -DLANDAU_MAX_SPECIES=10 -DLANDAU_MAX_Q=4" --CXXFLAGS=" -g
-DLANDAU_DIM=2 -DLANDAU_MAX_SPECIES=10 -DLANDAU_MAX_Q=4" --CUDAFLAGS="-g
-Xcompiler -rdynamic -DLANDAU_DIM=2 -DLANDAU_MAX_SPECIES=10
-DLANDAU_MAX_Q=4" --FFLAGS="   -g " --COPTFLAGS="   -O3" --CXXOPTFLAGS="
-O3" --FOPTFLAGS="   -O3" --download-fblaslapack=1 --with-debugging=0
--with-mpiexec="srun -G 1" --with-cuda-gencodearch=70 --with-batch=0
--with-cuda=1 --download-p4est=1 --download-hypre=1 --with-zlib=1
PETSC_ARCH=arch-cori-gpu-opt-gcc

[7]PETSC ERROR: #1 MatProductSymbolic_SeqAIJCUSPARSE_SeqAIJCUSPARSE() at
/global/u2/m/madams/petsc/src/mat/impls/aij/seq/seqcusparse/aijcusparse.cu:2622
[7]PETSC ERROR: #2
MatProductSymbolic_ABC_Basic() at
/global/u2/m/madams/petsc/src/mat/interface/matproduct.c:1146
[7]PETSC ERROR: #3 MatProductSymbolic() at
/global/u2/m/madams/petsc/src/mat/interface/matproduct.c:799
[7]PETSC ERROR: #4 MatPtAP() at
/global/u2/m/madams/petsc/src/mat/interface/matrix.c:9626
[7]PETSC ERROR: #5 PCGAMGCreateLevel_GAMG() at
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:87
[7]PETSC ERROR: #6 PCSetUp_GAMG() at
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:663
[7]PETSC ERROR: #7 PCSetUp() at
/global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:1014
[7]PETSC ERROR: #8 KSPSetUp() at
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[7]PETSC ERROR: #9 KSPSolve_Private() at
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:850
[7]PETSC ERROR: #10 KSPSolve() at
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1084
[7]PETSC ERROR: #11 SNESSolve_NEWTONLS() at
/global/u2/m/madams/petsc/src/snes/impls/ls/ls.c:225
[7]PETSC ERROR: #12 SNESSolve() at
/global/u2/m/madams/petsc/src/snes/interface/snes.c:4769
[7]PETSC ERROR: #13 TSTheta_SNESSolve() at
/global/u2/m/madams/petsc/src/ts/impls/implicit/theta/theta.c:185
[7]PETSC ERROR: #14 TSStep_Theta() at
/global/u2/m/madams/petsc/src/ts/impls/implicit/theta/theta.c:223
[7]PETSC ERROR: #15 TSStep() at
/global/u2/m/madams/petsc/src/ts/interface/ts.c:3571
[7]PETSC ERROR: #16 TSSolve() at
/global/u2/m/madams/petsc/src/ts/interface/ts.c:3968
[7]PETSC ERROR: #17 main() at ex2.c:699
[7]PETSC ERROR: PETSc Option Table entries:
[7]PETSC ERROR: -dm_landau_amr_levels_max 0
[7]PETSC ERROR: -dm_landau_amr_post_refine 5
[7]PETSC ERROR: -dm_landau_device_type cuda
[7]PETSC ERROR: -dm_landau_domain_radius 5
[7]PETSC ERROR: -dm_landau_Ez 0
[7]PETSC ERROR: -dm_landau_ion_charges 1,2,3,4,5,6,7,8,9
[7]PETSC ERROR: -dm_landau_ion_masses .0005,1,1,1,1,1,1,1,1
[7]PETSC ERROR: -dm_landau_n

Re: [petsc-users] download zlib error

2021-07-07 Thread Matthew Knepley
It is hard to see the error. I suspect it is something crazy with the
install. Can you run the build by hand?

cd
/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos/externalpackages/zlib-1.2.11
&& CC="cc" CFLAGS="-fPIC -fstack-protector -Qunused-arguments -g -O0
-I${ROCM_PATH}/include"
prefix="/gpfs/alpine/csc314/scratch/adams/petsc/arch-spock-opt-cray-kokkos"
./configure  && /usr/bin/gmake -j8 -l307.2 &&  /usr/bin/gmake install

and see what happens, and what the error code is?

  Thanks,

 Matt

On Wed, Jul 7, 2021 at 8:48 AM Mark Adams  wrote:

> Also, this is in jczhang/fix-kokkos-includes
>
> On Wed, Jul 7, 2021 at 8:46 AM Mark Adams  wrote:
>
>> Apparently the same error with
>> --download-zlib=/yourselectedlocation/zlib-1.2.11.tar.gz


Re: [petsc-users] download zlib error

2021-07-07 Thread Mark Adams
Also, this is in jczhang/fix-kokkos-includes

On Wed, Jul 7, 2021 at 8:46 AM Mark Adams  wrote:

> Apparently the same error with
> --download-zlib=/yourselectedlocation/zlib-1.2.11.tar.gz
>
> On Tue, Jul 6, 2021 at 11:53 PM Barry Smith  wrote:
>
>> $ curl http://www.zlib.net/zlib-1.2.11.tar.gz > zlib-1.2.11.tar.gz
>>   % Total% Received % Xferd  Average Speed   TimeTime Time
>> Current
>>  Dload  Upload   Total   SpentLeft
>> Speed
>> 100  593k  100  593k0 0   835k  0 --:--:-- --:--:-- --:--:--
>> 834k
>> ~/Src/petsc* (barry/2021-07-03/demonstrate-network-parallel-build=)*
>> arch-demonstrate-network-parallel-build
>> $ tar -zxf zlib-1.2.11.tar.gz
>> ~/Src/petsc* (barry/2021-07-03/demonstrate-network-parallel-build=)*
>> arch-demonstrate-network-parallel-build
>> $ ls zlib-1.2.11
>> CMakeLists.txt  adler32.c   deflate.c   gzread.cinflate.h
>>   os400   watcom  zlib.h
>> ChangeLog   amiga   deflate.h   gzwrite.c
>> inftrees.c  qnx win32   zlib.map
>> FAQ compress.c  doc infback.c
>> inftrees.h  testzconf.h zlib.pc.cmakein
>> INDEX   configure   examplesinffast.c
>> make_vms.comtreebuild.xml   zconf.h.cmakein zlib.pc.in
>> Makefilecontrib gzclose.c   inffast.h   msdos
>> trees.c zconf.h.in  zlib2ansi
>> Makefile.in crc32.c gzguts.hinffixed.h
>> nintendods  trees.h zlib.3  zutil.c
>> README  crc32.h gzlib.c inflate.c   old
>> uncompr.c   zlib.3.pdf  zutil.h
>>
>>
>>
>> On Jul 6, 2021, at 7:57 PM, Mark Adams  wrote:
>>
>>
>>
>> On Tue, Jul 6, 2021 at 6:42 PM Barry Smith  wrote:
>>
>>>
>>>   Mark,
>>>
>>>    You can try what the configure error message should be suggesting (it
>>> is not clear whether it is being printed to your screen or not).
>>>
>>> ERROR: Unable to download package ZLIB from:
>>> http://www.zlib.net/zlib-1.2.11.tar.gz
>>
>>
>> My browser cannot open this, and I could not see a download button on
>> this site.
>>
>> Can you download this?
>>
>>
>>>
>>> * If URL specified manually - perhaps there is a typo?
>>> * If your network is disconnected - please reconnect and rerun
>>> ./configure
>>> * Or perhaps you have a firewall blocking the download
>>> * You can run with --with-packages-download-dir=/adirectory and
>>> ./configure will instruct you what packages to download manually
>>> * or you can download the above URL manually, to
>>> /yourselectedlocation/zlib-1.2.11.tar.gz
>>>   and use the configure option:
>>>   --download-zlib=/yourselectedlocation/zlib-1.2.11.tar.gz
>>>
>>>   Barry
>>>
>>>
>>> > On Jul 6, 2021, at 4:29 PM, Mark Adams  wrote:
>>> >
>>> > I am getting some sort of error building zlib on Spock at ORNL.
>>> > Other libraries are downloaded and I am sure the network is fine.
>>> > Any ideas?
>>> > Thanks,
>>> > Mark
>>> > 
>>>
>>>
>>


Re: [petsc-users] CUDA running out of memory in PtAP

2021-07-07 Thread Stefano Zampini
This will select the CPU path:

-matmatmult_backend_cpu -matptap_backend_cpu

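In code, the same effect can be had by putting the options into the options
database before the solver is set up. A minimal sketch (untested; the option
names are taken from the message above and may differ between PETSc versions):

/* force the CPU fallback for MatMatMult and MatPtAP before GAMG setup */
#include <petsc.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* equivalent to passing -matmatmult_backend_cpu -matptap_backend_cpu */
  ierr = PetscOptionsSetValue(NULL, "-matmatmult_backend_cpu", NULL); CHKERRQ(ierr);
  ierr = PetscOptionsSetValue(NULL, "-matptap_backend_cpu", NULL); CHKERRQ(ierr);
  /* ... set up TS/SNES/KSP with PCGAMG as usual; RAP now runs on the CPU ... */
  ierr = PetscFinalize();
  return ierr;
}
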
> On Jul 7, 2021, at 2:43 AM, Mark Adams  wrote:
> 
> Can I turn off using cuSparse for RAP?
> 
> On Tue, Jul 6, 2021 at 6:25 PM Barry Smith  wrote:
> 
>   Stefano has mentioned this before. He reported that cuSparse matrix-matrix
> products use a very large amount of memory.
> 
>> On Jul 6, 2021, at 4:33 PM, Mark Adams  wrote:
>> 
>> I am running out of memory in GAMG. It looks like this is from the new 
>> cuSparse RAP.
>> I was able to run Hypre with twice as much work on the GPU as this run.
>> Are there parameters to tweak for this, perhaps, or can I disable it?
>> 
>> Thanks,
>> Mark 
>> 
>>0 SNES Function norm 5.442539952302e-04 
>> [2]PETSC ERROR: - Error Message 
>> --
>> [2]PETSC ERROR: GPU resources unavailable 
>> [2]PETSC ERROR: CUDA error 2 (cudaErrorMemoryAllocation) : out of memory. 
>> Reports alloc failed; this indicates the GPU has run out resources
>> [2]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html 
>>  for trouble shooting.
>> [2]PETSC ERROR: Petsc Development GIT revision: v3.15.1-569-g270a066c1e  GIT 
>> Date: 2021-07-06 03:22:54 -0700
>> [2]PETSC ERROR: ../ex2 on a arch-cori-gpu-opt-gcc named cgpu11 by madams Tue 
>> Jul  6 13:37:43 2021
>> [2]PETSC ERROR: Configure options 
>> --with-mpi-dir=/usr/common/software/sles15_cgpu/openmpi/4.0.3/gcc 
>> --with-cuda-dir=/usr/common/software/sles15_cgpu/cuda/11.1.1 --CFLAGS="   -g 
>> -DLANDAU_DIM=2 -DLANDAU_MAX_SPECI
>> ES=10 -DLANDAU_MAX_Q=4" --CXXFLAGS=" -g -DLANDAU_DIM=2 
>> -DLANDAU_MAX_SPECIES=10 -DLANDAU_MAX_Q=4" --CUDAFLAGS="-g -Xcompiler 
>> -rdynamic -DLANDAU_DIM=2 -DLANDAU_MAX_SPECIES=10 -DLANDAU_MAX_Q=4" 
>> --FFLAGS="   -g " -
>> -COPTFLAGS="   -O3" --CXXOPTFLAGS=" -O3" --FOPTFLAGS="   -O3" 
>> --download-fblaslapack=1 --with-debugging=0 --with-mpiexec="srun -G 1" 
>> --with-cuda-gencodearch=70 --with-batch=0 --with-cuda=1 --download-p4est=1 --
>> download-hypre=1 --with-zlib=1 PETSC_ARCH=arch-cori-gpu-opt-gcc
>> [2]PETSC ERROR: #1 MatProductSymbolic_SeqAIJCUSPARSE_SeqAIJCUSPARSE() at 
>> /global/u2/m/madams/petsc/src/mat/impls/aij/seq/seqcusparse/aijcusparse.cu:2622
>>  
>> [2]PETSC ERROR: #2 MatProductSymbolic_ABC_Basic() at 
>> /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:1159
>> [2]PETSC ERROR: #3 MatProductSymbolic() at 
>> /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:799
>> [2]PETSC ERROR: #4 MatPtAP() at 
>> /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9626
>> [2]PETSC ERROR: #5 PCGAMGCreateLevel_GAMG() at 
>> /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:87
>> [2]PETSC ERROR: #6 PCSetUp_GAMG() at 
>> /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:663
>> [2]PETSC ERROR: #7 PCSetUp() at 
>> /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:1014
>> [2]PETSC ERROR: #8 KSPSetUp() at 
>> /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
>> [2]PETSC ERROR: #9 KSPSolve_Private() at 
>> /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:850
>> [2]PETSC ERROR: #10 KSPSolve() at 
>> /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1084
>> [2]PETSC ERROR: #11 SNESSolve_NEWTONLS() at 
>> /global/u2/m/madams/petsc/src/snes/impls/ls/ls.c:225
>> [2]PETSC ERROR: #12 SNESSolve() at 
>> /global/u2/m/madams/petsc/src/snes/interface/snes.c:4769
>> [2]PETSC ERROR: #13 TSTheta_SNESSolve() at 
>> /global/u2/m/madams/petsc/src/ts/impls/implicit/theta/theta.c:185
>> [2]PETSC ERROR: #14 TSStep_Theta() at 
>> /global/u2/m/madams/petsc/src/ts/impls/implicit/theta/theta.c:223
>> [2]PETSC ERROR: #15 TSStep() at 
>> /global/u2/m/madams/petsc/src/ts/interface/ts.c:3571
>> [2]PETSC ERROR: #16 TSSolve() at 
>> /global/u2/m/madams/petsc/src/ts/interface/ts.c:3968
>> [2]PETSC ERROR: #17 main() at ex2.c:699
> 



Re: [petsc-users] [SLEPc] Computing Smallest Eigenvalue+Eigenvector of Many Small Matrices

2021-07-07 Thread Jose E. Roman
cuSolver has syevjBatched, which seems to fit your purpose. But I have never 
used it.

Lanczos is not competitive for such small matrices.

Jose

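A minimal sketch (untested) of the batched cuSolver route, using the cuSolverDN
dense Jacobi eigensolver; syevjBatched is documented to require n <= 32, which
covers these 12x12 matrices:

#include <cuda_runtime.h>
#include <cusolverDn.h>

/* d_A: batch of column-major n x n SPD matrices, contiguous on the device
   (matrix i starts at d_A + (size_t)i*lda*n); d_W: batch*n eigenvalues out */
static void syevj_batched(double *d_A, double *d_W, int n, int lda, int batch)
{
  cusolverDnHandle_t handle;
  syevjInfo_t        params;
  double            *d_work;
  int               *d_info, lwork;

  cusolverDnCreate(&handle);
  cusolverDnCreateSyevjInfo(&params);

  /* query workspace, then run the batched Jacobi eigensolver */
  cusolverDnDsyevjBatched_bufferSize(handle, CUSOLVER_EIG_MODE_VECTOR,
                                     CUBLAS_FILL_MODE_LOWER, n, d_A, lda,
                                     d_W, &lwork, params, batch);
  cudaMalloc((void **)&d_work, sizeof(double) * lwork);
  cudaMalloc((void **)&d_info, sizeof(int) * batch);

  cusolverDnDsyevjBatched(handle, CUSOLVER_EIG_MODE_VECTOR,
                          CUBLAS_FILL_MODE_LOWER, n, d_A, lda,
                          d_W, d_work, lwork, d_info, params, batch);
  /* with the default ascending sort, the smallest eigenvalue of matrix i is
     d_W[i*n] and its eigenvector is column 0 of the overwritten block i */

  cudaFree(d_work); cudaFree(d_info);
  cusolverDnDestroySyevjInfo(params);
  cusolverDnDestroy(handle);
}
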

> On 6 Jul 2021, at 21:56, Jed Brown  wrote:
> 
> Have you tried just calling LAPACK directly? (You could try dsyevx to see if 
> there's something to gain by computing less than all the eigenvalues.) I'm 
> not aware of a batched interface at this time, but that's what you'd want for 
> performance.
> 
> Jacob Faibussowitsch  writes:
> 
>> Hello PETSc/SLEPc users,
>> 
>> Similar to a recent question, I am looking for an algorithm to compute the
>> smallest eigenvalue and eigenvector for a bunch of matrices; however, I have a
>> few extra “restrictions”. All matrices have the following properties:
>> 
>> - All matrices are the same size
>> - All matrices are small (perhaps no larger than 12x12)
>> - All matrices are SPD
>> - I only need the smallest eigenpair
>> 
>> So far my best bet seems to be Lanczos, but I’m wondering if there is some
>> wunder method I’ve overlooked.
>> 
>> Best regards,
>> 
>> Jacob Faibussowitsch
>> (Jacob Fai - booss - oh - vitch)
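
A minimal sketch of the direct-LAPACK route Jed suggests, one matrix at a time;
it binds the raw Fortran symbol dsyev_, assuming the usual trailing-underscore
convention (LAPACKE_dsyev is the tidier alternative where available):

#include <stdio.h>

/* Fortran LAPACK symmetric eigensolver */
void dsyev_(const char *jobz, const char *uplo, const int *n, double *a,
            const int *lda, double *w, double *work, const int *lwork,
            int *info);

/* a: column-major n x n SPD matrix (n <= 12), overwritten by eigenvectors;
   returns the smallest eigenvalue and copies its eigenvector into v */
static double smallest_eigenpair(double *a, int n, double *v)
{
  double w[12], work[64]; /* dsyev needs lwork >= 3n-1; 64 covers n <= 12 */
  int    lwork = 64, info, i;

  dsyev_("V", "U", &n, a, &n, w, work, &lwork, &info);
  if (info) { fprintf(stderr, "dsyev failed, info = %d\n", info); return -1.0; }
  for (i = 0; i < n; i++) v[i] = a[i]; /* column 0 pairs with w[0] */
  return w[0];
}

Since dsyev returns eigenvalues in ascending order, the smallest eigenpair falls
out directly; dsyevx can restrict the computation to the first eigenvalue alone,
which is what Jed's parenthetical points at.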