Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Ok so, I went back and erased the old libpetsc.so.3; I think it was the one causing problems. I had --with-shared-libraries=0 and the installation complained of not having that file. I then reinstalled with --with-shared-libraries=1 and it is finally recognizing my system installation with only CUDA,

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 4:01 AM, Manuel Valera wrote: > Ok well, it turns out the $PETSC_DIR points to the testpetsc directory, > and it makes, installs, and tests without problems (only a problem on ex5f), > but trying to reconfigure in the valera/petsc directory asks me to

[petsc-users] petsc_gen_xdmf.py error

2018-03-14 Thread Matteo Semplice
Hi everybody, I am trying to follow the advice for output given in the recent thread on this list https://lists.mcs.anl.gov/pipermail/petsc-users/2018-February/034546.html At the end of each timestep in my code I do { PetscViewer outputToFile; char filename[50];
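For context, a minimal sketch of how such a per-timestep write typically continues, assuming a PETSc build configured with HDF5, a DMPlex dm, a solution Vec sol, and an int step (everything beyond the names in the message is illustrative); the resulting .h5 files are what petsc_gen_xdmf.py then converts to .xmf. Error checking is omitted for brevity:

    PetscViewer outputToFile;
    char        filename[50];

    /* one HDF5 file per timestep; petsc_gen_xdmf.py later produces the matching .xmf */
    PetscSNPrintf(filename, sizeof(filename), "solution-%04d.h5", step);
    PetscViewerHDF5Open(PETSC_COMM_WORLD, filename, FILE_MODE_WRITE, &outputToFile);
    DMView(dm, outputToFile);     /* write the DMPlex mesh */
    VecView(sol, outputToFile);   /* write the field; its dataset name comes from PetscObjectSetName */
    PetscViewerDestroy(&outputToFile);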

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Ok well, it turns out the $PETSC_DIR points to the testpetsc directory, and it makes, installs, and tests without problems (only a problem on ex5f), but trying to reconfigure in the valera/petsc directory asks me to change the $PETSC_DIR variable; meanwhile the system installation still points to the

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 3:25 AM, Manuel Valera wrote: > Yeah, that worked: > > [valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcuda -dm_mat_type > seqaijcusparse > lid velocity = 0.0625, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > [valera@node50

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Yeah, that worked: [valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcuda -dm_mat_type seqaijcusparse lid velocity = 0.0625, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 [valera@node50 tutorials]$ How do I make sure the other program refers to this installation? Using the same

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 3:19 AM, Manuel Valera wrote: > Yes, this is the system installation that is being correctly linked (the > linear solver and model are not linking the correct installation, I don't know why > yet). I configured with only CUDA this time because of the message

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Yes, this is the system installation that is being correctly linked (the linear solver and model are not linking the correct installation, I don't know why yet). I configured with only CUDA this time because of the message Karl Rupp posted on my installation thread, where he says only one type of library

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 3:12 AM, Manuel Valera wrote: > Thanks, got this error: > Did you not configure with CUSP? It looks like you have CUDA, so use -dm_vec_type seqcuda Thanks, Matt > [valera@node50 testpetsc]$ cd src/snes/examples/tutorials/ >

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Thanks, got this error: [valera@node50 testpetsc]$ cd src/snes/examples/tutorials/ [valera@node50 tutorials]$ PETSC_ARCH="" make ex19 /usr/lib64/openmpi/bin/mpicc -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O2

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 2:46 AM, Manuel Valera wrote: > Ok, let's try that. If I go to /home/valera/testpetsc/ > arch-linux2-c-opt/tests/src/snes/examples/tutorials there is runex19.sh > and a lot of other ex19 variants, but if I run that I get: >

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Ok, let's try that. If I go to /home/valera/testpetsc/arch-linux2-c-opt/tests/src/snes/examples/tutorials there is runex19.sh and a lot of other ex19 variants, but if I run that I get: [valera@node50 tutorials]$ ./runex19.sh not ok snes_tutorials-ex19_1 #

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Matthew Knepley
On Thu, Mar 15, 2018 at 2:27 AM, Manuel Valera wrote: > Ok thanks Matt, I made a smaller case with only the linear solver and a > 25x25 matrix, the error I have in this case is: > Ah, it appears that not all parts of your problem are taking the type options. If you want
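The pattern Matt is pointing at, sketched below under the assumption that the linear-solver code creates its own Mat and Vec objects (the names and size n are illustrative): every object has to go through *SetFromOptions (or an explicit *SetType) for command-line options such as -mat_type aijcusparse and -vec_type cusp to take effect. Error checking omitted:

    Mat A;
    Vec x, b;

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
    MatSetFromOptions(A);          /* honors -mat_type on the command line */
    MatSetUp(A);

    MatCreateVecs(A, &x, &b);      /* vectors get a type compatible with A */
    /* a stand-alone vector needs its own VecSetFromOptions:               */
    /* VecCreate(PETSC_COMM_WORLD, &b); VecSetSizes(b, PETSC_DECIDE, n);   */
    /* VecSetFromOptions(b);                                               */

Any Vec or Mat whose type is hard-coded (e.g. created with VecCreateSeq) will silently ignore the runtime type options, which matches the error being reported here.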

Re: [petsc-users] Tweaking my code for CUDA

2018-03-14 Thread Manuel Valera
Ok thanks Matt, I made a smaller case with only the linear solver and a 25x25 matrix, the error I have in this case is: [valera@node50 alone]$ mpirun -n 1 ./linsolve -vec_type cusp -mat_type aijcusparse laplacian.petsc ! TrivSoln loaded, size: 125 / 125 RHS loaded, size:

Re: [petsc-users] PC FieldSplit - How to Count the Iterations

2018-03-14 Thread Matthew Knepley
On Wed, Mar 14, 2018 at 11:21 PM, Sonia Pozzi wrote: > Dear Barry, > > thank you for the answer. That helped a lot. Just one more question: > I'm setting A00 to be solved with preonly+lu. I obtain the following > ksp_0 KSPGetTotalIterations: 26 > ksp_1

Re: [petsc-users] PC FieldSplit - How to Count the Iterations

2018-03-14 Thread Smith, Barry F.
Matt can likely answer this, but it is always better to try to figure things out yourself. Everything in PETSc is knowable to the user if you know the correct places to look. I would run in the debugger, put a breakpoint in KSPSolve(), then do a bt (or where) at each call to

Re: [petsc-users] Non deterministic results with MUMPS?

2018-03-14 Thread Fande Kong
We had a similar problem before with superlu_dist, but it happened only when the number of processor cores was larger than 2. Direct solvers, in our experience, often involve more messages (especially non-blocking communication). This causes different operation orders, and hence different

Re: [petsc-users] PC FieldSplit - How to Count the Iterations

2018-03-14 Thread Sonia Pozzi
Dear Barry, thank you for the answer. That helped a lot. Just one more question: I'm setting A00 to be solved with preonly+lu. I obtain the following ksp_0 KSPGetTotalIterations: 26 ksp_1 KSPGetTotalIterations: 22 Residual ksp_0: 0 Reason ksp_0: 4 Solution ksp_0 : Convergence in 1

Re: [petsc-users] PC FieldSplit - How to Count the Iterations

2018-03-14 Thread Smith, Barry F.
> On Mar 14, 2018, at 8:47 AM, Sonia Pozzi wrote: > > Dear PETSc Developers and Users, > > I’m working with PCFieldSplit preconditioner (Schur complement based > approach). > To count the number of iterations I’m taking the info from subksp_0 and > subksp_1. > I

[petsc-users] PC FieldSplit - How to Count the Iterations

2018-03-14 Thread Sonia Pozzi
Dear PETSc Developers and Users, I'm working with the PCFieldSplit preconditioner (Schur complement based approach). To count the number of iterations I'm taking the info from subksp_0 and subksp_1. I understand that the number of iterations for subksp_1 is related to the call for the solution of the
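A sketch of the counting being discussed, assuming ksp is the outer solver already set up with PCFIELDSPLIT (variable names are illustrative); KSPGetTotalIterations accumulates over all solves, so it is queried after the outer solve. Error checking omitted:

    /* ksp: the outer KSP with PCFIELDSPLIT, already set up and solved */
    PC       pc;
    KSP     *subksp;
    PetscInt nsplits, its0, its1;

    KSPGetPC(ksp, &pc);
    PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);  /* allocates subksp[]; freed below */
    KSPGetTotalIterations(subksp[0], &its0);       /* accumulated iterations of subksp_0 (A00 solves) */
    KSPGetTotalIterations(subksp[1], &its1);       /* accumulated iterations of subksp_1 (Schur solves) */
    PetscFree(subksp);

Note that with a preonly+lu sub-solver each application counts as one iteration, so the accumulated total is essentially the number of times that block solve was applied, which is consistent with the counts quoted in the follow-up message above.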

Re: [petsc-users] Get LU decomposition of a rectangular matrix

2018-03-14 Thread Smith, Barry F.
Please email the code that fails to petsc-ma...@mcs.anl.gov > On Mar 14, 2018, at 3:56 AM, Natacha BEREUX wrote: > > Thanks for your answer > In between I tried to call directly MatLUFactorSymbolic then > MatLUFactorNumeric to avoid MatGetOrdering, and the code

Re: [petsc-users] Non deterministic results with MUMPS?

2018-03-14 Thread Tim Steinhoff
I guess that the partitioning is fixed, as two results can also differ when I call two successive solves where the matrix, the rhs vector, and everything else are identical. In that case the factorization/partitioning is reused by MUMPS and only the solve phase is executed twice, which alone leads to

Re: [petsc-users] Get LU decomposition of a rectangular matrix

2018-03-14 Thread Natacha BEREUX
Thanks for your answer. In between I tried to call MatLUFactorSymbolic directly, then MatLUFactorNumeric, to avoid MatGetOrdering, and the code fails later (in the call to the SuperLU routine pdgssvx). I would prefer to use PETSc for the computation of the null basis: the input matrix is a PETSc "Mat"
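For reference, the usual explicit factorization calling sequence being discussed, sketched for a square PETSc Mat A (PETSc's LU interfaces assume a square matrix; the names and the choice of SuperLU_DIST here are illustrative, and external packages generally compute their own ordering internally). Error checking omitted:

    Mat           F;
    IS            rperm, cperm;
    MatFactorInfo info;

    MatFactorInfoInitialize(&info);
    MatGetOrdering(A, MATORDERINGNATURAL, &rperm, &cperm);
    MatGetFactor(A, MATSOLVERSUPERLU_DIST, MAT_FACTOR_LU, &F);
    MatLUFactorSymbolic(F, A, rperm, cperm, &info);   /* symbolic phase */
    MatLUFactorNumeric(F, A, &info);                  /* numeric phase; F can then be used with MatSolve */
    ISDestroy(&rperm);
    ISDestroy(&cperm);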