Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Pierre Jolivet


> On 3 Nov 2023, at 7:28 PM, Victoria Rolandi wrote:
> 
> Pierre, 
> 
> Sure, I have now installed PETSc with MUMPS and PT-SCOTCH. I got some 
> errors at the beginning, but it worked after adding 
> --COPTFLAGS="-D_POSIX_C_SOURCE=199309L" to the configuration. 
> Also, I get compilation errors when I try to use newer versions, so I have 
> kept 3.17.0 for the moment.

You should ask for assistance in getting the latest version to build.
The (Par)METIS snapshots may not have changed, but the MUMPS one did, with 
performance improvements.

> Now the parallel ordering works with PT-SCOTCH; however, is it normal that I 
> do not see any difference in performance compared to the sequential 
> ordering?

Impossible to tell without you providing actual figures (number of nonzeros, 
number of processes, timings with the sequential ordering, etc.), but 699k 
unknowns is not that big of a problem, so this is not terribly surprising.
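
(For such figures, -log_view reports the overall setup and solve timings, and
running with -mat_mumps_icntl_4 2 or higher makes MUMPS print its own analysis
and factorization statistics.)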

> Also, could the error when using Metis/ParMetis be due to the fact that my main 
> code (to which I linked PETSc) uses a different ParMetis than the one 
> separately installed by PETSc during the configuration?

Yes.

> Hence, should I configure PETSc to link against the same ParMetis library used 
> by my main code?

Yes.
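
(With PETSc's configure, that would mean replacing --download-metis
--download-parmetis with options pointing at the existing installation, e.g.
--with-metis-dir=<path> and --with-parmetis-dir=<path>, where <path> is a
placeholder, so that PETSc, MUMPS, and the main code all link against the same
METIS/ParMETIS libraries.)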

Thanks,
Pierre
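
For reference, the MUMPS ordering controls discussed in this thread can also
be set from code through PETSc's MUMPS interface instead of on the command
line. The following is a minimal, self-contained sketch, not code from this
thread: it builds a toy tridiagonal system, selects MUMPS for the LU
factorization, and requests parallel analysis through ICNTL(28)/ICNTL(29).
It assumes a PETSc build with MUMPS and at least one parallel ordering
(PT-SCOTCH or ParMETIS), and uses the PetscCall() error-checking macro of
recent PETSc releases (older releases use ierr/CHKERRQ()).

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A, F;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 100, rstart, rend;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* toy 1-D Laplacian standing in for the application matrix */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 3, NULL, 2, NULL, &A));
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    if (i > 0) PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(PCFactorSetUpMatSolverType(pc)); /* create the MUMPS factor matrix F */
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 28, 2)); /* ICNTL(28) = 2: parallel analysis */
  PetscCall(MatMumpsSetIcntl(F, 29, 0)); /* ICNTL(29) = 0: automatic choice among the
                                            available parallel orderings; see the MUMPS
                                            user guide to force a specific one */
  PetscCall(KSPSetFromOptions(ksp)); /* also pick up any command-line options */
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}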

> Thanks,
> Victoria 
> 
> On Thu, 2 Nov 2023 at 09:35, Pierre Jolivet wrote:
>> 
>>> On 2 Nov 2023, at 5:29 PM, Victoria Rolandi wrote:
>>> 
>>> Pierre, 
>>> Yes, sorry, I'll keep the list in copy.
>>> Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2), 
>>> I get an error during the analysis step. I also launched with increased 
>>> memory and I still get the error.
>> 
>> Oh, OK, that’s bad.
>> Would you be willing to give SCOTCH and/or PT-SCOTCH a try?
>> You’d need to reconfigure/recompile with --download-ptscotch (and maybe 
>> --download-bison depending on your system).
>> Then, the option would become either -mat_mumps_icntl_28 2 
>> -mat_mumps_icntl_29 2 (PT-SCOTCH) or -mat_mumps_icntl_7 3 (SCOTCH).
>> It may be worth updating PETSc as well (you are using 3.17.0, we are at 
>> 3.20.1), though I’m not sure we updated the METIS/ParMETIS snapshots since 
>> then, so it may not fix the present issue.
>> 
>> Thanks,
>> Pierre
>> 
>>> The calculation stops at:
>>> 
>>> Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
>>>   executing #MPI =  2, without OMP
>>> 
>>>  =
>>>  MUMPS compiled with option -Dmetis
>>>  MUMPS compiled with option -Dparmetis
>>>  =
>>> L U Solver for unsymmetric matrices
>>> Type of parallelism: Working host
>>> 
>>>  ** ANALYSIS STEP 
>>> 
>>>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
>>>  Using ParMETIS for parallel ordering
>>>  Structural symmetry is: 90%
>>> 
>>> 
>>> The error:
>>> 
>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
>>> probably memory access out of range
>>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
>>> [0]PETSC ERROR: or try http://valgrind.org  on 
>>> GNU/linux and Apple MacOS to find memory corruption errors
>>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and 
>>> run
>>> [0]PETSC ERROR: to get more information on the crash.
>>> [0]PETSC ERROR: - Error Message 
>>> --
>>> [0]PETSC ERROR: Signal received
>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.17.0, unknown
>>> [0]PETSC ERROR: ./charlin.exe on a  named n1056 by vrolandi Wed Nov  1 
>>> 11:38:28 2023
>>> [0]PETSC ERROR: Configure options 
>>> --prefix=/u/home/v/vrolandi/CODES/LIBRARY/packages/petsc/installationDir 
>>> --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort CXXOPTFLAGS=-O3 
>>> --with-scalar-type=complex --with-debugging=0 --with-precision=single 
>>> --download-mumps --download-scalapack --download-parmetis --download-metis
>>> 
>>> [0]PETSC ERROR: #1 User provided function() at unknown file:0
>>> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is 
>>> causing the crash.
>>> Abort(59) on node 0 (rank 0 in comm 0): application called 
>>> MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>>> 
>>> 
>>> Thanks, 
>>> Victoria 
>>> 
>>> On Wed, 1 Nov 2023 at 10:33, Pierre Jolivet wrote:
 Victoria, please keep the list in copy.
 
> I do not understand how I can switch to ParMetis if it does not appear 
> in the options of -mat_mumps_icntl_7. In the options I only have Metis and 
> not ParMetis.
 
 
 You need to use -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
 
 

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Victoria Rolandi
Pierre,

Sure, I have now installed PETSc with MUMPS and PT-SCOTCH. I got some
errors at the beginning, but it worked after adding
--COPTFLAGS="-D_POSIX_C_SOURCE=199309L"
to the configuration.
Also, I get compilation errors when I try to use newer versions, so I have
kept 3.17.0 for the moment.

Now the parallel ordering works with PT-SCOTCH; however, is it normal that
I do not see any difference in performance compared to the sequential
ordering?
Also, could the error when using Metis/ParMetis be due to the fact that my main
code (to which I linked PETSc) uses a different ParMetis than the one
separately installed by PETSc during the configuration?
Hence, should I configure PETSc to link against the same ParMetis library used
by my main code?

Thanks,
Victoria

On Thu, 2 Nov 2023 at 09:35, Pierre Jolivet wrote:

>
> On 2 Nov 2023, at 5:29 PM, Victoria Rolandi wrote:
>
> Pierre,
> Yes, sorry, I'll keep the list in copy.
> Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2),
> I get an error during the analysis step. I also launched with increased
> memory and I still get the error.
>
>
> Oh, OK, that’s bad.
> Would you be willing to give SCOTCH and/or PT-SCOTCH a try?
> You’d need to reconfigure/recompile with --download-ptscotch (and maybe
> --download-bison depending on your system).
> Then, the option would become either -mat_mumps_icntl_28 2
> -mat_mumps_icntl_29 2 (PT-SCOTCH) or -mat_mumps_icntl_7 3 (SCOTCH).
> It may be worth updating PETSc as well (you are using 3.17.0, we are at
> 3.20.1), though I’m not sure we updated the METIS/ParMETIS snapshots since
> then, so it may not fix the present issue.
>
> Thanks,
> Pierre
>
> *The calculation stops at:*
>
> Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
>   executing #MPI =  2, without OMP
>
>  =
>  MUMPS compiled with option -Dmetis
>  MUMPS compiled with option -Dparmetis
>  =
> L U Solver for unsymmetric matrices
> Type of parallelism: Working host
>
>  ** ANALYSIS STEP 
>
>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is
> distributed
>  Using ParMETIS for parallel ordering
>  Structural symmetry is: 90%
>
>
> *The error:*
>
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple MacOS
> to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
> run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.17.0, unknown
> [0]PETSC ERROR: ./charlin.exe on a  named n1056 by vrolandi Wed Nov  1
> 11:38:28 2023
> [0]PETSC ERROR: Configure options
> --prefix=/u/home/v/vrolandi/CODES/LIBRARY/packages/petsc/installationDir
> --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort CXXOPTFLAGS=-O3
> --with-scalar-type=complex --with-debugging=0 --with-precision=single
> --download-mumps --download-scalapack --download-parmetis --download-metis
>
> [0]PETSC ERROR: #1 User provided function() at unknown file:0
> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is
> causing the crash.
> Abort(59) on node 0 (rank 0 in comm 0): application called
> MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
>
> Thanks,
> Victoria
>
> On Wed, 1 Nov 2023 at 10:33, Pierre Jolivet wrote:
>
>> Victoria, please keep the list in copy.
>>
>> I do not understand how I can switch to ParMetis if it does not appear
>> in the options of -mat_mumps_icntl_7. In the options I only have Metis and
>> not ParMetis.
>>
>>
>> You need to use -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
>>
>> Barry, I don’t think we can programmatically shut off this warning; it’s
>> guarded by a bunch of KEEP() values, see src/dana_driver.F:4707, which are
>> only settable/gettable by people with access to consortium releases.
>> I’ll ask the MUMPS people for confirmation.
>> Note that this warning is only printed to screen with the option
>> -mat_mumps_icntl_4 2 (or higher), so it won’t show up for standard runs.
>>
>> Thanks,
>> Pierre
>>
>> On 1 Nov 2023, at 5:52 PM, Barry Smith  wrote:
>>
>>
>>   Pierre,
>>
>>Could the PETSc MUMPS interface "turn-off" ICNTL(6) in this situation
>> so as to not trigger the confusing warning message from MUMPS?
>>
>>   Barry
>>
>> On Nov 1, 2023, at 12:17 PM, Pierre Jolivet  wrote:
>>
>>
>>
>> On 1 Nov 2023, at 3:33 PM, Zhang, Hong via petsc-users <
>> 

Re: [petsc-users] Fieldsplit on MATNEST

2023-11-03 Thread Matthew Knepley
On Wed, Nov 1, 2023 at 9:32 PM Matthew Knepley  wrote:

> On Wed, Nov 1, 2023 at 3:59 PM Tang, Qi via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hi,
>>
>> I have a block matrix of type MATNEST, but can I call
>> -pc_fieldsplit_detect_saddle_point to change its IS? I assume it is not
>> possible.
>>
>
> The detection part will work since MatGetDiagonal() is supported, but we
> do not have code in there
> to take arbitrary blocks from MATNEST since the whole point is to get
> no-copy access. Detection is
> not needed in the case that the blocks line up.
>
>
>> Also, I notice there is a small but important typo in the new fieldsplit
>> doc:
>> https://petsc.org/release/manualpages/PC/PCFIELDSPLIT/
>> The first matrix of the full factorization misses A10. I believe it
>> should be
>> [I -ksp(A00) A01]
>> [I]
>>
>
> Yes, that is right.
>

Fixed: https://gitlab.com/petsc/petsc/-/merge_requests/6993

  Thanks,

Matt


>   Thanks,
>
>  Matt
>
>
>> Best,
>> Qi
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 
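
As a follow-up to the discussion above: when the desired splits coincide with
the blocks of the MATNEST, no detection is needed; the index sets describing
the nest layout can be obtained with MatNestGetISs() and passed directly to
PCFIELDSPLIT through PCFieldSplitSetIS(). Below is a minimal sketch of that
pattern, with toy diagonal blocks and placeholder sizes and the PetscCall()
error checking of recent PETSc releases; it is not code from this thread.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A00, A11, A, subs[4];
  IS       isg[2];
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 8, rstart, rend;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* two toy diagonal blocks standing in for the actual (0,0) and (1,1) blocks */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 1, NULL, 0, NULL, &A00));
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 1, NULL, 0, NULL, &A11));
  PetscCall(MatGetOwnershipRange(A00, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    PetscCall(MatSetValue(A00, i, i, 2.0, INSERT_VALUES));
    PetscCall(MatSetValue(A11, i, i, 3.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A00, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A00, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyBegin(A11, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A11, MAT_FINAL_ASSEMBLY));

  /* 2x2 nest; NULL off-diagonal blocks are treated as zero */
  subs[0] = A00; subs[1] = NULL; subs[2] = NULL; subs[3] = A11;
  PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, subs, &A));
  /* the ISs describing the nest layout are exactly what PCFIELDSPLIT needs */
  PetscCall(MatNestGetISs(A, isg, NULL));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(PCFieldSplitSetIS(pc, "0", isg[0]));
  PetscCall(PCFieldSplitSetIS(pc, "1", isg[1]));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(MatDestroy(&A00));
  PetscCall(MatDestroy(&A11));
  PetscCall(PetscFinalize());
  return 0;
}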


[petsc-users] Domain decomposition in PETSc for Molecular Dynamics

2023-11-03 Thread MIGUEL MOLINOS PEREZ
Dear all, 

I am currently working on the development of an in-house molecular dynamics code 
using PETSc and C++. So far the code works great; however, it is a little bit 
slow, since I am not exploiting MPI for PETSc vectors. I was wondering if there 
is a way to perform the domain decomposition efficiently using some PETSc 
functionality. Any feedback is highly appreciated.

Best regards,
Miguel
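
One PETSc facility commonly used for particle codes is DMSWARM bound to a
background DMDA: the DMDA's parallel decomposition defines which MPI rank owns
which region of space, and DMSwarmMigrate() sends particles to the rank owning
their new cell once they have moved. The sketch below is only a generic
illustration of that pattern (2-D for brevity, with placeholder grid sizes,
particle counts, and fields, and the PetscCall() error checking of recent
PETSc releases); it is not code from this thread, and whether it fits a given
MD code depends on the force computation and neighbour-list strategy.

#include <petscdmda.h>
#include <petscdmswarm.h>

int main(int argc, char **argv)
{
  DM         da, sw;
  PetscReal *coor;
  PetscInt   npoints = 64, p, bs;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* background Cartesian grid: its parallel decomposition defines which rank owns which region */
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_BOX,
                         16, 16, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, &da));
  PetscCall(DMSetUp(da));
  PetscCall(DMDASetUniformCoordinates(da, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0));

  /* particle container bound to that grid */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &sw));
  PetscCall(DMSetType(sw, DMSWARM));
  PetscCall(DMSetDimension(sw, 2));
  PetscCall(DMSwarmSetType(sw, DMSWARM_PIC));
  PetscCall(DMSwarmSetCellDM(sw, da));
  PetscCall(DMSwarmRegisterPetscDatatypeField(sw, "velocity", 2, PETSC_REAL));
  PetscCall(DMSwarmFinalizeFieldRegister(sw));
  PetscCall(DMSwarmSetLocalSizes(sw, npoints, 4));

  /* set (placeholder) particle positions in the built-in coordinate field */
  PetscCall(DMSwarmGetField(sw, DMSwarmPICField_coor, &bs, NULL, (void **)&coor));
  for (p = 0; p < npoints; p++) {
    coor[bs * p + 0] = 0.1 + 0.8 * (PetscReal)p / (PetscReal)npoints;
    coor[bs * p + 1] = 0.5;
  }
  PetscCall(DMSwarmRestoreField(sw, DMSwarmPICField_coor, &bs, NULL, (void **)&coor));

  /* after positions change each time step, send particles to the rank owning their new cell */
  PetscCall(DMSwarmMigrate(sw, PETSC_TRUE));

  PetscCall(DMDestroy(&sw));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}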