Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Victoria Rolandi
Great! It compiles now, and both -mat_mumps_icntl_7 2 and
-mat_mumps_icntl_29 2 work and perform better than the other ordering
types.
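
For reference, the same controls can also be set from code rather than
through the options database. A minimal sketch (an illustration, not the
actual application code; it assumes a KSP already attached to the assembled
matrix), using PETSc's MatMumpsSetIcntl interface:

#include <petscksp.h>

/* Sketch: select MUMPS LU and set the orderings programmatically; the
   values mirror the runtime options -mat_mumps_icntl_7 2 and
   -mat_mumps_icntl_29 2 used above. */
static PetscErrorCode SetMumpsOrderings(KSP ksp)
{
  PC  pc;
  Mat F;

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(PCFactorSetUpMatSolverType(pc)); /* creates the MUMPS factor matrix */
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 7, 2));  /* sequential ordering: 2 = AMF */
  PetscCall(MatMumpsSetIcntl(F, 29, 2)); /* parallel ordering tool, as above */
  PetscFunctionReturn(0);
}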

Thank you Pierre!

As you suggested, I'll also send a new email concerning the errors I have
with newer PETSc versions.

Best,
Victoria

On Tue, Nov 7, 2023 at 12:25 PM Pierre Jolivet wrote:

>
>
> On 7 Nov 2023, at 8:47 PM, Victoria Rolandi 
> wrote:
>
> Hi Pierre,
>
> Thanks for your reply. I am now trying to configure PETSc with the same
> METIS/ParMETIS as my main code.
>
> I get the following error, and I still get it whether I build with
> --with-precision=double or --with-precision=single:
>
> Metis specified is incompatible!
> IDXTYPEWIDTH=64 metis build appears to be specified for a default
> 32-bit-indices build of PETSc.
> Suggest using --download-metis for a compatible metis
>
> ***
>
> In the configure.log I have:
>
> compilation aborted for
> /tmp/petsc-yxtl_gwd/config.packages.metis/conftest.c (code 2)
> Source:
> #include "confdefs.h"
> #include "conffix.h"
> #include "metis.h"
>
> int main() {
> #if (IDXTYPEWIDTH != 32)
> #error incompatible IDXTYPEWIDTH
> #endif;
>   return 0;
> }
>
>
> How could I proceed?
>
>
> I would use --download-metis and then have your code use METIS from PETSc,
> not the other way around.
>
> Thanks,
> Pierre
>
> Thanks,
> Victoria
>
>
>
> On Fri, Nov 3, 2023 at 11:34 AM Pierre Jolivet wrote:
>
>>
>>
>> On 3 Nov 2023, at 7:28 PM, Victoria Rolandi 
>> wrote:
>>
>> Pierre,
>>
>> Sure, I have now installed PETSc with MUMPS and PT-SCOTCH. I got some
>> errors at first, but it worked after adding
>> --COPTFLAGS="-D_POSIX_C_SOURCE=199309L" to the configuration.
>> Also, I get compilation errors when I try to use newer versions, so I
>> have kept 3.17.0 for the moment.
>>
>>
>> You should ask for assistance with getting the latest version to work.
>> The (Par)METIS snapshots may not have changed, but the MUMPS one did, with
>> performance improvements.
>>
>> Now the parallel ordering works with PT-SCOTCH; however, is it normal
>> that I see no difference in performance compared to the sequential
>> ordering?
>>
>>
>> Impossible to tell without you providing actual figures (number of nnz,
>> number of processes, timings with sequential ordering, etc.), but 699k is
>> not that big of a problem, so that is not extremely surprising.
>>
>> Also, could the error using Metis/Parmetis be due to the fact that my
>> main code (to which I linked PETSc) uses a different ParMetis than the one
>> installed separately by PETSc during the configuration?
>>
>>
>> Yes.
>>
>> Hence, should I configure PETSc to link against the same ParMetis
>> library that my main code uses?
>>
>>
>> Yes.
>>
>> Thanks,
>> Pierre
>>
>> Thanks,
>> Victoria
>>
>> On Thu, Nov 2, 2023 at 9:35 AM Pierre Jolivet wrote:
>>
>>>
>>> On 2 Nov 2023, at 5:29 PM, Victoria Rolandi <
>>> victoria.roland...@gmail.com> wrote:
>>>
>>> Pierre,
>>> Yes, sorry, I'll keep the list in copy.
>>> Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29
>>> 2) I get an error during the analysis step. I also launched with
>>> increased memory and I still get the error.
>>>
>>>
>>> Oh, OK, that’s bad.
>>> Would you be willing to give SCOTCH and/or PT-SCOTCH a try?
>>> You’d need to reconfigure/recompile with --download-ptscotch (and maybe
>>> --download-bison depending on your system).
>>> Then, the option would become either -mat_mumps_icntl_28 2
>>> -mat_mumps_icntl_29 2 (PT-SCOTCH) or -mat_mumps_icntl_7 3 (SCOTCH).
>>> It may be worth updating PETSc as well (you are using 3.17.0, we are at
>>> 3.20.1), though I’m not sure we updated the METIS/ParMETIS snapshots since
>>> then, so it may not fix the present issue.
>>>
>>> Thanks,
>>> Pierre
>>>
>>> *The calculation stops at:*
>>>
>>> Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
>>>   executing #MPI =  2, without OMP
>>>
>>>  =
>>>  MUMPS compiled with option -Dmetis
>>>  MUMPS compiled with option -Dparmetis
>>>  =

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Victoria Rolandi
Hi Pierre,

Thanks for your reply. I am now trying to configure PETSc with the same
METIS/ParMETIS as my main code.

I get the following error, and I still get it whether I build with
--with-precision=double or --with-precision=single:

Metis specified is incompatible!
IDXTYPEWIDTH=64 metis build appears to be specified for a default
32-bit-indices build of PETSc.
Suggest using --download-metis for a compatible metis
***

In the configure.log I have:

compilation aborted for
/tmp/petsc-yxtl_gwd/config.packages.metis/conftest.c (code 2)
Source:
#include "confdefs.h"
#include "conffix.h"
#include "metis.h"

int main() {
#if (IDXTYPEWIDTH != 32)
#error incompatible IDXTYPEWIDTH
#endif;
  return 0;
}
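
As a side check, the index width a given METIS build was compiled with can
be printed with a minimal standalone program (a sketch; compiling it against
the metis.h of the METIS installation the main code links is an assumption
on my part):

#include <stdio.h>
#include <metis.h>

int main(void)
{
  /* IDXTYPEWIDTH and REALTYPEWIDTH are fixed in metis.h when METIS is built */
  printf("IDXTYPEWIDTH  = %d\n", IDXTYPEWIDTH);
  printf("REALTYPEWIDTH = %d\n", REALTYPEWIDTH);
  return 0;
}

Printing 64 here while PETSc is a default 32-bit-indices build reproduces
the mismatch reported above; a PETSc configured with --with-64-bit-indices
would expect the 64-bit METIS instead.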


How could I proceed?

Thanks,
Victoria



On Fri, Nov 3, 2023 at 11:34 AM Pierre Jolivet wrote:

>
>
> On 3 Nov 2023, at 7:28 PM, Victoria Rolandi 
> wrote:
>
> Pierre,
>
> Sure, I have now installed PETSc with MUMPS and PT-SCOTCH. I got some
> errors at first, but it worked after adding
> --COPTFLAGS="-D_POSIX_C_SOURCE=199309L" to the configuration.
> Also, I get compilation errors when I try to use newer versions, so I
> have kept 3.17.0 for the moment.
>
>
> You should ask for assistance with getting the latest version to work.
> The (Par)METIS snapshots may not have changed, but the MUMPS one did, with
> performance improvements.
>
> Now the parallel ordering works with PT-SCOTCH; however, is it normal that
> I see no difference in performance compared to the sequential ordering?
>
>
> Impossible to tell without you providing actual figures (number of nnz,
> number of processes, timings with sequential ordering, etc.), but 699k is
> not that big of a problem, so that is not extremely surprising.
>
> Also, could the error using Metis/Parmetis be due to the fact that my main
> code (to which I linked PETSc) uses a different ParMetis than the one
> installed separately by PETSc during the configuration?
>
>
> Yes.
>
> Hence, should I configure PETSc to link against the same ParMetis
> library that my main code uses?
>
>
> Yes.
>
> Thanks,
> Pierre
>
> Thanks,
> Victoria
>
> On Thu, Nov 2, 2023 at 9:35 AM Pierre Jolivet wrote:
>
>>
>> On 2 Nov 2023, at 5:29 PM, Victoria Rolandi 
>> wrote:
>>
>> Pierre,
>> Yes, sorry, I'll keep the list in copy.
>> Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29
>> 2) I get an error during the analysis step. I also launched with
>> increased memory and I still get the error.
>>
>>
>> Oh, OK, that’s bad.
>> Would you be willing to give SCOTCH and/or PT-SCOTCH a try?
>> You’d need to reconfigure/recompile with --download-ptscotch (and maybe
>> --download-bison depending on your system).
>> Then, the option would become either -mat_mumps_icntl_28 2
>> -mat_mumps_icntl_29 2 (PT-SCOTCH) or -mat_mumps_icntl_7 3 (SCOTCH).
>> It may be worth updating PETSc as well (you are using 3.17.0, we are at
>> 3.20.1), though I’m not sure we updated the METIS/ParMETIS snapshots since
>> then, so it may not fix the present issue.
>>
>> Thanks,
>> Pierre
>>
>> *The calculation stops at:*
>>
>> Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
>>   executing #MPI =  2, without OMP
>>
>>  =
>>  MUMPS compiled with option -Dmetis
>>  MUMPS compiled with option -Dparmetis
>>  =
>> L U Solver for unsymmetric matrices
>> Type of parallelism: Working host
>>
>>  ** ANALYSIS STEP 
>>
>>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is
>> distributed
>>  Using ParMETIS for parallel ordering
>>  Structural symmetry is: 90%
>>
>>
>> *The error:*
>>
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple MacOS
>> to find memory corruption errors
>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
>> and run
>> [0]PETSC ERROR: to get more information on the crash.
>> [0]PETSC ERROR: - Error Message
>> --
>> [0]PETSC ERROR: Signal received

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Victoria Rolandi
Pierre,

Sure, I have now installed PETSc with MUMPS and PT-SCOTCH. I got some
errors at first, but it worked after adding
--COPTFLAGS="-D_POSIX_C_SOURCE=199309L" to the configuration.
Also, I get compilation errors when I try to use newer versions, so I have
kept 3.17.0 for the moment.

Now the parallel ordering works with PT-SCOTCH; however, is it normal that
I see no difference in performance compared to the sequential ordering?
Also, could the error using Metis/Parmetis be due to the fact that my main
code (to which I linked PETSc) uses a different ParMetis than the one
installed separately by PETSc during the configuration?
Hence, should I configure PETSc to link against the same ParMetis library
that my main code uses?
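
For the performance comparison, the figures that make such a comparison
meaningful (global nonzeros, setup time for a given number of processes)
can be gathered directly; a minimal sketch, assuming A is the assembled
system matrix and ksp the MUMPS-backed solver:

#include <petscksp.h>
#include <petsctime.h>

/* Sketch: report global nnz and the time spent in analysis + factorization. */
static PetscErrorCode ReportFigures(KSP ksp, Mat A)
{
  MatInfo        info;
  PetscLogDouble t0, t1;

  PetscFunctionBeginUser;
  PetscCall(MatGetInfo(A, MAT_GLOBAL_SUM, &info));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "global nnz: %.0f\n", info.nz_used));
  PetscCall(PetscTime(&t0));
  PetscCall(KSPSetUp(ksp)); /* triggers the MUMPS analysis and factorization */
  PetscCall(PetscTime(&t1));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "setup time: %g s\n", (double)(t1 - t0)));
  PetscFunctionReturn(0);
}

Running with -log_view gives comparable timings without any code changes.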

Thanks,
Victoria

On Thu, Nov 2, 2023 at 9:35 AM Pierre Jolivet wrote:

>
> On 2 Nov 2023, at 5:29 PM, Victoria Rolandi 
> wrote:
>
> Pierre,
> Yes, sorry, I'll keep the list in copy.
> Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2)
> I get an error during the analysis step. I also launched with increased
> memory and I still get the error.
>
>
> Oh, OK, that’s bad.
> Would you be willing to give SCOTCH and/or PT-SCOTCH a try?
> You’d need to reconfigure/recompile with --download-ptscotch (and maybe
> --download-bison depending on your system).
> Then, the option would become either -mat_mumps_icntl_28 2
> -mat_mumps_icntl_29 2 (PT-SCOTCH) or -mat_mumps_icntl_7 3 (SCOTCH).
> It may be worth updating PETSc as well (you are using 3.17.0, we are at
> 3.20.1), though I’m not sure we updated the METIS/ParMETIS snapshots since
> then, so it may not fix the present issue.
>
> Thanks,
> Pierre
>
> *The calculation stops at:*
>
> Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
>   executing #MPI =  2, without OMP
>
>  =
>  MUMPS compiled with option -Dmetis
>  MUMPS compiled with option -Dparmetis
>  =
> L U Solver for unsymmetric matrices
> Type of parallelism: Working host
>
>  ** ANALYSIS STEP 
>
>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is
> distributed
>  Using ParMETIS for parallel ordering
>  Structural symmetry is: 90%
>
>
> *The error:*
>
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple MacOS
> to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
> run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.17.0, unknown
> [0]PETSC ERROR: ./charlin.exe on a  named n1056 by vrolandi Wed Nov  1
> 11:38:28 2023
> [0]PETSC ERROR: Configure options
> --prefix=/u/home/v/vrolandi/CODES/LIBRARY/packages/petsc/installationDir
> --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort CXXOPTFLAGS=-O3
> --with-scalar-type=complex --with-debugging=0 --with-precision=single
> --download-mumps --download-scalapack --download-parmetis --download-metis
>
> [0]PETSC ERROR: #1 User provided function() at unknown file:0
> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is
> causing the crash.
> Abort(59) on node 0 (rank 0 in comm 0): application called
> MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
>
> Thanks,
> Victoria
>
> On Wed, Nov 1, 2023 at 10:33 AM Pierre Jolivet wrote:
>
>> Victoria, please keep the list in copy.
>>
>> I do not understand how I can switch to ParMetis if it does not appear
>> among the options of -mat_mumps_icntl_7. In the options I only have Metis
>> and not ParMetis.
>>
>>
>> You need to use -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
>>
>> Barry, I don’t think we can programmatically shut off this warning, it’s
>> guarded by a bunch of KEEP() values, see src/dana_driver.F:4707, which are
>> only settable/gettable by people with access to consortium releases.
>> I’ll ask the MUMPS people for confirmation.
>> Note that this warning is only printed to screen with the option
>> -mat_mumps_icntl_4 2 (or higher), so this won’t show up for standard runs.
>>
>> Thanks,
>> Pierre
>>

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-02 Thread Victoria Rolandi
Pierre,
Yes, sorry, I'll keep the list in copy.
Launching with those options (-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2)
I get an error during the analysis step. I also launched with increased
memory and I still get the error.

*The calculation stops at:*

Entering CMUMPS 5.4.1 from C interface with JOB, N =   1  699150
  executing #MPI =  2, without OMP

 =
 MUMPS compiled with option -Dmetis
 MUMPS compiled with option -Dparmetis
 =
L U Solver for unsymmetric matrices
Type of parallelism: Working host

 ** ANALYSIS STEP 

 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
 Using ParMETIS for parallel ordering
 Structural symmetry is: 90%


*The error:*

[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple MacOS to
find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.17.0, unknown
[0]PETSC ERROR: ./charlin.exe on a  named n1056 by vrolandi Wed Nov  1
11:38:28 2023
[0]PETSC ERROR: Configure options
--prefix=/u/home/v/vrolandi/CODES/LIBRARY/packages/petsc/installationDir
--with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort CXXOPTFLAGS=-O3
--with-scalar-type=complex --with-debugging=0 --with-precision=single
--download-mumps --download-scalapack --download-parmetis --download-metis

[0]PETSC ERROR: #1 User provided function() at unknown file:0
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is
causing the crash.
Abort(59) on node 0 (rank 0 in comm 0): application called
MPI_Abort(MPI_COMM_WORLD, 59) - process 0


Thanks,
Victoria

On Wed, Nov 1, 2023 at 10:33 AM Pierre Jolivet wrote:

> Victoria, please keep the list in copy.
>
> I do not understand how I can switch to ParMetis if it does not appear
> among the options of -mat_mumps_icntl_7. In the options I only have Metis
> and not ParMetis.
>
>
> You need to use -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
>
> Barry, I don’t think we can programmatically shut off this warning, it’s
> guarded by a bunch of KEEP() values, see src/dana_driver.F:4707, which are
> only settable/gettable by people with access to consortium releases.
> I’ll ask the MUMPS people for confirmation.
> Note that this warning is only printed to screen with the option
> -mat_mumps_icntl_4 2 (or higher), so this won’t show up for standard runs.
>
> Thanks,
> Pierre
>
> On 1 Nov 2023, at 5:52 PM, Barry Smith  wrote:
>
>
>   Pierre,
>
>    Could the PETSc MUMPS interface "turn off" ICNTL(6) in this situation
> so as not to trigger the confusing warning message from MUMPS?
>
>   Barry
>
> On Nov 1, 2023, at 12:17 PM, Pierre Jolivet  wrote:
>
>
>
> On 1 Nov 2023, at 3:33 PM, Zhang, Hong via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
> Victoria,
> "** Maximum transversal (ICNTL(6)) not allowed because matrix is
> distributed
> Ordering based on METIS"
>
>
> This warning is benign and appears for every run using a sequential
> partitioner in MUMPS with a MATMPIAIJ.
> (I’m not saying switching to ParMETIS will not make the issue go away)
>
> Thanks,
> Pierre
>
> $ ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu
> -mat_mumps_icntl_4 2
> Entering DMUMPS 5.6.2 from C interface with JOB, N =   1  56
>   executing #MPI =  2, without OMP
>
>  =
>  MUMPS compiled with option -Dmetis
>  MUMPS compiled with option -Dparmetis
>  MUMPS compiled with option -Dpord
>  MUMPS compiled with option -Dptscotch
>  MUMPS compiled with option -Dscotch
>  =
> L U Solver for unsymmetric matrices
> Type of parallelism: Working host
>
>  ** ANALYSIS STEP 
>
>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is
> distributed
>  Processing a graph of size:56 with   194 edges
>  Ordering based on AMF
>  WARNING: Largest root node of size26 not selected for parallel
> execution
>
> Leaving analysis phase with  ...
>  INFOG(1)   =   0

[petsc-users] Error using Metis with PETSc installed with MUMPS

2023-10-31 Thread Victoria Rolandi
Hi,

I'm solving a large sparse linear system in parallel using PETSc with
MUMPS. I am trying to test different options, like the ordering of the
matrix. Everything works if I use the *-mat_mumps_icntl_7 2* or
*-mat_mumps_icntl_7 0* options (with the first one, AMF, performing better
than AMD); however, when I test METIS, *-mat_mumps_icntl_7 5*, I get an
error (reported at the end of the email).
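
For reference, a minimal self-contained harness that exercises these
ordering options through a MUMPS LU solve (a sketch only; the 1-D Laplacian
below is a stand-in for the actual application matrix):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, Istart, Iend, n = 1000;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
  for (i = Istart; i < Iend; i++) { /* assemble a 1-D Laplacian */
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
    if (i > 0) PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY)); /* direct solve only */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(KSPSetFromOptions(ksp)); /* -mat_mumps_icntl_* read at factorization */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}

Running it with, e.g., -mat_mumps_icntl_7 5 selects the METIS ordering that
triggers the error below.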

I have configured PETSc with the following options:

--with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort
 --with-scalar-type=complex --with-debugging=0 --with-precision=single
--download-mumps --download-scalapack --download-parmetis --download-metis

and the installation didn't give any problems.

Could you help me understand why METIS is not working?

Thank you in advance,
Victoria

Error:

 ** ANALYSIS STEP 
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
 Processing a graph of size:699150 with  69238690 edges
 Ordering based on METIS
510522 37081376 [100] [10486 699150]
Error! Unknown CType: -1