Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Satish Balay via petsc-users
Sounds like there are different mingw tools and msys2 tools.

So I guess one could use mingw compilers even from cygwin [using cygwin tools] 
- i.e. the mingw compilers don't really need the msys2 tools to work.

Satish

On Sun, 5 Jul 2020, Paolo Lampitella wrote:

> Unfortunately, even PETSC_ARCH=i didn't work out. And while 
> --with-single-library=0 wasn't really appealing to me, it worked, but only to 
> fail later on make test.
> 
> I guess all these differences are due to the Fortran bindings and/or GCC 10.
> 
> However, until I discover how they are different, I guess I'll be fine with 
> /usr/bin/ar
> 
> Paolo
> 
> 
> 
> Sent from my Samsung Galaxy smartphone.
> 
> 
> 
>  Original message 
> From: Paolo Lampitella 
> Date: 05/07/20 14:00 (GMT+01:00)
> To: Pierre Jolivet 
> Cc: Matthew Knepley , petsc-users 
> Subject: RE: [petsc-users] PETSc and Windows 10
> 
> Thank you very much Pierre.
> 
> I'll keep you informed in case I see any relevant change from the tests when 
> using your suggestion.
> 
> Paolo
> 
> 
> 
> Sent from my Samsung Galaxy smartphone.
> 
> 
> 
>  Original message 
> From: Pierre Jolivet 
> Date: 05/07/20 13:45 (GMT+01:00)
> To: Paolo Lampitella 
> Cc: Matthew Knepley , petsc-users 
> Subject: Re: [petsc-users] PETSc and Windows 10
> 
> Hello Paolo,
> 
On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampite...@hotmail.com> wrote:
> 
> Dear all,
> 
I just want to update you on my journey to PETSc compilation on Windows under 
MSYS2+MINGW64.
> 
Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, 
as my final goal also required Fortran bindings (though I only needed BLAS, 
LAPACK, METIS and HYPRE), I decided to follow my own route using the useful 
information from Pierre.
> 
> 
>   *   I started by installing MPI from 
> https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t 
> think the SDK is actually needed in my specific workflow, but I installed it 
> as well together with mpisetup.
  *   Then I installed MSYS2 just following the wizard. Opened the MSYS2 
terminal and updated with pacman -Syuu, closed it if asked, reopened it and ran 
pacman -Syuu again several times until no more updates were available. Closed 
it and opened it back.
  *   Under the MSYS2 terminal, I installed just the following packages:
> 
> 
> 
>  *   pacman -S base-devel git gcc gcc-fortran
>  *   pacman -S mingw-w64-x86_64-toolchain
>  *   pacman -S mingw-w64-x86_64-cmake
>  *   pacman -S mingw-w64-x86_64-msmpi
> 
> 
> 
>   *   Closed the MSYS2 terminal and opened the MINGW64 one, went to 
> /mingw64/include and compiled my mpi module following 
> https://www.scivision.dev/windows-mpi-msys2/:
> 
> 
> 
>  *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz
> 
> 
> However, I will keep an eye on the MS-MPI GitHub repository because the 
Fortran side seems to be far from perfect.
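
As a quick sanity check of the MS-MPI toolchain at this point, a minimal C
hello world can be compiled against msmpi. This is a sketch under stated
assumptions, not part of the original steps: it assumes the
mingw-w64-x86_64-msmpi package provides mpi.h and libmsmpi under MINGW64, and
that mpiexec lives at the path used later in the configure line.

/* hello_mpi.c - hypothetical MS-MPI smoke test, not from the original steps.
 * Assumed build and run lines under the MINGW64 terminal:
 *   gcc hello_mpi.c -o hello_mpi -lmsmpi
 *   "/c/Program Files/Microsoft MPI/Bin/mpiexec" -n 2 ./hello_mpi
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size;
  MPI_Init(&argc, &argv);               /* start the MPI runtime */
  MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* rank of this process */
  MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */
  printf("Hello from rank %d of %d\n", rank, size);
  MPI_Finalize();
  return 0;
}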
> 
> 
  *   Then I downloaded version 3.13.3 of PETSc and configured it, still 
> under the MINGW64 terminal, with the following command:
> 
> 
> /usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
> --with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 
> --with-x=0
> COPTFLAGS="-O3 -mtune=native"
> CXXOPTFLAGS="-O3 -mtune=native"
> FOPTFLAGS="-O3 -mtune=native"
> FFLAGS=-fallow-invalid-boz
> --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
> --download-fblaslapack --download-metis --download-hypre
> --download-metis-cmake-arguments='-G "MSYS Makefiles"'
> --download-hypre-configure-arguments="--build=x86_64-linux-gnu 
> --host=x86_64-linux-gnu"
> 
Note that I just bypassed uninstalling Python in mingw64 (which doesn’t work) 
by using /usr/bin/python, and that, unlike Pierre, I also needed to use the 
MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, 
which shows up in Pierre's configure), as also mentioned here 
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, 
probably because of this issue 
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.
> 
You are right that you can avoid uninstalling mingw-w64-x86_64-python if you 
can supply the proper Python yourself (we don’t have that luxury in our 
Makefile).
> If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure 
> what the pros and cons are), you can either:
> - use another PETSC_ARCH (very short, like pw, for petsc-windows);
> - use --with-single-library=0.
> See this post on GitLab 
> https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
> The OS I’m referring to is indeed my Windows + MSYS2 box.
> 
> Thanks,
> Pierre
> 
Then make all, make install and make check all went smoothly. Also, I don’t 
know exactly what with-x=0 and with-windows-graphics=0 do, but I think it is 
stuff that I don’t need (yet configure worked with windows-graphics as well).

Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Paolo Lampitella
Unfortunately, even PETSC_ARCH=i didn't work out. And while 
--with-single-library=0 wasn't really appealing to me, it worked, but only to 
fail later on make test.

I guess all these differences are due to the Fortran bindings and/or GCC 10.

However, until I discover how they are different, I guess I'll be fine with 
/usr/bin/ar

Paolo



Sent from my Samsung Galaxy smartphone.



 Original message 
From: Paolo Lampitella 
Date: 05/07/20 14:00 (GMT+01:00)
To: Pierre Jolivet 
Cc: Matthew Knepley , petsc-users 
Subject: RE: [petsc-users] PETSc and Windows 10

Thank you very much Pierre.

I'll keep you informed in case I see any relevant change from the tests when 
using your suggestion.

Paolo



Sent from my Samsung Galaxy smartphone.



 Original message 
From: Pierre Jolivet 
Date: 05/07/20 13:45 (GMT+01:00)
To: Paolo Lampitella 
Cc: Matthew Knepley , petsc-users 
Subject: Re: [petsc-users] PETSc and Windows 10

Hello Paolo,

On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampite...@hotmail.com> wrote:

Dear all,

I just want to update you on my journey to PETSc compilation on Windows under 
MSYS2+MINGW64.

Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, 
as my final goal also required Fortran bindings (though I only needed BLAS, 
LAPACK, METIS and HYPRE), I decided to follow my own route using the useful 
information from Pierre.


  *   I started by installing MPI from 
https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t think 
the SDK is actually needed in my specific workflow, but I installed it as well 
together with mpisetup.
  *   Then I installed MSYS2 just following the wizard. Opened the MSYS2 
terminal and updated with pacman -Syuu, closed it if asked, reopened it and ran 
pacman -Syuu again several times until no more updates were available. Closed 
it and opened it back.
  *   Under the MSYS2 terminal, I installed just the following packages:



 *   pacman -S base-devel git gcc gcc-fortran
 *   pacman -S mingw-w64-x86_64-toolchain
 *   pacman -S mingw-w64-x86_64-cmake
 *   pacman -S mingw-w64-x86_64-msmpi



  *   Closed the MSYS2 terminal and opened the MINGW64 one, went to 
/mingw64/include and compiled my mpi module following 
https://www.scivision.dev/windows-mpi-msys2/:



 *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz


However, I will keep an eye on the MS-MPI GitHub repository because the Fortran 
side seems to be far from perfect.


  *   Then I downloaded version 3.13.3 of PETSc and configured it, still 
under the MINGW64 terminal, with the following command:


/usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
--with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 
--with-x=0
COPTFLAGS="-O3 -mtune=native"
CXXOPTFLAGS="-O3 -mtune=native"
FOPTFLAGS="-O3 -mtune=native"
FFLAGS=-fallow-invalid-boz
--with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
--download-fblaslapack --download-metis --download-hypre
--download-metis-cmake-arguments='-G "MSYS Makefiles"'
--download-hypre-configure-arguments="--build=x86_64-linux-gnu 
--host=x86_64-linux-gnu"

Note that I just bypassed uninstalling Python in mingw64 (which doesn’t work) 
by using /usr/bin/python, and that, unlike Pierre, I also needed to use the 
MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, 
which shows up in Pierre's configure), as also mentioned here 
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, 
probably because of this issue 
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

You are right that you can avoid uninstalling mingw-w64-x86_64-python if you 
can supply the proper Python yourself (we don’t have that luxury in our 
Makefile).
If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what 
the pros and cons are), you can either:
- use another PETSC_ARCH (very short, like pw, for petsc-windows);
- use --with-single-library=0.
See this post on GitLab 
https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
The OS I’m referring to is indeed my Windows + MSYS2 box.

Thanks,
Pierre

Then make all, make install and make check all went smoothly. Also, I don’t know 
exactly what with-x=0 and with-windows-graphics=0 do, but I think it is stuff 
that I don’t need (yet configure worked with windows-graphics as well).


  *   Finally I launched make test. As some tests failed, I replicated the same 
install procedure on all the systems I have available on this same Windows 
machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 
under WSL1 and the MSYS2-MINGW64 toolchain). I am attaching a file with the 
results printed to screen (not sure about which file should be used for a 
comparison/check). Note, however, that the tests in MSYS2 started with some 
cyclic reference issues for some .mod files, but this doesn’t show up in any 
file I could check.

Re: [petsc-users] Performance of ISCreateGeneral()

2020-07-05 Thread Barry Smith

  Mark is correct. Since ISCreateGeneral is labeled in the manual page as 
Collective, all processes synchronize during the call. Hence, if any process 
gets to the call long before the other processes, it will look like a great 
deal of time is being spent in the call, when in fact it is just waiting for 
the other processes. You should check the code before the call to 
ISCreateGeneral() for load balancing issues.

   Barry
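
As an illustration of the measurement Barry and Mark describe, here is a
minimal sketch (not code from this thread; the index array is a placeholder
for the application's real data): synchronize with MPI_Barrier, time
ISCreateGeneral() locally, then reduce the local times to their min and max.
Error checking is omitted for brevity; running with -log_view reports the
max/min ratio directly.

#include <petscis.h>

int main(int argc, char **argv)
{
  IS       is;
  PetscInt i, n = 100000, *idx;
  double   t, tmin, tmax;

  PetscInitialize(&argc, &argv, NULL, NULL);
  PetscMalloc1(n, &idx);
  for (i = 0; i < n; i++) idx[i] = i;    /* placeholder indices */

  MPI_Barrier(PETSC_COMM_WORLD);         /* exclude load-imbalance waiting */
  t = MPI_Wtime();
  ISCreateGeneral(PETSC_COMM_WORLD, n, idx, PETSC_COPY_VALUES, &is);
  t = MPI_Wtime() - t;

  /* report the spread across ranks, as suggested above */
  MPI_Reduce(&t, &tmin, 1, MPI_DOUBLE, MPI_MIN, 0, PETSC_COMM_WORLD);
  MPI_Reduce(&t, &tmax, 1, MPI_DOUBLE, MPI_MAX, 0, PETSC_COMM_WORLD);
  PetscPrintf(PETSC_COMM_WORLD, "ISCreateGeneral: min %g s, max %g s\n",
              tmin, tmax);

  ISDestroy(&is);
  PetscFree(idx);
  PetscFinalize();
  return 0;
}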


> On Jul 5, 2020, at 3:27 PM, Mark Adams wrote:
> 
> The column after the time in the PETSc output, on the ISCreateGeneral row, 
> with -log_view, shows the ratio of the max time (shown) to the min time.
> If you are using your own timers just report the min and max time that you 
> measure.
> You can also put an MPI_Barrier before the call to ISCreateGeneral to get 
> clean data.
> And I trust that the problem is well load balanced, that is, all processors 
> have the same amount of data.
> 
> On Sun, Jul 5, 2020 at 4:00 PM Y. Shidi wrote:
> Thank you for your reply Mark.
> What do you mean by ratio (max/min)?
> Do I need to turn on some options for this?
> 
> Kind regards,
> Shidi
> 
> On 2020-07-05 20:15, Mark Adams wrote:
> > ISCreateGeneral just takes your indices and caches them. But it is a
> > synch point. What ratio (max/min) is PETSc reporting in this data?
> > 
> > On Sun, Jul 5, 2020 at 1:51 PM Y. Shidi wrote:
> > 
> >> Dear developers,
> >> 
> >> I am currently doing a weak scaling test, and find that
> >> the weak scaling results for ISCreateGeneral() are very bad.
> >> For 2, 4, 8 processors, the running times for completing
> >> ISCreateGeneral() are:
> >> 0.687494, 3.00597 and 10.0613.
> >> I am not sure if this is normal.
> >> 
> >> Thank you for your time.
> >> 
> >> Kind Regards,
> >> Shidi



Re: [petsc-users] Performance of ISCreateGeneral()

2020-07-05 Thread Mark Adams
The column after the time in the PETSc output, on the ISCreateGeneral row,
with -log_view, shows the ratio of the max time (shown) to the min time.
If you are using your own timers just report the min and max time that you
measure.
You can also put an MPI_Barrier before the call to ISCreateGeneral to get
clean data.
And I trust that the problem is well load balanced, that is, all processors
have the same amount of data.

On Sun, Jul 5, 2020 at 4:00 PM Y. Shidi wrote:

> Thank you for your reply Mark.
> What do you mean by ratio (max/min)?
> Do I need to turn on some options for this?
>
> Kind regards,
> Shidi
>
> On 2020-07-05 20:15, Mark Adams wrote:
> > ISCreateGeneral just takes your indices and caches them. But it is a
> > synch point. What ratio (max/min) is PETSc reporting in this data?
> >
> > On Sun, Jul 5, 2020 at 1:51 PM Y. Shidi wrote:
> >
> >> Dear developers,
> >>
> >> I am currently doing a weak scaling test, and find that
> >> the weak scaling results for ISCreateGeneral() are very bad.
> >> For 2, 4, 8 processors, the running times for completing
> >> ISCreateGeneral() are:
> >> 0.687494, 3.00597 and 10.0613.
> >> I am not sure if this is normal.
> >>
> >> Thank you for your time.
> >>
> >> Kind Regards,
> >> Shidi
>


Re: [petsc-users] Performance of ISCreateGeneral()

2020-07-05 Thread Mark Adams
ISCreateGeneral just takes your indices and caches them. But it is a synch
point. What ratio (max/min) is PETSc reporting in this data?

On Sun, Jul 5, 2020 at 1:51 PM Y. Shidi wrote:

> Dear developers,
>
> I am currently doing a weak scaling test, and find that
> the weak scaling results for ISCreateGeneral() are very bad.
> For 2, 4, 8 processors, the running times for completing
> ISCreateGeneral() are:
> 0.687494, 3.00597 and 10.0613.
> I am not sure if this is normal.
>
> Thank you for your time.
>
> Kind Regards,
> Shidi
>


[petsc-users] Performance of ISCreateGeneral()

2020-07-05 Thread Y. Shidi

Dear developers,

I am currently doing a weak scaling test, and find that
the weak scaling results for ISCreateGeneral() are very bad.
For 2, 4, 8 processors, the running times for completing 
ISCreateGeneral() are:

0.687494, 3.00597 and 10.0613.
I am not sure if this is normal.

Thank you for your time.

Kind Regards,
Shidi


Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Paolo Lampitella
Thank you very much Pierre.

I'll keep you informed in case I see any relevant change from the tests when 
using your suggestion.

Paolo



Sent from my Samsung Galaxy smartphone.



 Original message 
From: Pierre Jolivet 
Date: 05/07/20 13:45 (GMT+01:00)
To: Paolo Lampitella 
Cc: Matthew Knepley , petsc-users 
Subject: Re: [petsc-users] PETSc and Windows 10

Hello Paolo,

On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampite...@hotmail.com> wrote:

Dear all,

I just want to update you on my journey to PETSc compilation on Windows under 
MSYS2+MINGW64.

Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, 
as my final goal also required Fortran bindings (though I only needed BLAS, 
LAPACK, METIS and HYPRE), I decided to follow my own route using the useful 
information from Pierre.


  *   I started by installing MPI from 
https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t think 
the SDK is actually needed in my specific workflow, but I installed it as well 
together with mpisetup.
  *   Then I installed MSYS2 just following the wizard. Opened the MSYS2 
terminal and updated with pacman -Syuu, closed it if asked, reopened it and ran 
pacman -Syuu again several times until no more updates were available. Closed 
it and opened it back.
  *   Under the MSYS2 terminal, I installed just the following packages:



 *   pacman -S base-devel git gcc gcc-fortran
 *   pacman -S mingw-w64-x86_64-toolchain
 *   pacman -S mingw-w64-x86_64-cmake
 *   pacman -S mingw-w64-x86_64-msmpi



  *   Closed the MSYS2 terminal and opened the MINGW64 one, went to 
/mingw64/include and compiled my mpi module following 
https://www.scivision.dev/windows-mpi-msys2/:



 *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz


However, I will keep an eye on the MS-MPI GitHub repository because the Fortran 
side seems to be far from perfect.


  *   Then I downloaded version 3.13.3 of PETSc and configured it, still 
under the MINGW64 terminal, with the following command:


/usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
--with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 
--with-x=0
COPTFLAGS="-O3 -mtune=native"
CXXOPTFLAGS="-O3 -mtune=native"
FOPTFLAGS="-O3 -mtune=native"
FFLAGS=-fallow-invalid-boz
--with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
--download-fblaslapack --download-metis --download-hypre
--download-metis-cmake-arguments='-G "MSYS Makefiles"'
--download-hypre-configure-arguments="--build=x86_64-linux-gnu 
--host=x86_64-linux-gnu"

Note that I just bypassed uninstalling Python in mingw64 (which doesn’t work) 
by using /usr/bin/python, and that, unlike Pierre, I also needed to use the 
MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, 
which shows up in Pierre's configure), as also mentioned here 
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, 
probably because of this issue 
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

You are right that you can avoid uninstalling mingw-w64-x86_64-python if you 
can supply the proper Python yourself (we don’t have that luxury in our 
Makefile).
If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what 
the pros and cons are), you can either:
- use another PETSC_ARCH (very short, like pw, for petsc-windows);
- use --with-single-library=0.
See this post on GitLab 
https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
The OS I’m referring to is indeed my Windows + MSYS2 box.

Thanks,
Pierre

Then make all, make install and make check all went smoothly. Also, I don’t know 
exactly what with-x=0 and with-windows-graphics=0 do, but I think it is stuff 
that I don’t need (yet configure worked with windows-graphics as well).


  *   Finally I launched make test. As some tests failed, I replicated the same 
install procedure on all the systems I have available on this same Windows 
machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 
under WSL1 and the MSYS2-MINGW64 toolchain). I am attaching a file with the 
results printed to screen (not sure about which file should be used for a 
comparison/check). Note, however, that the tests in MSYS2 started with some 
cyclic reference issues for some .mod files, but this doesn’t show up in any 
file I could check.


I am still left with some doubts about the archiver, the cyclic reference 
errors and the differences in the test results, but I am able to link my code 
with PETSc. Unfortunately, as this Windows port is part of a larger code 
restructuring, I can’t do much more with it from my code right now. But if you 
can suggest some specific tutorial to use as a test, also in parallel, I would 
be glad to dig deeper into the matter.
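
For a first smoke test of linking against the build above, a bare
initialize/finalize program is enough. This is a sketch; the include/lib
paths come from the --prefix used earlier, while the exact link line is an
assumption (a static build like this one may need extra libraries, e.g.
fblaslapack and the gfortran runtime).

/* check_petsc.c - hypothetical linking smoke test, not from this thread.
 * An assumed build line under the MINGW64 terminal might look like:
 *   gcc check_petsc.c -I$HOME/petsc/include -L$HOME/petsc/lib -lpetsc -lmsmpi
 */
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);
  PetscPrintf(PETSC_COMM_WORLD, "PETSc %d.%d.%d initialized fine\n",
              PETSC_VERSION_MAJOR, PETSC_VERSION_MINOR, PETSC_VERSION_SUBMINOR);
  PetscFinalize();
  return 0;
}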

Best regards

Paolo

Sent from Mail for Windows 10

Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Pierre Jolivet
Hello Paolo,

> On 5 Jul 2020, at 1:15 PM, Paolo Lampitella wrote:
> 
> Dear all,
>  
> I just want to update you on my journey to PETSc compilation on Windows under 
> MSYS2+MINGW64.
>  
> Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, 
> as my final goal also required Fortran bindings (though I only needed BLAS, 
> LAPACK, METIS and HYPRE), I decided to follow my own route using the useful 
> information from Pierre.
>  
> I started by installing MPI from 
> https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t 
> think the SDK is actually needed in my specific workflow, but I installed it 
> as well together with mpisetup.
> Then I installed MSYS2 just following the wizard. Opened the MSYS2 terminal 
> and updated with pacman -Syuu, closed it if asked, reopened it and ran 
> pacman -Syuu again several times until no more updates were available. Closed it 
> and opened it back.
> Under the MSYS2 terminal, I installed just the following packages:
>  
> pacman -S base-devel git gcc gcc-fortran
> pacman -S mingw-w64-x86_64-toolchain
> pacman -S mingw-w64-x86_64-cmake
> pacman -S mingw-w64-x86_64-msmpi
>  
> Closed the MSYS2 terminal and opened the MINGW64 one, went to 
> /mingw64/include and compiled my mpi module following 
> https://www.scivision.dev/windows-mpi-msys2/:
>  
> gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz
>  
> However, I will keep an eye on the MS-MPI GitHub repository because the 
> Fortran side seems to be far from perfect.
>  
> Then I downloaded version 3.13.3 of PETSc and configured it, still under 
> the MINGW64 terminal, with the following command:
>  
> /usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
> --with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 
> --with-x=0
> COPTFLAGS="-O3 -mtune=native" 
> CXXOPTFLAGS="-O3 -mtune=native" 
> FOPTFLAGS="-O3 -mtune=native" 
> FFLAGS=-fallow-invalid-boz 
> --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
> --download-fblaslapack --download-metis --download-hypre
> --download-metis-cmake-arguments='-G "MSYS Makefiles"'
> --download-hypre-configure-arguments="--build=x86_64-linux-gnu 
> --host=x86_64-linux-gnu"
>  
> Note that I just bypassed uninstalling Python in mingw64 (which doesn’t work) 
> by using /usr/bin/python, and that, unlike Pierre, I also needed to use the 
> MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, 
> which shows up in Pierre's configure), as also mentioned here 
> http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, 
> probably because of this issue 
> https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

You are right that you can avoid uninstalling mingw-w64-x86_64-python if you 
can supply the proper Python yourself (we don’t have that luxury in our 
Makefile).
If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what 
the pros and cons are), you can either:
- use another PETSC_ARCH (very short, like pw, for petsc-windows);
- use --with-single-library=0.
See this post on GitLab 
https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
The OS I’m referring to is indeed my Windows + MSYS2 box.

Thanks,
Pierre

> Then make all, make install and make check all went smoothly. Also, I don’t 
> know exactly what with-x=0 and with-windows-graphics=0 do, but I think it is 
> stuff that I don’t need (yet configure worked with windows-graphics as well).
>  
> Finally I launched make test. As some tests failed, I replicated the same 
> install procedure on all the systems I have available on this same Windows 
> machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 
> under WSL1 and the MSYS2-MINGW64 toolchain). I am attaching a file with the 
> results printed to screen (not sure about which file should be used for a 
> comparison/check). Note, however, that the tests in MSYS2 started with some 
> cyclic reference issues for some .mod files, but this doesn’t show up in any 
> file I could check.
>  
> I am still left with some doubts about the archiver, the cyclic reference 
> errors and the differences in the test results, but I am able to link my code 
> with PETSc. Unfortunately, as this Windows port is part of a larger code 
> restructuring, I can’t do much more with it from my code right now. But if you 
> can suggest some specific tutorial to use as a test, also in parallel, I 
> would be glad to dig deeper into the matter.
>  
> Best regards
>  
> Paolo

[petsc-users] R: PETSc and Windows 10

2020-07-05 Thread Paolo Lampitella
Dear all,

I just want to update you on my journey to PETSc compilation on Windows under 
MSYS2+MINGW64.

Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, 
as my final goal also required Fortran bindings (though I only needed BLAS, 
LAPACK, METIS and HYPRE), I decided to follow my own route using the useful 
information from Pierre.


  *   I started by installing MPI from 
https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t think 
the SDK is actually needed in my specific workflow, but I installed it as well 
together with mpisetup.
  *   Then I installed MSYS2 just following the wizard. Opened the MSYS2 
terminal and updated with pacman -Syuu, closed it if asked, reopened it and ran 
pacman -Syuu again several times until no more updates were available. Closed 
it and opened it back.
  *   Under the MSYS2 terminal, I installed just the following packages:



 *   pacman -S base-devel git gcc gcc-fortran
 *   pacman -S mingw-w64-x86_64-toolchain
 *   pacman -S mingw-w64-x86_64-cmake
 *   pacman -S mingw-w64-x86_64-msmpi


  *   Closed the MSYS2 terminal and opened the MINGW64 one, went to 
/mingw64/include and compiled my mpi module following 
https://www.scivision.dev/windows-mpi-msys2/:



 *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz


However, I will keep an eye on the MS-MPI GitHub repository because the Fortran 
side seems to be far from perfect.



  *   Then I downloaded version 3.13.3 of PETSc and configured it, still 
under the MINGW64 terminal, with the following command:



/usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
--with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 
--with-x=0
COPTFLAGS="-O3 -mtune=native"
CXXOPTFLAGS="-O3 -mtune=native"
FOPTFLAGS="-O3 -mtune=native"
FFLAGS=-fallow-invalid-boz
--with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
--download-fblaslapack --download-metis --download-hypre
--download-metis-cmake-arguments='-G "MSYS Makefiles"'
--download-hypre-configure-arguments="--build=x86_64-linux-gnu 
--host=x86_64-linux-gnu"



Note that I just bypassed uninstalling Python in mingw64 (which doesn’t work) 
by using /usr/bin/python, and that, unlike Pierre, I also needed to use the 
MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, 
which shows up in Pierre's configure), as also mentioned here 
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, 
probably because of this issue 
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

Then make all, make install and make check all went smoothly. Also, I don’t know 
exactly what with-x=0 and with-windows-graphics=0 do, but I think it is stuff 
that I don’t need (yet configure worked with windows-graphics as well).



  *   Finally I launched make test. As some tests failed, I replicated the same 
install procedure on all the systems I have available on this same Windows 
machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 
under WSL1 and the MSYS2-MINGW64 toolchain). I am attaching a file with the 
results printed to screen (not sure about which file should be used for a 
comparison/check). Note, however, that the tests in MSYS2 started with some 
cyclic reference issues for some .mod files, but this doesn’t show up in any 
file I could check.

I am still left with some doubts about the archiver, the cyclic reference 
errors and the differences in the test results, but I am able to link my code 
with PETSc. Unfortunately, as this Windows port is part of a larger code 
restructuring, I can’t do much more with it from my code right now. But if you 
can suggest some specific tutorial to use as a test, also in parallel, I would 
be glad to dig deeper into the matter.

Best regards

Paolo

Sent from Mail for Windows 10

From: Pierre Jolivet
Sent: Tuesday, 30 June 2020 15:22
To: Paolo Lampitella
Cc: Matthew Knepley; petsc-users
Subject: Re: [petsc-users] PETSc and Windows 10

Please use the 3.13.2 tarball; this was fixed by Satish in the previous commit 
I already linked 
(https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523).
(If you want FreeFEM to do the dirty work for you, just switch to the develop 
branch, and redo “make petsc-slepc”)
But I think you’ve got everything you need now for a smooth compilation :)

Thanks,
Pierre


On 30 Jun 2020, at 3:09 PM, Paolo Lampitella <paololampite...@hotmail.com> wrote:

Dear Pierre,

thanks for the fast response. Unfortunately, it still fails, but now in the 
configure of ScaLAPACK 
(which means that it went OK for slepc, tetgen, metis, parmetis, ptscotch, 
superlu and suitesparse).

The