Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Satish Balay via petsc-users
Sounds like there are different MinGW tools and MSYS2 tools.

So I guess one could use the MinGW compilers even from Cygwin [using Cygwin tools]
- i.e., the MinGW compilers don't really need the MSYS2 tools to work.
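
Which toolchain you get just depends on the shell's PATH - a quick check,
assuming a default MSYS2 layout:

  # from the MSYS shell: the Cygwin-like msys2 tools come first
  $ which gcc ar
  /usr/bin/gcc
  /usr/bin/ar
  # from the MINGW64 shell: the native mingw-w64 tools come first
  $ which gcc ar
  /mingw64/bin/gcc
  /mingw64/bin/ar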

Satish


Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Paolo Lampitella
Unfortunately, even PETSC_ARCH=i didn't work out. And while
--with-single-library=0 wasn't really appealing to me, it worked, but only to
later fail on make test.

I guess all these differences are due to the Fortran bindings and/or gcc 10.

However, until I discover how they are different, I guess I'll be fine with
/usr/bin/ar.

Paolo




Re: [petsc-users] PETSc and Windows 10

2020-07-05 Thread Paolo Lampitella
Thank you very much Pierre.

I'll keep you informed in case I see any relevant change from the tests when 
using your suggestion.

Paolo






 Original message 
From: Pierre Jolivet
Date: 05/07/20 13:45 (GMT+01:00)
To: Paolo Lampitella
Cc: Matthew Knepley, petsc-users
Subject: Re: [petsc-users] PETSc and Windows 10

Hello Paolo,

On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampite...@hotmail.com> wrote:

Dear all,

I just want to update you on my journey to PETSc compilation on Windows under
MSYS2+MINGW64.

Unfortunately, I haven't been able to compile petsc-slepc through FreeFEM but,
as my final goal also required the Fortran bindings (though I only needed BLAS,
LAPACK, METIS and hypre), I decided to follow my own route using the useful
information from Pierre.


  *   I started by installing MPI from
https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don't think
the SDK is actually needed in my specific workflow, but I installed it as well,
together with mpisetup.
  *   Then I installed MSYS2, just following the wizard. I opened the MSYS2
terminal and updated with pacman -Syuu, closed it if asked, reopened it and ran
pacman -Syuu again several times until no more updates were available, then
closed it and opened it back.
  *   In the MSYS2 terminal I installed just the following packages:



 *   pacman -S base-devel git gcc gcc-fortran
 *   pacman -S mingw-w64-x86_64-toolchain
 *   pacman -S mingw-w64-x86_64-cmake
 *   pacman -S mingw-w64-x86_64-msmpi



  *   Closed the MSYS2 terminal and opened the MINGW64 one, went to 
/mingw64/include and compiled my mpi module following 
https://www.scivision.dev/windows-mpi-msys2/:



 *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz


However, I will keep an eye on the MS-MPI GitHub repository because the Fortran
side seems to be far from perfect.
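
A minimal smoke test along these lines (file and program names are just
examples; the link line assumes the msmpi import library in /mingw64/lib) can
confirm the module and MS-MPI work together:

cat > hello_mpi.f90 <<'EOF'
program hello_mpi
  use mpi                    ! the module compiled above
  implicit none
  integer :: ierr, rank
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  print *, 'hello from rank', rank
  call MPI_Finalize(ierr)
end program hello_mpi
EOF
gfortran hello_mpi.f90 -I/mingw64/include -lmsmpi -o hello_mpi
'/C/Program Files/Microsoft MPI/Bin/mpiexec' -n 2 ./hello_mpi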


  *   Then I downloaded the 3.13.3 version of PETSc and configured it, still
under the MINGW64 terminal, with the following command:


/usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar \
  --with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 \
  --with-x=0 \
  COPTFLAGS="-O3 -mtune=native" \
  CXXOPTFLAGS="-O3 -mtune=native" \
  FOPTFLAGS="-O3 -mtune=native" \
  FFLAGS=-fallow-invalid-boz \
  --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec" \
  --download-fblaslapack --download-metis --download-hypre \
  --download-metis-cmake-arguments='-G "MSYS Makefiles"' \
  --download-hypre-configure-arguments="--build=x86_64-linux-gnu --host=x86_64-linux-gnu"

Note that I just bypassed uninstalling python in mingw64 (which doesn't work)
by using /usr/bin/python, and that, as opposed to Pierre, I also needed to use
the MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar,
which shows up in Pierre's configure), as also mentioned here
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html,
probably because of this issue
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

You are right that you can avoid uninstalling mingw-w64-x86_64-python if you
can supply the proper Python yourself (we don't have that luxury in our
Makefile).
If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what 
the pros and cons are), you can either:
- use another PETSC_ARCH (very short, like pw, for petsc-windows);
- use --with-single-library=0.
See this post on GitLab 
https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
The OS I’m referring to is indeed my Windows + MSYS2 box.
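
A sketch of the first option (everything else in your configure line stays the
same):

export PETSC_ARCH=pw   # short arch name, e.g. "pw" for petsc-windows
/usr/bin/python ./configure --prefix=/home/paolo/petsc ...   # rest as before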

Thanks,
Pierre

Then make all, make install and make check all went smoothly. Also, I don't know
exactly what --with-x=0 and --with-windows-graphics=0 do, but I think it is stuff
that I don't need (yet configure worked with windows-graphics as well).


  *   Finally I launched make test. As some tests failed, I replicated the same
install procedure on all the systems I have available on this same Windows
machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04
under WSL1, and the MSYS2-MINGW64 toolchain). I am attaching a file with the
results printed to screen (not sure which file should be used for a
comparison/check). Note, however, that the tests in MSYS2 started with some
cyclic reference issues for some .mod files, but this doesn't show up in any
file I could check.


I am still left with some doubts about the archiver, the cyclic reference
errors and the differences in the test results, but I am able to link my code
with PETSc. Unfortunately, as this Windows porting is part of a large code
restructuring, I can't do much more with it from my code right now. But if you
can suggest some specific tutorial to use as a test, also in parallel, I would be
glad to dig deeper into the matter.
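
For a quick parallel check, one candidate (directory layout as in the 3.13
source tree - an assumption; adjust paths as needed) is the Fortran SNES
tutorial ex5f:

cd $PETSC_DIR/src/snes/examples/tutorials    # assumes PETSC_DIR/PETSC_ARCH are set
make ex5f
'/C/Program Files/Microsoft MPI/Bin/mpiexec' -n 4 ./ex5f -snes_monitor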

Best regards

Paolo


Re: [petsc-users] PETSc and Windows 10

2020-06-30 Thread Pierre Jolivet
Please use the 3.13.2 tarball, this was fixed by Satish in the previous commit
I already linked
(https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523).
(If you want FreeFEM to do the dirty work for you, just switch to the develop 
branch, and redo “make petsc-slepc”)
But I think you’ve got everything you need now for a smooth compilation :)

Thanks,
Pierre

> On 30 Jun 2020, at 3:09 PM, Paolo Lampitella <paololampite...@hotmail.com>
> wrote:
> 
> Dear Pierre,
>  
> thanks for the fast response. Unfortunately it still fails, but now in the 
> configure of ScaLAPACK
> (which means that it went ok for slepc, tetgen, metis, parmetis, ptscotch, 
> superlu and suitesparse).
>  
> The way I applied the modification is by manually editing the Makefile in the
> 3rdparty/ff-petsc folder, adding -fallow-invalid-boz to both CFLAGS and
> FFLAGS (this entry added by me), then executing make petsc-slepc.
>  
> As my project is much less ambitious, I have a good feeling that I will be
> able to use your Makefile successfully, but as I am kind of slow I thought
> that it would be useful for you to know. The configure.log is
> attached. This time the error is:
>  
> Rank mismatch between actual argument at (1) and actual argument at (2) 
> (scalar and rank-1)
>  
> in subroutine pclarf.f of ScaLAPACK.
>  
> However, before attempting with my project, I have a few questions about your
> Makefile, in particular this piece:
>  
> --with-mpi-lib=/c/Windows/System32/msmpi.dll 
> --with-mpi-include=/home/paolo/FreeFem-sources/3rdparty/include/msmpi 
> --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
>  
> I see from MPI.py that I should not use ‘--with-mpi-lib/include’ if I want to 
> use my now working mpi wrappers. Is this correct?
>  
> Paolo 
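
For reference, the two styles MPI.py distinguishes look roughly like this (a
sketch with the remaining options elided, not a verified command):

# 1) explicit library + headers, as in the FreeFEM Makefile quoted above
./configure --with-mpi-lib=/c/Windows/System32/msmpi.dll \
    --with-mpi-include=/home/paolo/FreeFem-sources/3rdparty/include/msmpi ...
# 2) MPI compiler wrappers, when such wrappers exist
./configure --with-cc=mpicc --with-fc=mpif90 ...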
>  

Re: [petsc-users] PETSc and Windows 10

2020-06-29 Thread Pierre Jolivet


> On 29 Jun 2020, at 9:37 PM, Pierre Jolivet wrote:
> 
> [...]
> 
> I'll push this added flag to the FreeFEM repo

Sorry for the noise, but maybe it's better to put this in PETSc ./configure,
like you did here, Satish:
https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523 ?
If gfortran 10 && MS-MPI, then FFLAGS += "-fallow-invalid-boz"
WDY(PETSc-)GT?

Thanks,
Pierre
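
Expressed as shell, the proposed check would look roughly like this (a sketch,
not actual ./configure code):

GFORTRAN_MAJOR=$(gfortran -dumpversion | cut -d. -f1)
if [ "$GFORTRAN_MAJOR" -ge 10 ]; then
  FFLAGS="$FFLAGS -fallow-invalid-boz"   # only gfortran >= 10 knows/needs the flag
fi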


Re: [petsc-users] PETSc and Windows 10

2020-06-29 Thread Pierre Jolivet
I do not give up easily on Windows problems:
1) that’s around 50% of our (FreeFEM) user-base (and I want them to use PETSc 
and SLEPc, ofc…)
2) most people I work with from corporations just have Windows laptops/desktops 
and I always recommend MSYS because it’s very lightweight and you can pass .exe 
around
3) I’ve bothered enough Satish, Jed, and Matt on GitLab to take (at least 
partially) the blame now when it doesn’t work on MSYS

That being said, the magic keyword is the added flag
FFLAGS="-fallow-invalid-boz" (see, I told you ./configure issues were easier to
deal with than the others).
Here you'll see that everything goes through just fine (sorry, it took me a
long time to post this because everything is slow on my VM):
1) http://jolivet.perso.enseeiht.fr/win10/configure.log
2) http://jolivet.perso.enseeiht.fr/win10/make.log (both steps #1 and #2 in MSYS
terminal, gcc/gfortran 10, MS-MPI, see screenshot)
3) http://jolivet.perso.enseeiht.fr/win10/ex2.txt (Command Prompt, 4 processes +
MUMPS; I can send you the .exe if you want to try on your machine)
I just realized that I didn't generate the Fortran bindings, but you can see I
compiled MUMPS and ScaLAPACK, so that shouldn't be a problem.
Or if there is a problem, we will need to fix this in PETSc.

I'll push this added flag to the FreeFEM repo, thanks for reminding me of the
brokenness of gcc/gfortran 10 + MS-MPI.
Here's hoping this won't affect PETSc ./configure with previous gcc/gfortran
versions (unlikely, this option is apparently 13 years old:
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=29471)

Let me know of the next hiccup, if any.
Thanks,
Pierre

> On 29 Jun 2020, at 8:09 PM, Paolo Lampitella <paololampite...@hotmail.com>
> wrote:
> 
> Dear Pierre,
> 
> thanks again for your time.
> 
> I guess there is no way for me to use the toolchain you are using (I don't
> remember having any choice on which version of MSYS or GCC I could install)
> 
> Paolo
>  
> From: Pierre Jolivet <pierre.joli...@enseeiht.fr>
> Sent: Monday, June 29, 2020 20:01
> To: Matthew Knepley <knep...@gmail.com>
> Cc: Paolo Lampitella <paololampite...@hotmail.com>; petsc-users
> <petsc-users@mcs.anl.gov>
> Subject: Re: [petsc-users] PETSc and Windows 10
>  
>  
> 
> 
> On 29 Jun 2020, at 7:47 PM, Matthew Knepley <knep...@gmail.com> wrote:
> 
> On Mon, Jun 29, 2020 at 1:35 PM Paolo Lampitella
> <paololampite...@hotmail.com> wrote:
> Dear Pierre, sorry to bother you, but I already have some issues. What I did:
>  
> pacman -R mingw-w64-x86_64-python mingw-w64-x86_64-gdb (is gdb also
> troublesome?)
> Followed points 6 and 7 at
> https://doc.freefem.org/introduction/installation.html#compilation-on-windows
> I first got a warning on the configure at point 6, as --disable-hips is not
> recognized. Then, on make 'petsc-slepc' of point 7 (no SUDO=sudo flag was
> necessary) I got to this point:
>  
> tar xzf ../pkg/petsc-lite-3.13.0.tar.gz
> patch -p1 < petsc-suitesparse.patch
> patching file petsc-3.13.0/config/BuildSystem/config/packages/SuiteSparse.py
> touch petsc-3.13.0/tag-tar
> cd petsc-3.13.0 && ./configure MAKEFLAGS='' \
> --prefix=/home/paolo/freefem/ff-petsc//r \
> --with-debugging=0 COPTFLAGS='-O3 -mtune=generic' CXXOPTFLAGS='-O3 
> -mtune=generic' FOPTFLAGS='-O3 -mtune=generic' --with-cxx-dialect=C++11 
> --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-shared-libraries=0 
> --with-cc='gcc' --with-cxx='g++' --with-fc='gfortran' 
> CXXFLAGS='-fno-stack-protector' CFLAGS='-fno-stack-protector' 
> --with-scalar-type=real --with-mpi-lib='/c/Windows/System32/msmpi.dll' 
> --with-mpi-include='/home/paolo/FreeFem-sources/3rdparty/include/msmpi' 
> --with-mpiexec='/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec' 
> --with-blaslapack-include='' 
> --with-blaslapack-lib='/mingw64/bin/libopenblas.dll' --download-scalapack 
> --download-metis --download-ptscotch --download-mumps --download-hypre 
> --download-parmetis --download-superlu --download-suitesparse 
> --download-tetgen --download-slepc '--download-metis-cmake-arguments=-G "MSYS 
> Makefiles"' '--download-parmetis-cmake-arguments=-G "MSYS Makefiles"' 
> '--download-superlu-cmake-arguments=-G "MSYS Makefiles"' 
> '--download-hypre-configure-arguments=--build=x86_64-linux-gnu 
> -

Re: [petsc-users] PETSc and Windows 10

2020-06-29 Thread Pierre Jolivet
> Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an 
> actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see 
> '-fno-allow-invalid-boz']
> C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:303:27:
> 
>   303 |PARAMETER (MPI_CHAR=z'4c000101')
>   |   1
> Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an 
> actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see 
> '-fno-allow-invalid-boz']
> C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:305:36:
> 
>   305 |PARAMETER (MPI_UNSIGNED_CHAR=z'4c000102')
>   |1
> 
>   Thanks,
> 
>  Matt
>  
> Thanks
> 
>  
> 
> Paolo
> 
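
A minimal standalone reproducer of those errors (hypothetical file name,
mirroring the mpif.h lines quoted above):

cat > boz.f90 <<'EOF'
integer MPI_CHAR
parameter (MPI_CHAR=z'4c000101')   ! BOZ constant outside the allowed contexts
end
EOF
gfortran -c boz.f90                        # gfortran 10: hard error, as above
gfortran -c boz.f90 -fallow-invalid-boz    # downgraded to a warning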

Re: [petsc-users] PETSc and Windows 10

2020-06-29 Thread Pierre Jolivet


> On 29 Jun 2020, at 6:27 PM, Paolo Lampitella <paololampite...@hotmail.com>
> wrote:
> 
> I think I made the first step of having mingw64 from msys2 working with 
> ms-mpi.
>  
> I found that the issue I was having was related to:
>  
> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91556 
> 
>  
> and, probably (but impossible to check now), I was using an msys2 and/or 
> mingw mpi package before this fix:
>  
> https://github.com/msys2/MINGW-packages/commit/11b4cff3d2ec7411037b692b0ad5a9f3e9b9978d#diff-eac59989e3096be97d940c8f47b50fba
>  
> 
>  
> Admittedly, I never used gcc 10 before on any machine. Still, I feel that 
> reporting that sort of error in that way is,
> at least, misleading (I would have preferred the initial implementation as 
> mentioned in the gcc bug track).
>  
> A second thing that I was not used to, and which made me more uncertain of the
> procedure I was following, is having to compile the mpi module myself. There
> are several versions of this out there, but I decided to stick with this one:
>  
> https://www.scivision.dev/windows-mpi-msys2/ 
> 
>  
> even if there seems to be no need to include -fno-range-check and the current 
> mpi.f90 version is different from the mpif.h as reported here:
>  
> https://github.com/microsoft/Microsoft-MPI/issues/33 
> 
>  
> which, to me, are both signs of a lack of attention on the Fortran side by
> those who maintain this thing.
>  
> In summary, this is the procedure I followed so far (on a 64-bit machine with
> Windows 10):
>  
> Install MSYS2 from https://www.msys2.org/ and just
> follow the install wizard
> Open the MSYS2 terminal and execute: pacman -Syuu
> Close the terminal when asked and reopen it
> Keep executing 'pacman -Syuu' until nothing else needs to be updated
> Close the MSYS2 terminal and reopen it (I guess because I was in paranoid
> mode), then install packages with:
>  
> pacman -S base-devel git gcc gcc-fortran bsdcpio lndir pax-git unzip
> pacman -S mingw-w64-x86_64-toolchain
> pacman -S mingw-w64-x86_64-msmpi
> pacman -S mingw-w64-x86_64-cmake
> pacman -S mingw-w64-x86_64-freeglut
> pacman -S mingw-w64-x86_64-gsl
> pacman -S mingw-w64-x86_64-libmicroutils
> pacman -S mingw-w64-x86_64-hdf5
> pacman -S mingw-w64-x86_64-openblas
> pacman -S mingw-w64-x86_64-arpack
> pacman -S mingw-w64-x86_64-jq
>  
> This set should include all the libraries mentioned by Pierre and/or used by
> his Jenkins, as the final scope here is to have PETSc and its dependencies
> working. But I think that for pure MPI one could stop at msmpi (even, maybe,
> just install msmpi and have the dependencies figured out by pacman).
> Honestly, I don't remember the exact order I used to install the packages,
> but this should not affect things. Also, as I was still in paranoid mode, I
> kept executing 'pacman -Syuu' after each package was installed. After this,
> close the MSYS2 terminal.
>  
> Open the MINGW64 terminal and create the .mod file out of the mpi.f90 file,
> as mentioned here https://www.scivision.dev/windows-mpi-msys2/, with:
>  
> cd /mingw64/include
> gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz

Ah, yes, that's new to gfortran 10 (we use gfortran 9 on our workers), which is
now what ships with MSYS2 (we haven't updated yet). Sorry that I forgot about
that.

> This is needed to ‘USE mpi’ (as opposed to INCLUDE ‘mpif.h’)
>  
> Install the latest MS-MPI (both SDK and setup) from
> https://www.microsoft.com/en-us/download/details.aspx?id=100593
>  
> At this point I’ve been able to compile (using the MINGW64 terminal) 
> different mpi test programs and they run as expected in the classical Windows 
> prompt. I added this function to my .bashrc in MSYS2 in order to easily copy 
> the required dependencies out of MSYS:
>  
> function copydep() { ldd $1 | grep "=> /$2" | awk '{print $3}' | xargs -I 
> '{}' cp -v '{}' .; }
>  
> which can be used, in the MINGW64 terminal, by navigating to the folder
> where the final executable, say, my.exe, resides (even if under a Windows
> path) and executing:
> 
> copydep my.exe mingw64
> 
> This, of course, must be done before actually trying to execute the .exe in
> the Windows cmd prompt.
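
For reference, a preview of what that will copy (my.exe is just an example
name):

ldd my.exe | grep '=> /mingw64'   # exactly the lines the copydep function acts on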
>  
> Hopefully, I should now be able to follow Pierre's instructions for PETSc
> (but first I want to give the system python a try before removing it)

Looks like the hard part is over. It's usually easier to deal with ./configure
issues.
If you have weird errors like "incomplete Cygwin install" or whatever, these are
the kind of issues I was referring to earlier.
In that case, what I'd suggest is just, as
In that case, what I’d suggest is just, as 

Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Paolo Lampitella
Not sure if I did the same. I first made the petsc install in a folder in my
cygwin home and then copied the mpich executables, their .dll dependencies, my
executable and its dependencies in a folder on my desktop.

Then I went there with both terminals (cygwin and windows) and launched using
mpiexec.hydra.exe in the folder (noting the difference between using ./ and .\
prepended to both executables in the two terminals).

With the cygwin terminal things worked as expected. It is kind of premature now
for testing performance, but I feel that some compromise here can be admitted,
considering the different constraints. I didn't pay too much attention in this
phase, but I haven't seen anything suspiciously slow either (the point is that I
don't have a native linux install now to make a meaningful comparison).

However, running from the Windows terminal, things worked differently for me.
It seems that it worked, but I had to give a second Enter hit... maybe I'm
missing something behind the lines.

I still have to recompile with OpenMPI to have a meaningful comparison

Thanks

Paolo





Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Satish Balay via petsc-users
On Sun, 28 Jun 2020, Satish Balay via petsc-users wrote:

> On Sun, 28 Jun 2020, Paolo Lampitella wrote:

> >  *   For my Cygwin-GNU route (basically what is mentioned in PFLOTRAN 
> > documentation), am I expected to then run from the cygwin terminal or 
> > should the windows prompt work as well? Is the fact that I require a second 
> > Enter hit and the mismanagement of serial executables the sign of something 
> > wrong with the Windows prompt?
> 
> I would think Cygwin-GNU route should work. I'll have to see if I can 
> reproduce the issues you have.

I attempted a couple of builds - one with mpich and the other with 
cygwin-openmpi

mpich compiled petsc example works sequentially - however mpiexec appears to 
require cygwin env.


C:\petsc-install\bin>ex5f
Number of SNES iterations = 4

C:\petsc-install\bin>mpiexec -n 1 ex5f
[cli_0]: write_line error; fd=448 buf=:cmd=init pmi_version=1 pmi_subversion=1
:
system msg for write_line failure : Bad file descriptor
[cli_0]: Unable to write to PMI_fd
[cli_0]: write_line error; fd=448 buf=:cmd=get_appnum
:
system msg for write_line failure : Bad file descriptor
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(467):
MPID_Init(140)...: channel initialization failed
MPID_Init(421)...: PMI_Get_appnum returned -1
[cli_0]: aborting job:
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(467):
MPID_Init(140)...: channel initialization failed
MPID_Init(421)...: PMI_Get_appnum returned -1

C:\petsc-install\bin>
<<

cygwin-openmpi compiled petsc example binary gives errors even for sequential 
run


C:\Users\balay\test>ex5f
Warning: '/dev/shm' does not exists or is not a directory.

POSIX shared memory objects require the existance of this directory.
Create the directory '/dev/shm' and set the permissions to 01777.
For instance on the command line: mkdir -m 01777 /dev/shm
[ps5:00560] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable 
either could not be found or was not executable by this user in file 
/cygdrive/d/cyg_pub/devel/openmpi/v3.1/prova/openmpi-3.1.6-1.x86_64/src/openmpi-3.1.6/orte/mca/ess/singleton/ess_singleton_module.c
 at line 388
[ps5:00560] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable 
either could not be found or was not executable by this user in file 
/cygdrive/d/cyg_pub/devel/openmpi/v3.1/prova/openmpi-3.1.6-1.x86_64/src/openmpi-3.1.6/orte/mca/ess/singleton/ess_singleton_module.c
 at line 166
--
Sorry!  You were supposed to get help about:
orte_init:startup:internal-failure
But I couldn't open the help file:
/usr/share/openmpi/help-orte-runtime: No such file or directory.  Sorry!
--
<<<

So it looks like you would need cygwin installed to run Cygwin-MPI binaries.
Also, I don't know how the cygwin/windows interaction overhead will affect
parallel performance.
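
One way to confirm that runtime dependency from a Cygwin shell (a sketch, with
ex5f.exe as the example binary):

ldd ./ex5f.exe | grep -i cyg   # cygwin1.dll etc. => the Cygwin runtime is required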

Satish


Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Satish Balay via petsc-users
To clarify - I would like to have a petsc test with MSYS2 working.

Previously I was lost with MSYS2 - so I stuck with what I'm familiar with.
However, with your instructions - I'm hoping to make progress.

What I object to are suggestions (that come up regularly) of replacing what we 
currently have with:
- WSL2
- MSYS2

Satish


Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Pierre Jolivet
Hello Paolo,

> On 28 Jun 2020, at 4:19 PM, Satish Balay <ba...@mcs.anl.gov> wrote:
> 
> On Sun, 28 Jun 2020, Paolo Lampitella wrote:
> 
>>  1.  MSYS2+MinGW64 compilers. I understood that MinGW is not well supported,
>> probably because of how it handles paths, but I wanted to give it a try,
>> because it should be more "native" and there seem to be relevant examples
>> out there that managed to do it. I first tried with the msys2 mpi
>> distribution, produced the .mod file out of the mpi.f90 file in the
>> distribution (I tried my best with different hacks from known limitations of
>> this file, as also present in the official MS-MPI distribution) and tried
>> with my code without petsc, but it failed in compiling the code with some
>> strange MPI-related error (argument mismatch between two unrelated MPI calls
>> in the code, which is nonsense to me). In contrast, simple mpi tests (hello
>> world like) worked as expected. Then I decided to follow this:
>> 
>> 
>> 
>> https://doc.freefem.org/introduction/installation.html#compilation-on-windows
>> 

Sorry, our (FreeFEM) documentation is not the best…

MSYS2+MinGW64 is a fantastic tool to deploy .exe with PETSc.
For example, in this .exe
https://github.com/FreeFem/FreeFem-sources/releases/download/v4.6/FreeFEM-4.6-win7-64.exe,
we ship PETSc + SLEPc (in real + complex) with MS-MPI, hypre, MUMPS,
ScaLAPACK, SuperLU, SuiteSparse, ParMETIS, METIS, SCOTCH, TetGen, HPDDM, all
compiled by PETSc, needless to say :)
There are some tricks that you can copy/paste from
https://github.com/FreeFem/FreeFem-sources/blob/master/3rdparty/ff-petsc/Makefile#L99-L120

Basically, hypre + MinGW64 does not work if you don’t supply 
'--download-hypre-configure-arguments=--build=x86_64-linux-gnu 
--host=x86_64-linux-gnu' and all CMake packages need an additional flag as well:
'--download-metis-cmake-arguments=-G "MSYS Makefiles"' 
'--download-parmetis-cmake-arguments=-G "MSYS Makefiles"' 
'--download-superlu-cmake-arguments=-G "MSYS Makefiles"'
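
Put together, that corner of a MinGW64 ./configure line looks roughly like this
(a sketch; compilers, MPI and the remaining options are elided):

./configure --download-metis '--download-metis-cmake-arguments=-G "MSYS Makefiles"' \
    --download-hypre '--download-hypre-configure-arguments=--build=x86_64-linux-gnu --host=x86_64-linux-gnu' \
    ...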

This is tested on a daily basis on Windows 7 and Windows 10, so I’m a little 
puzzled by your MPI problems.
I'd suggest you stick to MS-MPI (that's what we use and it's trivial to install
on MSYS2: https://packages.msys2.org/package/mingw-w64-x86_64-msmpi).

I’m not sure this is specific to PETSc, so feel free to have a chat in private.
But I guess we can continue on the mailing list as well, it’s just that there 
is not much love for MSYS2 over here, sadly.

Thanks,
Pierre

>> 
>> but the exact same type of error came up (MPI calls in my code were 
>> different, but the error was the same). Trying again from scratch (i.e., 
>> without all the things I did in the beginning to compile my code), the same 
>> error came up while compiling some of the freefem dependencies (this time not 
>> even in MPI calls).
>> 
>> 
>> 
>> As a side note, there seems to be an official effort to port petsc to 
>> msys2 
>> (https://github.com/okhlybov/MINGW-packages/tree/whpc/mingw-w64-petsc), but 
>> it hasn’t made it into the official packages yet, which I interpret as a warning
>> 
>> 
>> 
>>  4.  Didn’t try cross-compiling with MinGW from Linux, as I thought 
>> it couldn’t be any better than doing it from MSYS2
>>  5.  Didn’t try PGI as I actually didn’t know if I would then be able to 
>> make PETSc work.
>> 
>> So, here are some questions I have with respect to where I stand now 
>> and the points above:
>> 
>> 
>> *   I haven’t seen the MSYS2-MinGw64 toolchain mentioned at all in 
>> official documentation/discussions. Should I definitely abandon it (despite 
>> someone mentioning it as working) because of known issues?
> 
> I don't have experience with MSYS2-MinGw64; however, Pierre does - and perhaps 
> can comment on this. I don't know how things work on the fortran side.
> 
>> *   What about the PGI route? I don’t see it mentioned either. I guess 
>> it would require some work on win32fe
> 
> Again - no experience here.
> 
>> *   For my Cygwin-GNU route (basically what is mentioned in PFLOTRAN 
>> documentation), am I expected to then run from the cygwin terminal or should 
>> the windows prompt work as well? Are the need to hit Enter a second time 
>> and the mishandling of serial executables signs of something wrong 
>> with the Windows prompt?
> 
> I would think the Cygwin-GNU route should work. I'll have to see if I can 
> reproduce the issues you have.
> 
> Satish
> 
>> *   More generally, is there some known working, albeit unofficial, 
>> route given my constraints (free+fortran+windows+mpi+petsc)?
>> 
>> Thanks for your attention and your great work on PETSc
>> 
>> Best regards
>> 
>> Paolo Lampitella



Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Satish Balay via petsc-users
BTW: How does redistributing the MPI runtime work with all the choices you have?

For ex: with MS-MPI, Intel-MPI - wouldn't the user have to install these 
packages? [i.e. you can't just copy them over to a folder and have mpiexec work 
- from what I can tell]

And how did you plan on installing MPICH - but make mpiexec from OpenMPI 
redistributable? Did you use OpenMPI from cygwin - or install it manually?

And presumably you don't want users installing cygwin.

Satish

On Sun, 28 Jun 2020, Satish Balay via petsc-users wrote:

> On Sun, 28 Jun 2020, Paolo Lampitella wrote:
> 
> > Dear PETSc users,
> > 
> > I’ve been a happy PETSc user since version 3.3, using it both under Ubuntu 
> > (from 14.04 up to 20.04) and CentOS (from 5 to 8).
> > 
> > I use it as an optional component for a parallel Fortran code (that, BTW, 
> > also uses metis) and, wherever allowed, I used to install MPI myself (both 
> > MPICH and OpenMPI) and PETSc on top of it without any trouble ever (besides 
> > being, myself, as dumb as one can be in this).
> > 
> > I did this on top of gnu compilers and, less extensively, intel compilers, 
> > both on a range of different systems (from virtual machines, to 
> > workstations to actual clusters).
> > 
> > So far so good.
> > 
> > Today I find myself in need of deploying my application to Windows 10 
> > users, which means giving them a folder with all the executables and 
> > libraries to make them run in it, including the mpi runtime. Unfortunately, 
> > I also have to rely on free tools (can’t afford Intel for the moment).
> > 
> > To the best of my knowledge, considering also far from optimal solutions, 
> > my options would then be: Virtual machines and WSL1, Cygwin, MSYS2-MinGW64, 
> > Cross compiling with MinGW64 from within Linux, PGI + Visual Studio + 
> > Cygwin (not sure about this one)
> > 
> > I know this is largely unsupported, but I was wondering if there is, 
> > nonetheless, some general (and more official) knowledge available on the 
> > matter. What I tried so far:
> > 
> > 
> >   1.  Virtual machines and WSL1: both work like a charm, just like in the 
> > native OS, but very far from ideal for the distribution purpose
> > 
> > 
> >   2.  Cygwin with gnu compilers (as opposed to using Intel and Visual 
> > Studio): I was unable to compile MPI myself as I am used to doing on Linux, so I 
> > just tried going all in and let PETSc do everything for me (using static 
> > linking): download and install MPICH, BLAS, LAPACK, METIS and HYPRE. 
> > Everything just worked (for now compiling and making trivial tests) and I 
> > am able to use everything from within a cygwin terminal (even with 
> > executables and dependencies outside cygwin). Still, even within cygwin, I 
> > can’t switch to use, say, the cygwin ompi mpirun/mpiexec for an mpi program 
> > compiled with PETSc mpich (things run but not as expected). Some troubles 
> > start when I try to use cmd.exe (which I pictured as the more natural way 
> > to launch in Windows). In particular, using (note that \ is in cmd.exe, / 
> > was used in cygwin terminal):
> 
> I don't understand. Why build with MPICH - but use mpiexec from OpenMPI?
> 
> If it is because you can easily redistribute OpenMPI - why not build PETSc 
> with OpenMPI?
> 
> You can't use Intel/MS-MPI from cygwin/gcc/gfortran
> 
> Also - even though --download-mpich works with cygwin/gcc - it's no longer 
> supported on windows [by the MPICH group].
> 
> > 
> > .\mpiexec.hydra.exe -np 8 .\my.exe
> > 
> > Nothing happens unless I push Enter a second time. Things seem to work 
> > then, but if I try to run a serial executable with the command above I get 
> > the following errors (which, instead, don't happen when using the cygwin 
> > terminal):
> > 
> > [proxy:0:0@Dell7540-Paolo] HYDU_sock_write (utils/sock/sock.c:286): write 
> > error (No such process)
> > [proxy:0:0@Dell7540-Paolo] HYD_pmcd_pmip_control_cmd_cb 
> > (pm/pmiserv/pmip_cb.c:935): unable to write to downstream stdin
> > [proxy:0:0@Dell7540-Paolo] HYDT_dmxu_poll_wait_for_event 
> > (tools/demux/demux_poll.c:76): callback returned error status
> > [proxy:0:0@Dell7540-Paolo] main (pm/pmiserv/pmip.c:206): demux engine error 
> > waiting for event
> > [mpiexec@Dell7540-Paolo] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert 
> > (!closed) failed
> > [mpiexec@Dell7540-Paolo] HYDT_dmxu_poll_wait_for_event 
> > (tools/demux/demux_poll.c:76): callback returned error status
> > [mpiexec@Dell7540-Paolo] HYD_pmci_wait_for_completion 
> > (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event
> > [mpiexec@Dell7540-Paolo] main (ui/mpich/mpiexec.c:336): process manager 
> > error waiting for completion
> > 
> > Just for the sake of completeness, I also tried using the Intel and 
> > Microsoft MPI redistributables, which might be more natural candidates, 
> > instead of the petsc-compiled version of the MPI runtime (and they are 
> > MPICH derivatives, after all). But, running with:
> > 
> > mpiexec -np 1 my.exe
> > 
> > I 

Re: [petsc-users] PETSc and Windows 10

2020-06-28 Thread Satish Balay via petsc-users
On Sun, 28 Jun 2020, Paolo Lampitella wrote:

> Dear PETSc users,
> 
> I’ve been a happy PETSc user since version 3.3, using it both under Ubuntu 
> (from 14.04 up to 20.04) and CentOS (from 5 to 8).
> 
> I use it as an optional component for a parallel Fortran code (that, BTW, 
> also uses metis) and, wherever allowed, I used to install MPI myself (both 
> MPICH and OpenMPI) and PETSc on top of it without any trouble ever (besides 
> being, myself, as dumb as one can be in this).
> 
> I did this on top of gnu compilers and, less extensively, intel compilers, 
> both on a range of different systems (from virtual machines, to workstations 
> to actual clusters).
> 
> So far so good.
> 
> Today I find myself in need of deploying my application to Windows 10 
> users, which means giving them a folder with all the executables and 
> libraries to make them run in it, including the mpi runtime. Unfortunately, I 
> also have to rely on free tools (can’t afford Intel for the moment).
> 
> To the best of my knowledge, considering also far from optimal solutions, my 
> options would then be: Virtual machines and WSL1, Cygwin, MSYS2-MinGW64, 
> Cross compiling with MinGW64 from within Linux, PGI + Visual Studio + Cygwin 
> (not sure about this one)
> 
> I know this is largely unsupported, but I was wondering if there is, 
> nonetheless, some general (and more official) knowledge available on the 
> matter. What I tried so far:
> 
> 
>   1.  Virtual machines and WSL1: both work like a charm, just like in the 
> native OS, but very far from ideal for the distribution purpose
> 
> 
>   2.  Cygwin with gnu compilers (as opposed to using Intel and Visual 
> Studio): I was unable to compile MPI myself as I am used to doing on Linux, so I 
> just tried going all in and let PETSc do everything for me (using static 
> linking): download and install MPICH, BLAS, LAPACK, METIS and HYPRE. 
> Everything just worked (for now compiling and making trivial tests) and I am 
> able to use everything from within a cygwin terminal (even with executables 
> and dependencies outside cygwin). Still, even within cygwin, I can’t switch 
> to use, say, the cygwin ompi mpirun/mpiexec for an mpi program compiled with 
> PETSc mpich (things run but not as expected). Some troubles start when I try 
> to use cmd.exe (which I pictured as the more natural way to launch in 
> Windows). In particular, using (note that \ is in cmd.exe, / was used in 
> cygwin terminal):

I don't understand. Why build with MPICH - but use mpiexec from OpenMPI?

If it is because you can easily redistribute OpenMPI - why not build PETSc with 
OpenMPI?

You can't use Intel/MS-MPI from cygwin/gcc/gfortran

Also - even though --download-mpich works with cygwin/gcc - it's no longer 
supported on windows [by the MPICH group].

> 
> .\mpiexec.hydra.exe -np 8 .\my.exe
> 
> Nothing happens unless I push Enter a second time. Things seem to work then, 
> but if I try to run a serial executable with the command above I get the 
> following errors (which, instead, don't happen when using the cygwin terminal):
> 
> [proxy:0:0@Dell7540-Paolo] HYDU_sock_write (utils/sock/sock.c:286): write 
> error (No such process)
> [proxy:0:0@Dell7540-Paolo] HYD_pmcd_pmip_control_cmd_cb 
> (pm/pmiserv/pmip_cb.c:935): unable to write to downstream stdin
> [proxy:0:0@Dell7540-Paolo] HYDT_dmxu_poll_wait_for_event 
> (tools/demux/demux_poll.c:76): callback returned error status
> [proxy:0:0@Dell7540-Paolo] main (pm/pmiserv/pmip.c:206): demux engine error 
> waiting for event
> [mpiexec@Dell7540-Paolo] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert 
> (!closed) failed
> [mpiexec@Dell7540-Paolo] HYDT_dmxu_poll_wait_for_event 
> (tools/demux/demux_poll.c:76): callback returned error status
> [mpiexec@Dell7540-Paolo] HYD_pmci_wait_for_completion 
> (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event
> [mpiexec@Dell7540-Paolo] main (ui/mpich/mpiexec.c:336): process manager error 
> waiting for completion
> 
> Just for the sake of completeness, I also tried using the Intel and Microsoft 
> MPI redistributables, which might be more natural candidates, instead of the 
> petsc-compiled version of the MPI runtime (and they are MPICH derivatives, 
> after all). But, running with:
> 
> mpiexec -np 1 my.exe
> 
> I get the following error with Intel:
> 
> [cli_0]: write_line error; fd=440 buf=:cmd=init pmi_version=1 pmi_subversion=1
> :
> system msg for write_line failure : Bad file descriptor
> [cli_0]: Unable to write to PMI_fd
> [cli_0]: write_line error; fd=440 buf=:cmd=get_appnum
> :
> system msg for write_line failure : Bad file descriptor
> Fatal error in MPI_Init: Other MPI error, error stack:
> MPIR_Init_thread(467):
> MPID_Init(140)...: channel initialization failed
> MPID_Init(421)...: PMI_Get_appnum returned -1
> [cli_0]: aborting job:
> Fatal error in MPI_Init: Other MPI error, error stack:
> MPIR_Init_thread(467):
> MPID_Init(140)...: channel initialization failed
> MPID_Init(421)...: 

Re: [petsc-users] petsc on windows

2019-08-30 Thread Balay, Satish via petsc-users
Thanks for the update.

Yes - having the wrong variant of libpetsc.dll in PATH can cause problems.

Satish

On Fri, 30 Aug 2019, Sam Guo via petsc-users wrote:

> Thanks a lot for your help. It was my pilot error: I have both the serial
> and parallel versions of PETSc. It turns out the serial version was
> always loaded. Now the parallel PETSc is working.
> 
> On Thu, Aug 29, 2019 at 5:51 PM Balay, Satish  wrote:
> 
> > On MS-Windows - you need the location of the DLLs in PATH
> >
> > Or use --with-shared-libraries=0
> >
> > Satish
> >
> > On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> >
> > > When I use intel mpi, configuration, compile and test all work fine but I
> > > cannot use dll in my application.
> > >
> > > On Thu, Aug 29, 2019 at 3:46 PM Sam Guo  wrote:
> > >
> > > > After I removed the following lines in
> > config/BuildSystem/config/package.py,
> > > > configuration finished without error.
> > > >  self.executeTest(self.checkDependencies)
> > > >  self.executeTest(self.configureLibrary)
> > > >  self.executeTest(self.checkSharedLibrary)
> > > >
> > > > I then added my MPI wrapper to
> > ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> > > > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > > >
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > >
> > > > On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish 
> > wrote:
> > > >
> > > >> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> > > >>
> > > >> > I can link when I add my wrapper to
> > > >> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > > >> >
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > >>
> > > >> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where?
> > > >> This is a variable in a configure-generated makefile.
> > > >>
> > > >> Since PETSc is not built [as configure failed] - there should be no
> > > >> configure generated makefiles.
> > > >>
> > > >> > (I don't understand why configure does not include my wrapper)
> > > >>
> > > >> Well the compiler gives the error below. Can you try to compile
> > > >> manually [i.e. without PETSc or any petsc makefiles] a simple MPI code
> > > >> - say cpi.c from MPICH and see if it works?  [and copy/paste the log
> > > >> from this compile attempt.]
> > > >>
> > > >> Satish
> > > >>
> > > >> >
> > > >> >
> > > >> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> > > >> wrote:
> > > >> >
> > > >> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo 
> > > >> wrote:
> > > >> > >
> > > >> > >> Thanks for the quick response. Attached please find the
> > configure.log
> > > >> > >> containing the configure error.
> > > >> > >>
> > > >> > >
> > > >> > > Executing:
> > > >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > > >> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > > >> > > -I/tmp/petsc-6DsCEk/config.compilers
> > > >> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > > >> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > > >> > > -I/tmp/petsc-6DsCEk/config.headers
> > > >> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > > >> > > -I/tmp/petsc-6DsCEk/config.types
> > -I/tmp/petsc-6DsCEk/config.atomics
> > > >> > > -I/tmp/petsc-6DsCEk/config.functions
> > > >> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > > >> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > > >> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > > >> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > > >> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > > >> > > stdout: conftest.c
> > > >> > > Successful compile:
> > > >> > > Source:
> > > >> > > #include "confdefs.h"
> > > >> > > #include "conffix.h"
> > > >> > > /* Override any gcc2 internal prototype to avoid an error. */
> > > >> > > char MPI_Init();
> > > >> > > static void _check_MPI_Init() { MPI_Init(); }
> > > >> > > char MPI_Comm_create();
> > > >> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > > >> > >
> > > >> > > int main() {
> > > >> > > _check_MPI_Init();
> > > >> > > _check_MPI_Comm_create();;
> > > >> > >   return 0;
> > > >> > > }
> > > >> > > Executing:
> > > >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > > >> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD
> > -wd4996 -Z7
> > > >> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > > >> > >
> > > >>
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > >> > > Ws2_32.lib
> > > >> > > stdout:
> > > >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not
> > found
> > > >> or not
> > > >> > > built by the last incremental link; performing full link
> > > >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > >> > > referenced in function _check_MPI_Init
> > > >> > > conftest.obj : error LNK2019: unresolved external symbol
> > > >> MPI_Comm_create
> > > >> > > referenced in function _check_MPI_Comm_create
> > > >> > > 

Re: [petsc-users] petsc on windows

2019-08-30 Thread Sam Guo via petsc-users
Thanks a lot for your help. It was my pilot error: I have both the serial
and parallel versions of PETSc. It turns out the serial version was
always loaded. Now the parallel PETSc is working.

On Thu, Aug 29, 2019 at 5:51 PM Balay, Satish  wrote:

> On MS-Windows - you need the location of the DLLs in PATH
>
> Or use --with-shared-libraries=0
>
> Satish
>
> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
>
> > When I use intel mpi, configuration, compile and test all work fine but I
> > cannot use dll in my application.
> >
> > On Thu, Aug 29, 2019 at 3:46 PM Sam Guo  wrote:
> >
> > > After I removed the following lines in
> config/BuildSystem/config/package.py,
> > > configuration finished without error.
> > >  self.executeTest(self.checkDependencies)
> > >  self.executeTest(self.configureLibrary)
> > >  self.executeTest(self.checkSharedLibrary)
> > >
> > > I then added my MPI wrapper to
> ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> > > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > >
> > > On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish 
> wrote:
> > >
> > >> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> > >>
> > >> > I can link when I add my wrapper to
> > >> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > >> >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > >>
> > >> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where?
> > >> This is a variable in a configure-generated makefile.
> > >>
> > >> Since PETSc is not built [as configure failed] - there should be no
> > >> configure generated makefiles.
> > >>
> > >> > (I don't understand why configure does not include my wrapper)
> > >>
> > >> Well the compiler gives the error below. Can you try to compile
> > >> manually [i.e. without PETSc or any petsc makefiles] a simple MPI code
> > >> - say cpi.c from MPICH and see if it works?  [and copy/paste the log
> > >> from this compile attempt.]
> > >>
> > >> Satish
> > >>
> > >> >
> > >> >
> > >> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> > >> wrote:
> > >> >
> > >> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo 
> > >> wrote:
> > >> > >
> > >> > >> Thanks for the quick response. Attached please find the
> configure.log
> > >> > >> containing the configure error.
> > >> > >>
> > >> > >
> > >> > > Executing:
> > >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > >> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >> > > -I/tmp/petsc-6DsCEk/config.compilers
> > >> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > >> > > -I/tmp/petsc-6DsCEk/config.headers
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > >> > > -I/tmp/petsc-6DsCEk/config.types
> -I/tmp/petsc-6DsCEk/config.atomics
> > >> > > -I/tmp/petsc-6DsCEk/config.functions
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > >> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > >> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > >> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > >> > > stdout: conftest.c
> > >> > > Successful compile:
> > >> > > Source:
> > >> > > #include "confdefs.h"
> > >> > > #include "conffix.h"
> > >> > > /* Override any gcc2 internal prototype to avoid an error. */
> > >> > > char MPI_Init();
> > >> > > static void _check_MPI_Init() { MPI_Init(); }
> > >> > > char MPI_Comm_create();
> > >> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > >> > >
> > >> > > int main() {
> > >> > > _check_MPI_Init();
> > >> > > _check_MPI_Comm_create();;
> > >> > >   return 0;
> > >> > > }
> > >> > > Executing:
> > >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > >> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD
> -wd4996 -Z7
> > >> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >> > >
> > >>
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > >> > > Ws2_32.lib
> > >> > > stdout:
> > >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not
> found
> > >> or not
> > >> > > built by the last incremental link; performing full link
> > >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > >> > > referenced in function _check_MPI_Init
> > >> > > conftest.obj : error LNK2019: unresolved external symbol
> > >> MPI_Comm_create
> > >> > > referenced in function _check_MPI_Comm_create
> > >> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> > >> LNK1120:
> > >> > > 2 unresolved externals
> > >> > > Possible ERROR while running linker: exit code 2
> > >> > > stdout:
> > >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not
> found
> > >> or not
> > >> > > built by the last incremental link; performing full link
> > >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Balay, Satish via petsc-users
On MS-Windows - you need the location of the DLLs in PATH

Or use --with-shared-libraries=0

Satish
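
For illustration, with a shared-library build that typically means prepending
the directory containing libpetsc.dll to PATH before launching the
application; a minimal sketch from a cmd.exe prompt (the install path below
is hypothetical):

  set PATH=C:\path\to\petsc\arch-mswin\lib;%PATH%
  my_app.exe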

On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:

> When I use intel mpi, configuration, compile and test all work fine but I
> cannot use dll in my application.
> 
> On Thu, Aug 29, 2019 at 3:46 PM Sam Guo  wrote:
> 
> > After I removed the following lines in config/BuildSystem/config/package.py,
> > configuration finished without error.
> >  self.executeTest(self.checkDependencies)
> >  self.executeTest(self.configureLibrary)
> >  self.executeTest(self.checkSharedLibrary)
> >
> > I then added my MPI wrapper to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >
> > On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish  wrote:
> >
> >> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> >>
> >> > I can link when I add my wrapper to
> >> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> >> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >>
> >> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where?
> >> This is a variable in a configure-generated makefile.
> >>
> >> Since PETSc is not built [as configure failed] - there should be no
> >> configure generated makefiles.
> >>
> >> > (I don't understand why configure does not include my wrapper)
> >>
> >> Well the compiler gives the error below. Can you try to compile
> >> manually [i.e. without PETSc or any petsc makefiles] a simple MPI code
> >> - say cpi.c from MPICH and see if it works?  [and copy/paste the log
> >> from this compile attempt.]
> >>
> >> Satish
> >>
> >> >
> >> >
> >> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> >> wrote:
> >> >
> >> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo 
> >> wrote:
> >> > >
> >> > >> Thanks for the quick response. Attached please find the configure.log
> >> > >> containing the configure error.
> >> > >>
> >> > >
> >> > > Executing:
> >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >> > > -I/tmp/petsc-6DsCEk/config.compilers
> >> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> >> > > -I/tmp/petsc-6DsCEk/config.headers
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> >> > > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> >> > > -I/tmp/petsc-6DsCEk/config.functions
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> >> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> >> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> >> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> >> > > stdout: conftest.c
> >> > > Successful compile:
> >> > > Source:
> >> > > #include "confdefs.h"
> >> > > #include "conffix.h"
> >> > > /* Override any gcc2 internal prototype to avoid an error. */
> >> > > char MPI_Init();
> >> > > static void _check_MPI_Init() { MPI_Init(); }
> >> > > char MPI_Comm_create();
> >> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> >> > >
> >> > > int main() {
> >> > > _check_MPI_Init();
> >> > > _check_MPI_Comm_create();;
> >> > >   return 0;
> >> > > }
> >> > > Executing:
> >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> >> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >> > >
> >> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >> > > Ws2_32.lib
> >> > > stdout:
> >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found
> >> or not
> >> > > built by the last incremental link; performing full link
> >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> >> > > referenced in function _check_MPI_Init
> >> > > conftest.obj : error LNK2019: unresolved external symbol
> >> MPI_Comm_create
> >> > > referenced in function _check_MPI_Comm_create
> >> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> >> LNK1120:
> >> > > 2 unresolved externals
> >> > > Possible ERROR while running linker: exit code 2
> >> > > stdout:
> >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found
> >> or not
> >> > > built by the last incremental link; performing full link
> >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> >> > > referenced in function _check_MPI_Init
> >> > > conftest.obj : error LNK2019: unresolved external symbol
> >> MPI_Comm_create
> >> > > referenced in function _check_MPI_Comm_create
> >> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> >> LNK1120:
> >> > > 2 unresolved externals
> >> > >
> >> > > The link is definitely failing. Does it work if you do it by hand?
> >> > >
> >> > >   Thanks,
> >> > >
> >> > >  Matt
> >> > >
> >> > >
> >> > >> Regarding our 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
After I removed the following lines in config/BuildSystem/config/package.py,
configuration finished without error.
 self.executeTest(self.checkDependencies)
 self.executeTest(self.configureLibrary)
 self.executeTest(self.checkSharedLibrary)

I then added my MPI wrapper to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish  wrote:

> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
>
> > I can link when I add my wrapper to
> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>
> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This
> is a variable in a configure-generated makefile.
>
> Since PETSc is not built [as configure failed] - there should be no
> configure generated makefiles.
>
> > (I don't understand why configure does not include my wrapper)
>
> Well the compiler gives the error below. Can you try to compile
> manually [i.e. without PETSc or any petsc makefiles] a simple MPI code
> - say cpi.c from MPICH and see if it works?  [and copy/paste the log
> from this compile attempt.]
>
> Satish
>
> >
> >
> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> wrote:
> >
> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
> > >
> > >> Thanks for the quick response. Attached please find the configure.log
> > >> containing the configure error.
> > >>
> > >
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > > -I/tmp/petsc-6DsCEk/config.compilers
> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > > -I/tmp/petsc-6DsCEk/config.headers
> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > > -I/tmp/petsc-6DsCEk/config.functions
> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > > stdout: conftest.c
> > > Successful compile:
> > > Source:
> > > #include "confdefs.h"
> > > #include "conffix.h"
> > > /* Override any gcc2 internal prototype to avoid an error. */
> > > char MPI_Init();
> > > static void _check_MPI_Init() { MPI_Init(); }
> > > char MPI_Comm_create();
> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > >
> > > int main() {
> > > _check_MPI_Init();
> > > _check_MPI_Comm_create();;
> > >   return 0;
> > > }
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > Ws2_32.lib
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > > Possible ERROR while running linker: exit code 2
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > >
> > > The link is definitely failing. Does it work if you do it by hand?
> > >
> > >   Thanks,
> > >
> > >  Matt
> > >
> > >
> > >> Regarding our dup, our wrapper does support it. In fact, everything
> works
> > >> fine on Linux. I suspect on windows, PETSc picks the system mpi.h
> somehow.
> > >> I am investigating it.
> > >>
> > >> Thanks,
> > >> Sam
> > >>
> > >> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
> > >> wrote:
> > >>
> > >>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> > >>> petsc-users@mcs.anl.gov> wrote:
> > >>>
> >  Dear PETSc dev team,
> > I am looking for some tips on porting petsc to windows. We have our mpi
> >  wrapper (so we can switch between different mpi implementations). I configure petsc using
> >  --with-mpi-lib and --with-mpi-include
> >   ./configure --with-cc="win32fe cl" 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Balay, Satish via petsc-users
On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:

> I can link when I add my wrapper to
> PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This is a 
variable in a configure-generated makefile.

Since PETSc is not built [as configure failed] - there should be no configure 
generated makefiles.

> (I don't understand why configure does not include my wrapper)

Well the compiler gives the error below. Can you try to compile
manually [i.e. without PETSc or any petsc makefiles] a simple MPI code
- say cpi.c from MPICH and see if it works?  [and copy/paste the log
from this compile attempt.]

Satish

> 
> 
> On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley  wrote:
> 
> > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
> >
> >> Thanks for the quick response. Attached please find the configure.log
> >> containing the configure error.
> >>
> >
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > -I/tmp/petsc-6DsCEk/config.compilers
> > -I/tmp/petsc-6DsCEk/config.setCompilers
> > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > -I/tmp/petsc-6DsCEk/config.headers
> > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > -I/tmp/petsc-6DsCEk/config.functions
> > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > stdout: conftest.c
> > Successful compile:
> > Source:
> > #include "confdefs.h"
> > #include "conffix.h"
> > /* Override any gcc2 internal prototype to avoid an error. */
> > char MPI_Init();
> > static void _check_MPI_Init() { MPI_Init(); }
> > char MPI_Comm_create();
> > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> >
> > int main() {
> > _check_MPI_Init();
> > _check_MPI_Comm_create();;
> >   return 0;
> > }
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >  /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > Ws2_32.lib
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> > Possible ERROR while running linker: exit code 2
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> >
> > The link is definitely failing. Does it work if you do it by hand?
> >
> >   Thanks,
> >
> >  Matt
> >
> >
> >> Regarding our dup, our wrapper does support it. In fact, everything works
> >> fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
> >> I am investigating it.
> >>
> >> Thanks,
> >> Sam
> >>
> >> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
> >> wrote:
> >>
> >>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> >>> petsc-users@mcs.anl.gov> wrote:
> >>>
>  Dear PETSc dev team,
> I am looking for some tips on porting petsc to windows. We have our mpi
>  wrapper (so we can switch between different mpi implementations). I configure petsc using
>  --with-mpi-lib and --with-mpi-include
>   ./configure --with-cc="win32fe cl" --with-fc=0
>  --download-f2cblaslapack
>  --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>  --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>  --with-shared-libaries=1
> 
>  But I got error
> 
>  ===
>   Configuring PETSc to compile on your system
> 
>  ===
>  TESTING: check from
>  config.libraries(config/BuildSystem/config/libraries.py:154)
>  ***
>    

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
I can link when I add my wrapper to
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
(I don't understand why configure does not include my wrapper)


On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
>
>> Thanks for the quick response. Attached please find the configure.log
>> containing the configure error.
>>
>
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> -I/tmp/petsc-6DsCEk/config.compilers
> -I/tmp/petsc-6DsCEk/config.setCompilers
> -I/tmp/petsc-6DsCEk/config.utilities.closure
> -I/tmp/petsc-6DsCEk/config.headers
> -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> -I/tmp/petsc-6DsCEk/config.functions
> -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> -I/tmp/petsc-6DsCEk/config.utilities.missing
> -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
>  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> stdout: conftest.c
> Successful compile:
> Source:
> #include "confdefs.h"
> #include "conffix.h"
> /* Override any gcc2 internal prototype to avoid an error. */
> char MPI_Init();
> static void _check_MPI_Init() { MPI_Init(); }
> char MPI_Comm_create();
> static void _check_MPI_Comm_create() { MPI_Comm_create(); }
>
> int main() {
> _check_MPI_Init();
> _check_MPI_Comm_create();;
>   return 0;
> }
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
>  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> /tmp/petsc-6DsCEk/config.libraries/conftest.o
>  /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> Ws2_32.lib
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
> Possible ERROR while running linker: exit code 2
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
>
> The link is definitely failing. Does it work if you do it by hand?
>
>   Thanks,
>
>  Matt
>
>
>> Regarding our dup, our wrapper does support it. In fact, everything works
>> fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
>> I am investigating it.
>>
>> Thanks,
>> Sam
>>
>> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
>> wrote:
>>
>>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
>>> petsc-users@mcs.anl.gov> wrote:
>>>
 Dear PETSc dev team,
I am looking for some tips on porting petsc to windows. We have our mpi
 wrapper (so we can switch between different mpi implementations). I configure petsc using
 --with-mpi-lib and --with-mpi-include
  ./configure --with-cc="win32fe cl" --with-fc=0
 --download-f2cblaslapack
 --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
 --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
 --with-shared-libaries=1

 But I got error

 ===
  Configuring PETSc to compile on your system

 ===
 TESTING: check from
 config.libraries(config/BuildSystem/config/libraries.py:154)
 ***
  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log
 for details):

 ---
 --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
 and
 --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include']
 did not work

 ***

>>>
>>> Your MPI wrapper should pass the tests here. Send the configure.log
>>>
>>>
 To fix the configuration error,  in
 config/BuildSystem/config/package.py, I removed
  self.executeTest(self.checkDependencies)
  

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
Thanks for the quick response. Attached please find the configure.log
containing the configure error.

Regarding our dup, our wrapper does support it. In fact, everything works
fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
I am investigating it.

Thanks,
Sam

On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Dear PETSc dev team,
>> I am looking for some tips on porting petsc to windows. We have our mpi
>> wrapper (so we can switch between different mpi implementations). I configure petsc using
>> --with-mpi-lib and --with-mpi-include
>>  ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
>> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>> --with-shared-libaries=1
>>
>> But I got error
>>
>> ===
>>  Configuring PETSc to compile on your system
>>
>> ===
>> TESTING: check from
>> config.libraries(config/BuildSystem/config/libraries.py:154)
>> ***
>>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
>> details):
>>
>> ---
>> --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
>> and
>> --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
>> not work
>>
>> ***
>>
>
> Your MPI wrapper should pass the tests here. Send the configure.log
>
>
>> To fix the configuration error,  in config/BuildSystem/config/package.py,
>> I removed
>>  self.executeTest(self.checkDependencies)
>>  self.executeTest(self.configureLibrary)
>>  self.executeTest(self.checkSharedLibrary)
>>
>> To link, I add my MPI wrapper
>> to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
>> PCC_LINKER_FLAGS =-MD -wd4996 -Z7
>> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>>
>> I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
>> code, PETSc somehow creates a duplicate communicator with only 1 MPI
>> process and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
>> (our MPI_COMM_WORLD), PETSc is hanging.
>>
>
> We do dup the communicator on entry. Shouldn't that be supported by your
> wrapper?
>
>   Thanks,
>
>  Matt
>
>
>> I am wondering if you could give me some tips how to debug this problem.
>>
>> BR,
>> Sam
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>
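
As background to the PETSC_COMM_WORLD discussion above: an application that
initializes MPI itself may point PETSC_COMM_WORLD at its own communicator
before calling PetscInitialize(). A minimal C sketch (error checking elided;
MPI_COMM_WORLD here stands in for whatever communicator the MPI wrapper
provides):

  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    MPI_Init(&argc, &argv);            /* the application initializes MPI itself */
    PETSC_COMM_WORLD = MPI_COMM_WORLD; /* must be set before PetscInitialize() */
    PetscInitialize(&argc, &argv, NULL, NULL);
    /* ... PETSc usage goes here ... */
    PetscFinalize();
    MPI_Finalize();                    /* the app finalizes MPI, since it called MPI_Init */
    return 0;
  }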




Re: [petsc-users] Petsc with Windows

2016-11-30 Thread E. Tadeu
On Wed, Nov 30, 2016 at 6:03 AM, Boris Kaus  wrote:

>
> > This is probably a better choice than Cygwin going forward.
> >
> > https://msdn.microsoft.com/en-us/commandline/wsl/about
> >
> > I don't know to what extent PETSc users have experimented with this
> > feature, but it should make it easier to build and distribute PETSc.
> We have tried this in Mainz, and PETSc (with MUMPS/SUPERLU_DIST/mpich)
> compiles out of the box with the new command-line option under windows 10.
> It’s not as fast as linux/mac, but does the job
>

Good to hear this, Boris :)
I'm very interested: do you know how much the performance hit is? Perhaps
by using BLAS from MKL it could be faster? Do you know what compiler is
being used?

Thanks!


Re: [petsc-users] Petsc with Windows

2016-11-30 Thread E. Tadeu
Hi Elaine,

  PETSc configured/built in Cygwin can be used normally outside of it.
It can either be statically linked with your software, or linked as a .DLL
and deployed normally :).
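
For instance, linking an application against a static Cygwin-built PETSc
follows the usual pattern; a sketch, where PETSC_DIR and PETSC_ARCH are
placeholders for your build, and further external libraries may be needed
(PETSc's own makefiles handle all of this automatically):

  mpicc -o my_app my_app.c \
    -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include \
    -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc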


On Mon, Nov 28, 2016 at 1:49 PM, Elaine Tang  wrote:

> Hi,
>
> I am developing some software on windows that would like to utilize petsc
> library. Currently I have petsc library configured on cygwin on my windows
> machine.
>
> Is there any binary of petsc for windows so that the software that I
> develop will be more portable and can be run on other windows machine as
> well?
>
> Thanks!
> --
> Elaine Tang
>


Re: [petsc-users] Petsc with Windows

2016-11-30 Thread Boris Kaus

> This is probably a better choice than Cygwin going forward.
> 
> https://msdn.microsoft.com/en-us/commandline/wsl/about
> 
> I don't know to what extent PETSc users have experimented with this
> feature, but it should make it easier to build and distribute PETSc.
We have tried this in Mainz, and PETSc (with MUMPS/SUPERLU_DIST/mpich) compiles 
out of the box with the new command-line option under windows 10.
It’s not as fast as linux/mac, but does the job
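
For reference, a configure line of that shape under WSL can be as simple as
the following sketch (illustrative; MUMPS also pulls in ScaLAPACK):

  ./configure --download-mpich --download-fblaslapack \
    --download-scalapack --download-mumps --download-superlu_dist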

This won’t make me give up my mac yet, but windows seems to be heading in the 
right direction.

Boris


___

Boris J.P. Kaus

Institute of Geosciences, 
Center for Computational Sciences & 
Center for Volcanoes and Atmosphere in Magmatic Open Systems
Johannes Gutenberg University of Mainz, Mainz, Germany
Office: 00-285
Tel:+49.6131.392.4527

http://www.geo-dynamics.eu
___




Re: [petsc-users] Petsc with Windows

2016-11-30 Thread Mohammad Mirzadeh
May I propose docker as an alternative approach? https://www.docker.com

There are already petsc images and creating your own environment is not
that hard. As a bonus you get a cross-platform solution ... unless your
application is Windows-specific, in which case docker might not be the best
way to go.
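
As a rough sketch of that approach (the image contents below are illustrative,
not an official PETSc image; Debian/Ubuntu ship a petsc-dev package):

  FROM ubuntu:16.04
  RUN apt-get update && \
      apt-get install -y build-essential gfortran petsc-dev && \
      rm -rf /var/lib/apt/lists/*
  WORKDIR /work

built and entered with, e.g.:

  docker build -t my-petsc-env .
  docker run --rm -it -v "$PWD":/work my-petsc-env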

On Wed, Nov 30, 2016 at 1:45 AM Jed Brown  wrote:

> Elaine Tang  writes:
>
> > I am developing some software on windows that would like to utilize petsc
> > library. Currently I have petsc library configured on cygwin on my
> windows
> > machine.
>
> This is probably a better choice than Cygwin going forward.
>
> https://msdn.microsoft.com/en-us/commandline/wsl/about
>
> I don't know to what extent PETSc users have experimented with this
> feature, but it should make it easier to build and distribute PETSc.
>
> > Is there any binary of petsc for windows so that the software that I
> > develop will be more portable and can be run on other windows machine as
> > well?
>
> Are you developing an application or a library?
>
-- 
Sent from Gmail Mobile


Re: [petsc-users] Petsc with Windows

2016-11-29 Thread Jed Brown
Elaine Tang  writes:

> I am developing some software on windows that would like to utilize petsc
> library. Currently I have petsc library configured on cygwin on my windows
> machine.

This is probably a better choice than Cygwin going forward.

https://msdn.microsoft.com/en-us/commandline/wsl/about

I don't know to what extent PETSc users have experimented with this
feature, but it should make it easier to build and distribute PETSc.

> Is there any binary of petsc for windows so that the software that I
> develop will be more portable and can be run on other windows machine as
> well?

Are you developing an application or a library?

