Re: [petsc-users] petsc on windows

2019-08-30 Thread Sam Guo via petsc-users
Thanks a lot for your help. It was my pilot error: I have both a serial
version and a parallel version of PETSc installed, and it turns out the serial
version was always being loaded. Now the parallel PETSc is working.
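For anyone hitting the same symptom, a quick runtime probe can show which
PETSc DLL the process actually resolved and whether that build carries a real
MPI. This is only a sketch: the DLL name libpetsc.dll is taken from the
messages below, and the Windows calls assume the check runs on Windows.

#include <petscsys.h>
#include <windows.h>

int main(int argc, char **argv)
{
  char        path[MAX_PATH] = "not loaded";
  HMODULE     mod;
  PetscMPIInt size;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Which libpetsc.dll did the loader pick up? (DLL name assumed) */
  mod = GetModuleHandleA("libpetsc.dll");
  if (mod) GetModuleFileNameA(mod, path, sizeof(path));
  PetscPrintf(PETSC_COMM_SELF, "PETSc DLL: %s\n", path);

  /* A serial (MPIUNI) build reports size 1 even under mpiexec -n 2 */
  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  PetscPrintf(PETSC_COMM_SELF, "PETSC_COMM_WORLD size: %d\n", size);

  PetscFinalize();
  return 0;
}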

On Thu, Aug 29, 2019 at 5:51 PM Balay, Satish  wrote:

> On MS-Windows - you need the location of the DLLs in PATH
>
> Or use --with-shared-libraries=0
>
> Satish
>
> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
>
> > When I use Intel MPI, configure, compile, and test all work fine, but I
> > cannot use the DLL in my application.
> >
> > On Thu, Aug 29, 2019 at 3:46 PM Sam Guo  wrote:
> >
> > > After I removed the following lines in
> config/BuildSystem/config/package.py,
> > > configuration finished without error.
> > >  self.executeTest(self.checkDependencies)
> > >  self.executeTest(self.configureLibrary)
> > >  self.executeTest(self.checkSharedLibrary)
> > >
> > > I then added my MPI wrapper to
> ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> > > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > >
> > > On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish 
> wrote:
> > >
> > >> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> > >>
> > >> > I can link when I add my wrapper to
> > >> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > >> >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > >>
> > >> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where?
> > >> This is a variable in the configure-generated makefile.
> > >>
> > >> Since PETSc is not built [as configure failed] - there should be no
> > >> configure-generated makefiles.
> > >>
> > >> > (I don't understand why configure does not include my wrapper)
> > >>
> > >> Well, the compiler gives the error below. Can you try to compile
> > >> manually [i.e. without PETSc or any PETSc makefiles] a simple MPI code
> > >> - say cpi.c from MPICH - and see if it works? [And copy/paste the log
> > >> from this compile attempt.]
> > >>
> > >> Satish
> > >>
> > >> >
> > >> >
> > >> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> > >> wrote:
> > >> >
> > >> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo 
> > >> wrote:
> > >> > >
> > >> > >> Thanks for the quick response. Attached please find the
> configure.log
> > >> > >> containing the configure error.
> > >> > >>
> > >> > >
> > >> > > Executing:
> > >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > >> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >> > > -I/tmp/petsc-6DsCEk/config.compilers
> > >> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > >> > > -I/tmp/petsc-6DsCEk/config.headers
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > >> > > -I/tmp/petsc-6DsCEk/config.types
> -I/tmp/petsc-6DsCEk/config.atomics
> > >> > > -I/tmp/petsc-6DsCEk/config.functions
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > >> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > >> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > >> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > >> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > >> > > stdout: conftest.c
> > >> > > Successful compile:
> > >> > > Source:
> > >> > > #include "confdefs.h"
> > >> > > #include "conffix.h"
> > >> > > /* Override any gcc2 internal prototype to avoid an error. */
> > >> > > char MPI_Init();
> > >> > > static void _check_MPI_Init() { MPI_Init(); }
> > >> > > char MPI_Comm_create();
> > >> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > >> > >
> > >> > > int main() {
> > >> > > _check_MPI_Init();
> > >> > > _check_MPI_Comm_create();;
> > >> > >   return 0;
> > >> > > }
> > >> > > 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
After I removed the following lines in config/BuildSystem/config/package.py,
configuration finished without error.
 self.executeTest(self.checkDependencies)
 self.executeTest(self.configureLibrary)
 self.executeTest(self.checkSharedLibrary)

I then added my MPI wrapper to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish  wrote:

> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
>
> > I can link when I add my wrapper to
> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>
> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This
> is a variable in the configure-generated makefile.
>
> Since PETSc is not built [as configure failed] - there should be no
> configure-generated makefiles.
>
> > (I don't understand why configure does not include my wrapper)
>
> Well, the compiler gives the error below. Can you try to compile
> manually [i.e. without PETSc or any PETSc makefiles] a simple MPI code
> - say cpi.c from MPICH - and see if it works? [And copy/paste the log
> from this compile attempt.]
>
> Satish
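Along the lines Satish suggests, a minimal standalone test (a sketch, not the
actual cpi.c from MPICH) that exercises the same two symbols the configure
conftest probes, MPI_Init and MPI_Comm_create, could look like this; it
assumes it is compiled against the wrapper's mpi.h and linked against
StarMpiWrapper.lib, with no PETSc involved:

/* mpitest.c - standalone check of the MPI wrapper (illustrative sketch) */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
  int       rank, size;
  MPI_Group world_group;
  MPI_Comm  newcomm;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Exercise MPI_Comm_create, the other symbol the conftest link needs */
  MPI_Comm_group(MPI_COMM_WORLD, &world_group);
  MPI_Comm_create(MPI_COMM_WORLD, world_group, &newcomm);

  printf("rank %d of %d: wrapper links and runs\n", rank, size);

  if (newcomm != MPI_COMM_NULL) MPI_Comm_free(&newcomm);
  MPI_Group_free(&world_group);
  MPI_Finalize();
  return 0;
}

If this does not compile and link by hand with the same cl options, the
problem lies in the wrapper or its import library rather than in PETSc's
configure.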
>
> >
> >
> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> wrote:
> >
> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
> > >
> > >> Thanks for the quick response. Attached please find the configure.log
> > >> containing the configure error.
> > >>
> > >
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > > -I/tmp/petsc-6DsCEk/config.compilers
> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > > -I/tmp/petsc-6DsCEk/config.headers
> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > > -I/tmp/petsc-6DsCEk/config.functions
> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > > stdout: conftest.c
> > > Successful compile:
> > > Source:
> > > #include "confdefs.h"
> > > #include "conffix.h"
> > > /* Override any gcc2 internal prototype to avoid an error. */
> > > char MPI_Init();
> > > static void _check_MPI_Init() { MPI_Init(); }
> > > char MPI_Comm_create();
> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > >
> > > int main() {
> > > _check_MPI_Init();
> > > _check_MPI_Comm_create();;
> > >   return 0;
> > > }
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > Ws2_32.lib
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > > Possible ERROR while running linker: exit code 2
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > >
> > > The link is definitely failing. Does it work if you do it by hand?
> > >
> > >   Thanks,

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
I can link when I add my wrapper to
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
(I don't understand why configure does not include my wrapper)


On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
>
>> Thanks for the quick response. Attached please find the configure.log
>> containing the configure error.
>>
>
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> -I/tmp/petsc-6DsCEk/config.compilers
> -I/tmp/petsc-6DsCEk/config.setCompilers
> -I/tmp/petsc-6DsCEk/config.utilities.closure
> -I/tmp/petsc-6DsCEk/config.headers
> -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> -I/tmp/petsc-6DsCEk/config.functions
> -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> -I/tmp/petsc-6DsCEk/config.utilities.missing
> -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
>  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> stdout: conftest.c
> Successful compile:
> Source:
> #include "confdefs.h"
> #include "conffix.h"
> /* Override any gcc2 internal prototype to avoid an error. */
> char MPI_Init();
> static void _check_MPI_Init() { MPI_Init(); }
> char MPI_Comm_create();
> static void _check_MPI_Comm_create() { MPI_Comm_create(); }
>
> int main() {
> _check_MPI_Init();
> _check_MPI_Comm_create();;
>   return 0;
> }
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
>  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> /tmp/petsc-6DsCEk/config.libraries/conftest.o
>  /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> Ws2_32.lib
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
> Possible ERROR while running linker: exit code 2
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
>
> The link is definitely failing. Does it work if you do it by hand?
>
>   Thanks,
>
>  Matt
>
>
>> Regarding the dup, our wrapper does support it. In fact, everything works
>> fine on Linux. I suspect that on Windows, PETSc picks up the system mpi.h
>> somehow. I am investigating it.
>>
>> Thanks,
>> Sam
>>
>> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
>> wrote:
>>
>>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
>>> petsc-users@mcs.anl.gov> wrote:
>>>
>>>> Dear PETSc dev team,
>>>>    I am looking for some tips on porting PETSc to Windows. We have our own
>>>> MPI wrapper (so we can switch between different MPI implementations). I
>>>> configure PETSc using --with-mpi-lib and --with-mpi-include
>>>>  ./configure --with-cc="win32fe cl" --with-fc=0
>>>> --download-f2cblaslapack
>>>> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>>>> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>>>> --with-shared-libaries=1
>>>>
>>>> But I got error
>>>>
>>>> ===
>>>>  Configuring PETSc to compile on your system
>>>>
>>>> ===
>>>> TESTING: check from
>>>> config.libraries(config/BuildSystem/config/libraries.py:154)
>>>> ***
>>>>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log
>>>> for details):
>>>>
>>>> -

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
Thanks for the quick response. Attached please find the configure.log
containing the configure error.

Regarding the dup, our wrapper does support it. In fact, everything works
fine on Linux. I suspect that on Windows, PETSc picks up the system mpi.h
somehow. I am investigating it.

Thanks,
Sam
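Following up on the mpi.h suspicion above, a small probe can report which
mpi.h the compiler is actually picking up and confirm that the wrapper handles
a communicator dup on its own. This is a sketch; the implementation-specific
version macros are only defined by the corresponding MPI's header, so treat
them as hints:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
  MPI_Comm dup;

  MPI_Init(&argc, &argv);

  /* Standard macros: which MPI standard the header claims to implement */
  printf("mpi.h reports MPI %d.%d\n", MPI_VERSION, MPI_SUBVERSION);

  /* Implementation-specific macros, if present, hint at whose header was found */
#ifdef MPICH_VERSION
  printf("MPICH header: %s\n", MPICH_VERSION);
#endif
#ifdef MSMPI_VER
  printf("MS-MPI header: 0x%x\n", MSMPI_VER);
#endif
#ifdef OMPI_MAJOR_VERSION
  printf("Open MPI header: %d.%d\n", OMPI_MAJOR_VERSION, OMPI_MINOR_VERSION);
#endif

  /* PETSc dups the communicator it is handed, so the wrapper must support this */
  MPI_Comm_dup(MPI_COMM_WORLD, &dup);
  MPI_Comm_free(&dup);

  MPI_Finalize();
  return 0;
}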

On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Dear PETSc dev team,
>>    I am looking for some tips on porting PETSc to Windows. We have our own
>> MPI wrapper (so we can switch between different MPI implementations). I
>> configure PETSc using --with-mpi-lib and --with-mpi-include
>>  ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
>> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>> --with-shared-libaries=1
>>
>> But I got error
>>
>> ===
>>  Configuring PETSc to compile on your system
>>
>> ===
>> TESTING: check from
>> config.libraries(config/BuildSystem/config/libraries.py:154)
>> ***
>>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
>> details):
>>
>> ---
>> --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
>> and
>> --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
>> not work
>>
>> ***
>>
>
> Your MPI wrapper should pass the tests here. Send the configure.log
>
>
>> To fix the configuration error,  in config/BuildSystem/config/package.py,
>> I removed
>>  self.executeTest(self.checkDependencies)
>>  self.executeTest(self.configureLibrary)
>>  self.executeTest(self.checkSharedLibrary)
>>
>> To link, I added my MPI wrapper
>> to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
>> PCC_LINKER_FLAGS =-MD -wd4996 -Z7
>> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>>
>> I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
>> code, PETSc somehow creates a duplicate communicator with only 1 MPI
>> process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
>> (our MPI_COMM_WORLD), PETSc hangs.
>>
>
> We do dup the communicator on entry. Shouldn't that be supported by your
> wrapper?
>
>   Thanks,
>
>  Matt
>
>
>> I am wondering if you could give me some tips on how to debug this problem.
>>
>> BR,
>> Sam
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>




[petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
Dear PETSc dev team,
   I am looking for some tips on porting PETSc to Windows. We have our own MPI
wrapper (so we can switch between different MPI implementations). I configure
PETSc using --with-mpi-lib and --with-mpi-include
 ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
--with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
--with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
--with-shared-libaries=1

But I got error
===
 Configuring PETSc to compile on your system
===
TESTING: check from
config.libraries(config/BuildSystem/config/libraries.py:154)
***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
details):
---
--with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
and
--with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
not work
***

To fix the configuration error,  in config/BuildSystem/config/package.py, I
removed
 self.executeTest(self.checkDependencies)
 self.executeTest(self.configureLibrary)
 self.executeTest(self.checkSharedLibrary)

To link, I added my MPI wrapper
to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
code, PETSc somehow creates a duplicate communicator with only 1 MPI
process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
(our MPI_COMM_WORLD), PETSc hangs.
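For reference, the pattern PETSc documents for running on a communicator other
than MPI_COMM_WORLD is to initialize MPI yourself and assign PETSC_COMM_WORLD
before calling PetscInitialize. A sketch, where app_comm stands in for the
communicator our wrapper provides:

#include <petscsys.h>

int main(int argc, char **argv)
{
  MPI_Comm app_comm;  /* placeholder: the communicator the wrapper hands out */

  /* Initialize MPI first so PETSc does not own MPI_Init/MPI_Finalize */
  MPI_Init(&argc, &argv);
  MPI_Comm_dup(MPI_COMM_WORLD, &app_comm);

  /* Must be assigned before PetscInitialize; PETSc then dups this communicator */
  PETSC_COMM_WORLD = app_comm;
  PetscInitialize(&argc, &argv, NULL, NULL);

  /* ... use PETSc ... */

  PetscFinalize();
  MPI_Comm_free(&app_comm);
  MPI_Finalize();
  return 0;
}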

I am wondering if you could give me some tips on how to debug this problem.

BR,
Sam