[petsc-users] DMLocalToLocal for DMPlex

2019-08-29 Thread Adrian Croucher via petsc-users

hi

If I have a local vector and I want to update all its ghost values from 
its non-ghost values, should I use DMLocalToLocalBegin()/End() ?


I have tried it and it gives me an error: "This DM does not support 
local to local maps".


The DM is a DMPlex. Is the local-to-local operation not implemented for 
DMPlex?


Or should I be using something else to do this?
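One possible workaround, sketched here as an untested fragment (it assumes an existing DMPlex `dm` and a local vector `local`, and omits error checking): round-trip through a global vector, since DMPlex does support the local-to-global and global-to-local maps.

```c
/* Sketch: update ghost values of a DMPlex local vector by going
 * local -> global -> local, since DMLocalToLocalBegin/End is not
 * implemented for DMPlex. Error checking omitted for brevity. */
Vec global;
DMGetGlobalVector(dm, &global);
DMLocalToGlobalBegin(dm, local, INSERT_VALUES, global);
DMLocalToGlobalEnd(dm, local, INSERT_VALUES, global);
DMGlobalToLocalBegin(dm, global, INSERT_VALUES, local);
DMGlobalToLocalEnd(dm, global, INSERT_VALUES, local);
DMRestoreGlobalVector(dm, &global);
```

With INSERT_VALUES the second pair of calls overwrites each ghost entry from its owning process, which is the effect a local-to-local update would have.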

- Adrian

--
Dr Adrian Croucher
Senior Research Fellow
Department of Engineering Science
University of Auckland, New Zealand
email: a.crouc...@auckland.ac.nz
tel: +64 (0)9 923 4611



Re: [petsc-users] petsc on windows

2019-08-29 Thread Balay, Satish via petsc-users
On MS-Windows, you need the location of the DLLs in your PATH.

Or use --with-shared-libraries=0
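Concretely, that could look like the following (the arch directory name is an assumption for illustration):

```shell
# Make the PETSc DLLs visible to the loader (hypothetical arch name):
export PATH="$PATH:/home/xianzhongg/petsc-3.11.3/arch-mswin-c-debug/lib"

# Or avoid DLLs entirely by rebuilding with static libraries:
./configure --with-shared-libraries=0
```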

Satish

On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:

> When I use Intel MPI, configuration, compilation, and the tests all work
> fine, but I cannot use the DLL in my application.
> 
> On Thu, Aug 29, 2019 at 3:46 PM Sam Guo  wrote:
> 
> > After I removed the following lines in config/BuildSystem/config/package.py,
> > configuration finished without error.
> >  self.executeTest(self.checkDependencies)
> >  self.executeTest(self.configureLibrary)
> >  self.executeTest(self.checkSharedLibrary)
> >
> > I then added my MPI wrapper to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >
> > On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish  wrote:
> >
> >> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
> >>
> >> > I can link when I add my wrapper to
> >> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> >> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >>
> >> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where?
> >> This is a variable in a configure-generated makefile.
> >>
> >> Since PETSc is not built [as configure failed] - there should be no
> >> configure generated makefiles.
> >>
> >> > (I don't understand why configure does not include my wrapper)
> >>
> >> Well, the compiler gives the error below. Can you try to compile a
> >> simple MPI code manually [i.e. without PETSc or any PETSc makefiles]
> >> - say cpi.c from MPICH - and see if it works? [And copy/paste the log
> >> from this compile attempt.]
> >>
> >> Satish
> >>
> >> >
> >> >
> >> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> >> wrote:
> >> >
> >> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo 
> >> wrote:
> >> > >
> >> > >> Thanks for the quick response. Attached please find the configure.log
> >> > >> containing the configure error.
> >> > >>
> >> > >
> >> > > Executing:
> >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >> > > -I/tmp/petsc-6DsCEk/config.compilers
> >> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> >> > > -I/tmp/petsc-6DsCEk/config.headers
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> >> > > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> >> > > -I/tmp/petsc-6DsCEk/config.functions
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> >> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> >> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> >> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> >> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> >> > > stdout: conftest.c
> >> > > Successful compile:
> >> > > Source:
> >> > > #include "confdefs.h"
> >> > > #include "conffix.h"
> >> > > /* Override any gcc2 internal prototype to avoid an error. */
> >> > > char MPI_Init();
> >> > > static void _check_MPI_Init() { MPI_Init(); }
> >> > > char MPI_Comm_create();
> >> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> >> > >
> >> > > int main() {
> >> > > _check_MPI_Init();
> >> > > _check_MPI_Comm_create();;
> >> > >   return 0;
> >> > > }
> >> > > Executing:
> >> /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> >> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >> > >
> >> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >> > > Ws2_32.lib
> >> > > stdout:
> >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found
> >> or not
> >> > > built by the last incremental link; performing full link
> >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> >> > > referenced in function _check_MPI_Init
> >> > > conftest.obj : error LNK2019: unresolved external symbol
> >> MPI_Comm_create
> >> > > referenced in function _check_MPI_Comm_create
> >> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> >> LNK1120:
> >> > > 2 unresolved externals
> >> > > Possible ERROR while running linker: exit code 2
> >> > > stdout:
> >> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found
> >> or not
> >> > > built by the last incremental link; performing full link
> >> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> >> > > referenced in function _check_MPI_Init
> >> > > conftest.obj : error LNK2019: unresolved external symbol
> >> MPI_Comm_create
> >> > > referenced in function _check_MPI_Comm_create
> >> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> >> LNK1120:
> >> > > 2 unresolved externals
> >> > >
> >> > > The link is definitely failing. Does it work if you do it by hand?
> >> > >
> >> > >   Thanks,
> >> > >
> >> > >  Matt
> >> > >
> >> > >
> >> > >> Regarding our 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
After I removed the following lines in config/BuildSystem/config/package.py,
configuration finished without error.
 self.executeTest(self.checkDependencies)
 self.executeTest(self.configureLibrary)
 self.executeTest(self.checkSharedLibrary)

I then added my MPI wrapper to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish  wrote:

> On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:
>
> > I can link when I add my wrapper to
> > PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>
> I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This
> is a variable in a configure-generated makefile.
>
> Since PETSc is not built [as configure failed] - there should be no
> configure generated makefiles.
>
> > (I don't understand why configure does not include my wrapper)
>
> Well, the compiler gives the error below. Can you try to compile a
> simple MPI code manually [i.e. without PETSc or any PETSc makefiles]
> - say cpi.c from MPICH - and see if it works? [And copy/paste the log
> from this compile attempt.]
>
> Satish
>
> >
> >
> > On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley 
> wrote:
> >
> > > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
> > >
> > >> Thanks for the quick response. Attached please find the configure.log
> > >> containing the configure error.
> > >>
> > >
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > > -I/tmp/petsc-6DsCEk/config.compilers
> > > -I/tmp/petsc-6DsCEk/config.setCompilers
> > > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > > -I/tmp/petsc-6DsCEk/config.headers
> > > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > > -I/tmp/petsc-6DsCEk/config.functions
> > > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> > >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > > stdout: conftest.c
> > > Successful compile:
> > > Source:
> > > #include "confdefs.h"
> > > #include "conffix.h"
> > > /* Override any gcc2 internal prototype to avoid an error. */
> > > char MPI_Init();
> > > static void _check_MPI_Init() { MPI_Init(); }
> > > char MPI_Comm_create();
> > > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> > >
> > > int main() {
> > > _check_MPI_Init();
> > > _check_MPI_Comm_create();;
> > >   return 0;
> > > }
> > > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe
> cl
> > >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> > > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > >
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > > Ws2_32.lib
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > > Possible ERROR while running linker: exit code 2
> > > stdout:
> > > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or
> not
> > > built by the last incremental link; performing full link
> > > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > > referenced in function _check_MPI_Init
> > > conftest.obj : error LNK2019: unresolved external symbol
> MPI_Comm_create
> > > referenced in function _check_MPI_Comm_create
> > > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error
> LNK1120:
> > > 2 unresolved externals
> > >
> > > The link is definitely failing. Does it work if you do it by hand?
> > >
> > >   Thanks,
> > >
> > >  Matt
> > >
> > >
> > >> Regarding our dup, our wrapper does support it. In fact, everything
> works
> > >> fine on Linux. I suspect on windows, PETSc picks the system mpi.h
> somehow.
> > >> I am investigating it.
> > >>
> > >> Thanks,
> > >> Sam
> > >>
> > >> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
> > >> wrote:
> > >>
> > >>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> > >>> petsc-users@mcs.anl.gov> wrote:
> > >>>
> >  Dear PETSc dev team,
> > I am looking for some tips on porting PETSc to Windows. We have our own MPI
> >  wrapper (so we can switch between MPI implementations). I configure PETSc using
> >  --with-mpi-lib and --with-mpi-include
> >   ./configure --with-cc="win32fe cl" 

Re: [petsc-users] petsc on windows

2019-08-29 Thread Balay, Satish via petsc-users
On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:

> I can link when I add my wrapper to
> PCC_LINKER_FLAGS =-MD -wd4996 -Z7
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This is a
variable in a configure-generated makefile.

Since PETSc is not built [as configure failed] - there should be no configure 
generated makefiles.

> (I don't understand why configure does not include my wrapper)

Well, the compiler gives the error below. Can you try to compile a
simple MPI code manually [i.e. without PETSc or any PETSc makefiles]
- say cpi.c from MPICH - and see if it works? [And copy/paste the log
from this compile attempt.]

Satish

> 
> 
> On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley  wrote:
> 
> > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
> >
> >> Thanks for the quick response. Attached please find the configure.log
> >> containing the configure error.
> >>
> >
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > -I/tmp/petsc-6DsCEk/config.compilers
> > -I/tmp/petsc-6DsCEk/config.setCompilers
> > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > -I/tmp/petsc-6DsCEk/config.headers
> > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > -I/tmp/petsc-6DsCEk/config.functions
> > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
> >  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > stdout: conftest.c
> > Successful compile:
> > Source:
> > #include "confdefs.h"
> > #include "conffix.h"
> > /* Override any gcc2 internal prototype to avoid an error. */
> > char MPI_Init();
> > static void _check_MPI_Init() { MPI_Init(); }
> > char MPI_Comm_create();
> > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> >
> > int main() {
> > _check_MPI_Init();
> > _check_MPI_Comm_create();;
> >   return 0;
> > }
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> >  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> >  /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > Ws2_32.lib
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> > Possible ERROR while running linker: exit code 2
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> >
> > The link is definitely failing. Does it work if you do it by hand?
> >
> >   Thanks,
> >
> >  Matt
> >
> >
> >> Regarding our dup, our wrapper does support it. In fact, everything works
> >> fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
> >> I am investigating it.
> >>
> >> Thanks,
> >> Sam
> >>
> >> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
> >> wrote:
> >>
> >>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> >>> petsc-users@mcs.anl.gov> wrote:
> >>>
>  Dear PETSc dev team,
> I am looking for some tips on porting PETSc to Windows. We have our own MPI
>  wrapper (so we can switch between MPI implementations). I configure PETSc using
>  --with-mpi-lib and --with-mpi-include
>   ./configure --with-cc="win32fe cl" --with-fc=0
>  --download-f2cblaslapack
>  --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>  --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>  --with-shared-libaries=1
> 
>  But I got error
> 
>  ===
>   Configuring PETSc to compile on your system
> 
>  ===
>  TESTING: check from
>  config.libraries(config/BuildSystem/config/libraries.py:154)
>  ***
>    

Re: [petsc-users] question about CISS

2019-08-29 Thread Jose E. Roman via petsc-users



> On 29 Aug 2019, at 22:20, Povolotskyi, Mykhailo 
> wrote:
> 
> Thank you for the suggestion.
> 
> Is it interfaced to SLEPc?

No, could be a future project...

> 
> 
> On 08/29/2019 04:14 PM, Jose E. Roman wrote:
>> I am not an expert in contour integral eigensolvers. I think difficulties 
>> come with corners, so ellipses are the best choice. I don't think ring 
>> regions are relevant here.
>> 
>> Have you considered using ScaLAPACK? Some time ago we were able to address
>> problems of size up to 400k: https://doi.org/10.1017/jfm.2016.208
>> 
>> Jose
>> 
>> 
>>> On 29 Aug 2019, at 21:55, Povolotskyi, Mykhailo 
>>> wrote:
>>> 
>>> Thank you, Jose,
>>> 
>>> what about rings? Are they better than rectangles?
>>> 
>>> Michael.
>>> 
>>> 
>>> On 08/29/2019 03:44 PM, Jose E. Roman wrote:
 The CISS solver is supposed to estimate the number of eigenvalues 
 contained in the contour. My impression is that the estimation is less 
 accurate in case of rectangular contours, compared to elliptic ones. But 
 of course, with ellipses it is not possible to fully cover the complex 
 plane unless there is some overlap.
 
 Jose
 
 
> On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users
>  wrote:
> 
> Hello everyone,
> 
> this is a question about  SLEPc.
> 
> The problem that I need to solve is as follows.
> 
> I have a matrix and I need a full spectrum of it (both eigenvalues and
> eigenvectors).
> 
> The regular way is to use LAPACK, but it is slow. I decided to try the
> following:
> 
> a) compute the bounds of the spectrum using the Krylov-Schur approach.
> 
> b) divide the complex eigenvalue plane into rectangular areas, then
> apply CISS to each area in parallel.
> 
> However, I found that the solver is missing some eigenvalues, even if my
> rectangles cover the whole spectral area.
> 
> My question: can this approach work in principle? If yes, how can one
> set up the CISS solver so that it does not lose eigenvalues?
> 
> Thank you,
> 
> Michael.
> 
> 



Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
I can link when I add my wrapper to
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
(I don't understand why configure does not include my wrapper)


On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 4:02 PM Sam Guo  wrote:
>
>> Thanks for the quick response. Attached please find the configure.log
>> containing the configure error.
>>
>
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> -I/tmp/petsc-6DsCEk/config.compilers
> -I/tmp/petsc-6DsCEk/config.setCompilers
> -I/tmp/petsc-6DsCEk/config.utilities.closure
> -I/tmp/petsc-6DsCEk/config.headers
> -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> -I/tmp/petsc-6DsCEk/config.functions
> -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> -I/tmp/petsc-6DsCEk/config.utilities.missing
> -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> -I/tmp/petsc-6DsCEk/config.libraries  -MD -wd4996 -Z7
>  /tmp/petsc-6DsCEk/config.libraries/conftest.c
> stdout: conftest.c
> Successful compile:
> Source:
> #include "confdefs.h"
> #include "conffix.h"
> /* Override any gcc2 internal prototype to avoid an error. */
> char MPI_Init();
> static void _check_MPI_Init() { MPI_Init(); }
> char MPI_Comm_create();
> static void _check_MPI_Comm_create() { MPI_Comm_create(); }
>
> int main() {
> _check_MPI_Init();
> _check_MPI_Comm_create();;
>   return 0;
> }
> Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
>  -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe-MD -wd4996 -Z7
> /tmp/petsc-6DsCEk/config.libraries/conftest.o
>  /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> Ws2_32.lib
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
> Possible ERROR while running linker: exit code 2
> stdout:
> LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> built by the last incremental link; performing full link
> conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> referenced in function _check_MPI_Init
> conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> referenced in function _check_MPI_Comm_create
> C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> 2 unresolved externals
>
> The link is definitely failing. Does it work if you do it by hand?
>
>   Thanks,
>
>  Matt
>
>
>> Regarding our dup, our wrapper does support it. In fact, everything works
>> fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
>> I am investigating it.
>>
>> Thanks,
>> Sam
>>
>> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley 
>> wrote:
>>
>>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
>>> petsc-users@mcs.anl.gov> wrote:
>>>
 Dear PETSc dev team,
I am looking for some tips on porting PETSc to Windows. We have our own MPI
 wrapper (so we can switch between MPI implementations). I configure PETSc using
 --with-mpi-lib and --with-mpi-include
  ./configure --with-cc="win32fe cl" --with-fc=0
 --download-f2cblaslapack
 --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
 --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
 --with-shared-libaries=1

 But I got error

 ===
  Configuring PETSc to compile on your system

 ===
 TESTING: check from
 config.libraries(config/BuildSystem/config/libraries.py:154)
 ***
  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log
 for details):

 ---
 --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
 and
 --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include']
 did not work

 ***

>>>
>>> Your MPI wrapper should pass the tests here. Send the configure.log
>>>
>>>
 To fix the configuration error,  in
 config/BuildSystem/config/package.py, I removed
  self.executeTest(self.checkDependencies)
  

Re: [petsc-users] question about CISS

2019-08-29 Thread Povolotskyi, Mykhailo via petsc-users
It is not a symmetric matrix

On 08/29/2019 04:30 PM, Matthew Knepley wrote:
On Thu, Aug 29, 2019 at 4:29 PM Jed Brown via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Elemental also has distributed-memory eigensolvers that should be at
least as good as ScaLAPACK's.  There is support for Elemental in PETSc,
but not yet in SLEPc.

Also, if it's symmetric, isn't https://elpa.mpcdf.mpg.de/ fairly scalable?

   Matt

"Povolotskyi, Mykhailo via petsc-users" <petsc-users@mcs.anl.gov> writes:

> Thank you for the suggestion.
>
> Is it interfaced to SLEPc?
>
>
> On 08/29/2019 04:14 PM, Jose E. Roman wrote:
>> I am not an expert in contour integral eigensolvers. I think difficulties 
>> come with corners, so ellipses are the best choice. I don't think ring 
>> regions are relevant here.
>>
>> Have you considered using ScaLAPACK? Some time ago we were able to address
>> problems of size up to 400k: https://doi.org/10.1017/jfm.2016.208
>>
>> Jose
>>
>>
>>> On 29 Aug 2019, at 21:55, Povolotskyi, Mykhailo
>>> <mpovo...@purdue.edu> wrote:
>>>
>>> Thank you, Jose,
>>>
>>> what about rings? Are they better than rectangles?
>>>
>>> Michael.
>>>
>>>
>>> On 08/29/2019 03:44 PM, Jose E. Roman wrote:
 The CISS solver is supposed to estimate the number of eigenvalues 
 contained in the contour. My impression is that the estimation is less 
 accurate in case of rectangular contours, compared to elliptic ones. But 
 of course, with ellipses it is not possible to fully cover the complex 
 plane unless there is some overlap.

 Jose


> On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
>
> Hello everyone,
>
> this is a question about  SLEPc.
>
> The problem that I need to solve is as follows.
>
> I have a matrix and I need a full spectrum of it (both eigenvalues and
> eigenvectors).
>
> The regular way is to use LAPACK, but it is slow. I decided to try the
> following:
>
> a) compute the bounds of the spectrum using the Krylov-Schur approach.
>
> b) divide the complex eigenvalue plane into rectangular areas, then
> apply CISS to each area in parallel.
>
> However, I found that the solver is missing some eigenvalues, even if my
> rectangles cover the whole spectral area.
>
> My question: can this approach work in principle? If yes, how can one
> set up the CISS solver so that it does not lose eigenvalues?
>
> Thank you,
>
> Michael.
>


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



Re: [petsc-users] question about CISS

2019-08-29 Thread Jed Brown via petsc-users
Elemental also has distributed-memory eigensolvers that should be at
least as good as ScaLAPACK's.  There is support for Elemental in PETSc,
but not yet in SLEPc.
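For anyone wanting to try that route, a sketch of the configure side (option names as in PETSc 3.11; `my_app` is a placeholder):

```shell
# Build PETSc with Elemental support (Elemental needs C++11):
./configure --download-elemental --with-cxx-dialect=C++11

# Then select the distributed dense matrix format at run time:
./my_app -mat_type elemental
```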

"Povolotskyi, Mykhailo via petsc-users"  writes:

> Thank you for the suggestion.
>
> Is it interfaced to SLEPc?
>
>
> On 08/29/2019 04:14 PM, Jose E. Roman wrote:
>> I am not an expert in contour integral eigensolvers. I think difficulties 
>> come with corners, so ellipses are the best choice. I don't think ring 
>> regions are relevant here.
>>
>> Have you considered using ScaLAPACK? Some time ago we were able to address
>> problems of size up to 400k: https://doi.org/10.1017/jfm.2016.208
>>
>> Jose
>>
>>
>>> On 29 Aug 2019, at 21:55, Povolotskyi, Mykhailo 
>>> wrote:
>>>
>>> Thank you, Jose,
>>>
>>> what about rings? Are they better than rectangles?
>>>
>>> Michael.
>>>
>>>
>>> On 08/29/2019 03:44 PM, Jose E. Roman wrote:
 The CISS solver is supposed to estimate the number of eigenvalues 
 contained in the contour. My impression is that the estimation is less 
 accurate in case of rectangular contours, compared to elliptic ones. But 
 of course, with ellipses it is not possible to fully cover the complex 
 plane unless there is some overlap.

 Jose


> On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users
>  wrote:
>
> Hello everyone,
>
> this is a question about  SLEPc.
>
> The problem that I need to solve is as follows.
>
> I have a matrix and I need a full spectrum of it (both eigenvalues and
> eigenvectors).
>
> The regular way is to use LAPACK, but it is slow. I decided to try the
> following:
>
> a) compute the bounds of the spectrum using the Krylov-Schur approach.
>
> b) divide the complex eigenvalue plane into rectangular areas, then
> apply CISS to each area in parallel.
>
> However, I found that the solver is missing some eigenvalues, even if my
> rectangles cover the whole spectral area.
>
> My question: can this approach work in principle? If yes, how can one
> set up the CISS solver so that it does not lose eigenvalues?
>
> Thank you,
>
> Michael.
>


Re: [petsc-users] question about CISS

2019-08-29 Thread Povolotskyi, Mykhailo via petsc-users
Thank you for the suggestion.

Is it interfaced to SLEPc?


On 08/29/2019 04:14 PM, Jose E. Roman wrote:
> I am not an expert in contour integral eigensolvers. I think difficulties 
> come with corners, so ellipses are the best choice. I don't think ring 
> regions are relevant here.
>
> Have you considered using ScaLAPACK? Some time ago we were able to address
> problems of size up to 400k: https://doi.org/10.1017/jfm.2016.208
>
> Jose
>
>
>> On 29 Aug 2019, at 21:55, Povolotskyi, Mykhailo 
>> wrote:
>>
>> Thank you, Jose,
>>
>> what about rings? Are they better than rectangles?
>>
>> Michael.
>>
>>
>> On 08/29/2019 03:44 PM, Jose E. Roman wrote:
>>> The CISS solver is supposed to estimate the number of eigenvalues contained 
>>> in the contour. My impression is that the estimation is less accurate in 
>>> case of rectangular contours, compared to elliptic ones. But of course, 
>>> with ellipses it is not possible to fully cover the complex plane unless 
>>> there is some overlap.
>>>
>>> Jose
>>>
>>>
 On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users
  wrote:

 Hello everyone,

 this is a question about  SLEPc.

 The problem that I need to solve is as follows.

 I have a matrix and I need a full spectrum of it (both eigenvalues and
 eigenvectors).

 The regular way is to use LAPACK, but it is slow. I decided to try the
 following:

 a) compute the bounds of the spectrum using the Krylov-Schur approach.

 b) divide the complex eigenvalue plane into rectangular areas, then
 apply CISS to each area in parallel.

 However, I found that the solver is missing some eigenvalues, even if my
 rectangles cover the whole spectral area.

 My question: can this approach work in principle? If yes, how can one
 set up the CISS solver so that it does not lose eigenvalues?

 Thank you,

 Michael.




Re: [petsc-users] question about CISS

2019-08-29 Thread Jose E. Roman via petsc-users
I am not an expert in contour integral eigensolvers. I think difficulties come 
with corners, so ellipses are the best choice. I don't think ring regions are 
relevant here.
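In code, the region is attached to the CISS solver through SLEPc's RG object; a rough fragment (it assumes an existing matrix `A`, the center/radius/vscale values are placeholders, and error checking is omitted):

```c
EPS eps;
RG  rg;
EPSCreate(PETSC_COMM_WORLD, &eps);
EPSSetOperators(eps, A, NULL);
EPSSetType(eps, EPSCISS);
EPSGetRG(eps, &rg);                         /* region of the complex plane */
RGSetType(rg, RGELLIPSE);
RGEllipseSetParameters(rg, 0.0, 1.0, 1.0);  /* center, radius, vscale */
EPSSolve(eps);
```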

Have you considered using ScaLAPACK? Some time ago we were able to address
problems of size up to 400k: https://doi.org/10.1017/jfm.2016.208

Jose


> On 29 Aug 2019, at 21:55, Povolotskyi, Mykhailo 
> wrote:
> 
> Thank you, Jose,
> 
> what about rings? Are they better than rectangles?
> 
> Michael.
> 
> 
> On 08/29/2019 03:44 PM, Jose E. Roman wrote:
>> The CISS solver is supposed to estimate the number of eigenvalues contained 
>> in the contour. My impression is that the estimation is less accurate in 
>> case of rectangular contours, compared to elliptic ones. But of course, with 
>> ellipses it is not possible to fully cover the complex plane unless there is 
>> some overlap.
>> 
>> Jose
>> 
>> 
>>> On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users
>>>  wrote:
>>> 
>>> Hello everyone,
>>> 
>>> this is a question about  SLEPc.
>>> 
>>> The problem that I need to solve is as follows.
>>> 
>>> I have a matrix and I need a full spectrum of it (both eigenvalues and
>>> eigenvectors).
>>> 
>>> The regular way is to use LAPACK, but it is slow. I decided to try the
>>> following:
>>> 
>>> a) compute the bounds of the spectrum using the Krylov-Schur approach.
>>> 
>>> b) divide the complex eigenvalue plane into rectangular areas, then
>>> apply CISS to each area in parallel.
>>> 
>>> However, I found that the solver is missing some eigenvalues, even if my
>>> rectangles cover the whole spectral area.
>>> 
>>> My question: can this approach work in principle? If yes, how can one
>>> set up the CISS solver so that it does not lose eigenvalues?
>>> 
>>> Thank you,
>>> 
>>> Michael.
>>> 
> 



Re: [petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
Thanks for the quick response. Attached please find the configure.log
containing the configure error.

Regarding our dup, our wrapper does support it. In fact, everything works
fine on Linux. I suspect on windows, PETSc picks the system mpi.h somehow.
I am investigating it.
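One way to check that suspicion (a sketch; the include path is taken from the configure line quoted later in this thread): preprocess a one-line probe with MSVC's /showIncludes and see which mpi.h gets opened first.

```shell
echo '#include <mpi.h>' > probe.c
cl /nologo /c /showIncludes \
   /I/home/xianzhongg/dev/star/base/src/mpi/include probe.c | grep -i mpi.h
```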

Thanks,
Sam

On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley  wrote:

> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Dear PETSc dev team,
>>I am looking for some tips on porting PETSc to Windows. We have our own MPI
>> wrapper (so we can switch between MPI implementations). I configure PETSc using
>> --with-mpi-lib and --with-mpi-include
>>  ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
>> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
>> --with-shared-libaries=1
>>
>> But I got error
>>
>> ===
>>  Configuring PETSc to compile on your system
>>
>> ===
>> TESTING: check from
>> config.libraries(config/BuildSystem/config/libraries.py:154)
>> ***
>>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
>> details):
>>
>> ---
>> --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
>> and
>> --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
>> not work
>>
>> ***
>>
>
> Your MPI wrapper should pass the tests here. Send the configure.log
>
>
>> To fix the configuration error,  in config/BuildSystem/config/package.py,
>> I removed
>>  self.executeTest(self.checkDependencies)
>>  self.executeTest(self.configureLibrary)
>>  self.executeTest(self.checkSharedLibrary)
>>
>> To link, I add my mpi wrapper
>> to ${PTESTC_ARCH}/lib/petsc/conf/petscvariables:
>> PCC_LINKER_FLAGS =-MD -wd4996 -Z7
>> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>>
>> I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
>> code, PETSc somehow creates a duplicate communicator with only 1 MPI
>> process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
>> (our MPI_COMM_WORLD), PETSc hangs.
>>
>
> We do dup the communicator on entry. Shouldn't that be supported by your
> wrapper?
>
>   Thanks,
>
>  Matt
>
>
>> I am wondering if you could give me some tips on how to debug this problem.
>>
>> BR,
>> Sam
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


configure.log
Description: Binary data


Re: [petsc-users] question about CISS

2019-08-29 Thread Povolotskyi, Mykhailo via petsc-users
Thank you, Jose,

What about rings? Are they better than rectangles?

Michael.


On 08/29/2019 03:44 PM, Jose E. Roman wrote:
> The CISS solver is supposed to estimate the number of eigenvalues contained
> in the contour. My impression is that the estimation is less accurate for
> rectangular contours than for elliptic ones. But of course, with ellipses it
> is not possible to fully cover the complex plane unless there is some overlap.
>
> Jose
>   
>
>> El 29 ago 2019, a las 20:56, Povolotskyi, Mykhailo via petsc-users 
>>  escribió:
>>
>> Hello everyone,
>>
>> this is a question about SLEPc.
>>
>> The problem that I need to solve is as follows.
>>
>> I have a matrix and I need a full spectrum of it (both eigenvalues and
>> eigenvectors).
>>
>> The regular way is to use LAPACK, but it is slow. I decided to try the
>> following:
>>
>> a) compute the bounds of the spectrum using the Krylov-Schur approach.
>>
>> b) divide the complex eigenvalue plane into rectangular areas, then
>> apply CISS to each area in parallel.
>>
>> However, I found that the solver is missing some eigenvalues, even if my
>> rectangles cover the whole spectral area.
>>
>> My question: can this approach work in principle? If yes, how can one
>> set up the CISS solver so that it does not lose eigenvalues?
>>
>> Thank you,
>>
>> Michael.
>>



Re: [petsc-users] question about CISS

2019-08-29 Thread Jose E. Roman via petsc-users
The CISS solver is supposed to estimate the number of eigenvalues contained in
the contour. My impression is that the estimation is less accurate for
rectangular contours than for elliptic ones. But of course, with ellipses it is
not possible to fully cover the complex plane unless there is some overlap.

Jose
 

> El 29 ago 2019, a las 20:56, Povolotskyi, Mykhailo via petsc-users 
>  escribió:
> 
> Hello everyone,
> 
> this is a question about SLEPc.
> 
> The problem that I need to solve is as follows.
> 
> I have a matrix and I need a full spectrum of it (both eigenvalues and 
> eigenvectors).
> 
> The regular way is to use LAPACK, but it is slow. I decided to try the 
> following:
> 
> a) compute the bounds of the spectrum using the Krylov-Schur approach.
> 
> b) divide the complex eigenvalue plane into rectangular areas, then 
> apply CISS to each area in parallel.
> 
> However, I found that the solver is missing some eigenvalues, even if my 
> rectangles cover the whole spectral area.
> 
> My question: can this approach work in principle? If yes, how can one 
> set up the CISS solver so that it does not lose eigenvalues?
> 
> Thank you,
> 
> Michael.
> 

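[Editor's note: the following is an illustrative, pure-Python sketch of the geometric point Jose makes above; it is not SLEPc code, and all names in it are ours.] Axis-aligned rectangles tile a bounding box exactly, while ellipses inscribed in the same cells leave gaps at the cell corners, so neighbouring elliptic contours must overlap to cover the plane:

```python
# Illustrative sketch (not SLEPc API): rectangles tile the complex plane,
# inscribed ellipses do not -- they need overlap to close the corner gaps.

def in_rect(z, cx, cy, hw, hh):
    """True if complex point z lies in the rectangle centred at (cx, cy)
    with half-widths hw, hh."""
    return abs(z.real - cx) <= hw and abs(z.imag - cy) <= hh

def in_ellipse(z, cx, cy, a, b):
    """True if z lies in the axis-aligned ellipse with semi-axes a, b."""
    return ((z.real - cx) / a) ** 2 + ((z.imag - cy) / b) ** 2 <= 1.0

# Partition the box [0,2] x [0,2] of the complex plane into four unit cells.
centers = [(0.5, 0.5), (1.5, 0.5), (0.5, 1.5), (1.5, 1.5)]

z = 1.0 + 1.0j  # an eigenvalue sitting at the shared corner of all four cells

covered_by_rects = any(in_rect(z, cx, cy, 0.5, 0.5) for cx, cy in centers)
covered_by_ellipses = any(in_ellipse(z, cx, cy, 0.5, 0.5) for cx, cy in centers)

print(covered_by_rects)     # True: the rectangles tile the box exactly
print(covered_by_ellipses)  # False: inscribed ellipses miss the corners

# Enlarging the ellipses (i.e. letting neighbouring contours overlap) closes
# the gap, at the cost of some eigenvalues appearing in more than one region.
covered_with_overlap = any(in_ellipse(z, cx, cy, 0.75, 0.75) for cx, cy in centers)
print(covered_with_overlap)  # True
```

This is the trade-off Jose describes: elliptic contours behave better numerically, but covering the whole spectral area with them forces overlap and hence some duplicated eigenvalues that must be filtered afterwards.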


[petsc-users] petsc on windows

2019-08-29 Thread Sam Guo via petsc-users
Dear PETSc dev team,
   I am looking for tips on porting PETSc to Windows. We have our own MPI
wrapper (so we can switch between different MPI implementations). I configure
PETSc using --with-mpi-lib and --with-mpi-include
 ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
--with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
--with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
--with-shared-libaries=1

But I got error
===
 Configuring PETSc to compile on your system
===
TESTING: check from
config.libraries(config/BuildSystem/config/libraries.py:154)
***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
details):
---
--with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
and
--with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
not work
***

To fix the configuration error,  in config/BuildSystem/config/package.py, I
removed
 self.executeTest(self.checkDependencies)
 self.executeTest(self.configureLibrary)
 self.executeTest(self.checkSharedLibrary)

To link, I add my mpi wrapper
to ${PTESTC_ARCH}/lib/petsc/conf/petscvariables:
PCC_LINKER_FLAGS =-MD -wd4996 -Z7
/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
code, PETSc somehow creates a duplicate communicator with only 1 MPI
process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
(our MPI_COMM_WORLD), PETSc hangs.

I am wondering if you could give me some tips on how to debug this problem.

BR,
Sam


[petsc-users] question about CISS

2019-08-29 Thread Povolotskyi, Mykhailo via petsc-users
Hello everyone,

this is a question about SLEPc.

The problem that I need to solve is as follows.

I have a matrix and I need a full spectrum of it (both eigenvalues and 
eigenvectors).

The regular way is to use LAPACK, but it is slow. I decided to try the 
following:

a) compute the bounds of the spectrum using the Krylov-Schur approach.

b) divide the complex eigenvalue plane into rectangular areas, then 
apply CISS to each area in parallel.

However, I found that the solver is missing some eigenvalues, even if my 
rectangles cover the whole spectral area.

My question: can this approach work in principle? If yes, how can one 
set up the CISS solver so that it does not lose eigenvalues?

Thank you,

Michael.
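
[Editor's note: a pure-Python sketch of one bookkeeping hazard in step (b) above; it is not SLEPc code, and all names in it are ours.] If the rectangles are treated as open sets, an eigenvalue sitting exactly on a shared edge belongs to no region and is silently lost. Assigning eigenvalues to half-open cells gives each one exactly one owner; in practice one would also enlarge each CISS region slightly, since the contour-integral quadrature is least accurate for eigenvalues near the contour itself:

```python
# Illustrative sketch (not SLEPc API): half-open cells [x, x+dx) x [y, y+dy)
# assign every eigenvalue to exactly one rectangle, so none fall through
# the cracks between neighbouring regions.

def cell_index(z, x0, y0, dx, dy, nx, ny):
    """Map eigenvalue z to the half-open grid cell containing it, or None
    if z lies outside the nx-by-ny grid anchored at (x0, y0)."""
    i = int((z.real - x0) // dx)
    j = int((z.imag - y0) // dy)
    if 0 <= i < nx and 0 <= j < ny:
        return (i, j)
    return None

# A 2x2 grid of unit cells over [0,2) x [0,2); two of the eigenvalues sit
# exactly on cell edges, where an open-set convention would drop them.
eigs = [0.5 + 0.5j, 1.0 + 0.25j, 1.0 + 1.0j]

assigned = {}
for z in eigs:
    idx = cell_index(z, 0.0, 0.0, 1.0, 1.0, 2, 2)
    assigned.setdefault(idx, []).append(z)

# Every eigenvalue lands in exactly one cell; none are dropped or duplicated.
total = sum(len(v) for v in assigned.values())
print(total)  # 3
```

This only fixes the assignment convention; eigenvalues that the CISS quadrature itself misses near a contour still require overlapping or slightly enlarged regions, as discussed in the replies above.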