Thanks for the fast answer!
The error coming from MUMPS is:
On return from DMUMPS, INFOG(1)= -9
On return from DMUMPS, INFOG(2)=29088157
The matrix size is 4972410 x 4972410.
I need only one eigenvalue, the one nearest zero.
To get more precision, I set ncv to 500.
I'm
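For context, INFOG(1) = -9 means MUMPS ran out of internal workspace, and INFOG(2) reports the number of additional entries it needed. A common first step, sketched below for a SLEPc shift-and-invert run (the executable name is a placeholder, and the option names assume PETSc's MUMPS interface, 3.9 or later), is to raise the workspace relaxation ICNTL(14) at run time:

    # allow MUMPS 80% extra workspace over its estimate (default is around 20%)
    ./eigensolver -st_ksp_type preonly -st_pc_type lu \
                  -st_pc_factor_mat_solver_type mumps \
                  -mat_mumps_icntl_14 80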
Thank you. It is clear now.
On 11/19/2019 3:07 PM, Balay, Satish wrote:
> Not sure why you are looking at this flag and interpreting it - PETSc code
> uses the flag PETSC_HAVE_MPIUNI to check for a sequential build.
>
> [this one states that the MPI module, similar to BLASLAPACK etc. in configure,
> is enabled]
I see.
Actually, my goal is to compile PETSc without real MPI in order to use it with libmesh.
You are saying that PETSC_HAVE_MPI is not a sign that PETSc is built with MPI.
It means you have MPIUNI, which is serial code that presents an MPI interface.
Correct?
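If so, the reliable check from C mirrors what the PETSc sources themselves do; a minimal sketch:

    #include <petscsys.h>   /* pulls in petscconf.h */
    #if defined(PETSC_HAVE_MPIUNI)
      /* sequential build: MPI calls are MPIUNI serial stubs */
    #else
      /* built against a real MPI implementation */
    #endif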
On 11/19/2019 3:00 PM, Matthew Knepley wrote:
Not sure why you are looking at this flag and interpreting it - PETSc code uses
the flag PETSC_HAVE_MPIUNI to check for a sequential build.
[this one states that the MPI module, similar to BLASLAPACK etc. in configure,
is enabled]
Satish
On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
Let me explain the problem.
This log file has
#ifndef PETSC_HAVE_MPI
#define PETSC_HAVE_MPI 1
#endif
while I need PETSc without MPI.
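Note: PETSC_HAVE_MPI is defined even in a serial build, because MPIUNI provides the MPI interface. Assuming an in-place build, a quick way to tell which build you have:

    grep MPIUNI $PETSC_DIR/$PETSC_ARCH/include/petscconf.h
    # a serial (MPIUNI) build prints: #define PETSC_HAVE_MPIUNI 1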
On 11/19/2019 2:55 PM, Matthew Knepley wrote:
The log you sent shows configure completing successfully. Please retry and send
the log for a failed run.
Why did it not work, then?
On 11/19/2019 2:51 PM, Balay, Satish wrote:
> And I see from configure.log - you are using the correct option.
>
> Configure Options: --configModules=PETSc.Configure
> --optionsModule=config.compilerOptions --with-scalar-type=real --with-x=0
> --with-hdf5=0
On 11/19/2019 2:47 PM, Balay, Satish wrote:
> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Hello,
>>
>> I'm trying to build PETSc without MPI.
>>
>> Even if I specify --with_mpi=0, the configuration script still activates
>> MPI.
>>
>> I attach the configure.log.
>>
>>
And I see from configure.log - you are using the correct option.
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --with-scalar-type=real --with-x=0
--with-hdf5=0 --with-single-library=1 --with-shared-libraries=0 --with-log=0
--with-mpi=0
On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
> Hello,
>
> I'm trying to build PETSc without MPI.
>
> Even if I specify --with_mpi=0, the configuration script still activates
> MPI.
>
> I attach the configure.log.
>
> What am I doing wrong?
The option is --with-mpi=0 (hyphen, not underscore).
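With that corrected, the full configure line matching the options in the log above would read:

    ./configure --with-scalar-type=real --with-x=0 --with-hdf5=0 \
                --with-single-library=1 --with-shared-libraries=0 \
                --with-log=0 --with-mpi=0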
Are you getting an error from MUMPS or from BV? What is the error message you
get? What is the size of the matrix? How many eigenvalues do you need to
compute?
In principle you can use any KSP+PC, see section 3.4.1 of the users manual. If
you have a good preconditioner, then an alternative to
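A sketch of such a run, replacing the MUMPS factorization with an iterative inner solve (the solver, preconditioner, and tolerance here are placeholders to tune for your problem):

    ./eigensolver -eps_nev 1 -eps_target 0.0 -st_type sinvert \
                  -st_ksp_type gmres -st_pc_type bjacobi -st_ksp_rtol 1e-9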
Hi all,
I'm trying to solve a huge generalized (unsymmetric) eigenvalue problem
with SLEPc + MUMPS. We failed to allocate the memory requested for the
MUMPS factorization (we also tried BVVECS).
We would like to know if there is an alternate iterative way of solving
such problems.
Thank you,
Thanks for the fix. https://gitlab.com/petsc/petsc/pipelines/96957999
> On Nov 14, 2019, at 2:04 PM, hg wrote:
>
> Hello
>
> It turns out that hwloc is not installed on the cluster system that I'm
> using. Without hwloc, pastix will run into the branch using sched_setaffinity
> and
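In case it helps: PETSc's configure can download and build hwloc (and pastix) itself when they are not installed system-wide, e.g. (to be combined with your other options):

    ./configure --download-hwloc --download-pastix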
For a while I had put in an incorrect URL in the download location.
Perhaps you are using PETSc 3.12.0 and need to use 3.12.1 from
https://www.mcs.anl.gov/petsc/download/index.html
Otherwise please send configure.log
> On Nov 19, 2019, at 4:40 AM, Santiago Andres Triana via petsc-users wrote:
Hello petsc-users:
I found this error when configure tries to download fblaslapack:
***
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
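If this is the broken download URL mentioned above, moving to 3.12.1 should fix it; configure also accepts a locally downloaded tarball (the path below is a placeholder):

    ./configure --download-fblaslapack=/path/to/fblaslapack.tar.gz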