Hi Evan,
I'm one of those users who have multiple shared libraries.
It is quite stable, since I create wrappers around the PETSc functions and hide
the native PETSc symbols.
But, yes, interoperation is not possible.
Michael.
Hello,
I want to use CISS for my eigenvalue problem. To test it, I tried a
simple (2x2) matrix case, but the code failed with the following message:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Error in
> that the estimation is less accurate in case
> of rectangular contours, compared to elliptic ones. But of course, with
> ellipses it is not possible to fully cover the complex plane unless there is
> some overlap.
>
> Jose
>
>
>> On 29 Aug 2019, at 20:56, Povolotskyi, Mykhailo via petsc-users wrote:
>>> that the estimation is less accurate in
>>> case of rectangular contours, compared to elliptic ones. But of course,
>>> with ellipses it is not possible to fully cover the complex plane unless
>>> there is some overlap.
>>>
>>> Jose
Hello everyone,
This is a question about SLEPc.
The problem that I need to solve is as follows.
I have a matrix and I need its full spectrum (both eigenvalues and
eigenvectors).
The regular way is to use LAPACK, but it is slow. I decided to try the
following:
a) compute the bounds of
support for Elemental in PETSc,
but not yet in SLEPc.
Also, if it's symmetric, isn't https://elpa.mpcdf.mpg.de/ fairly scalable?
Matt
"Povolotskyi, Mykhailo via petsc-users" <petsc-users@mcs.anl.gov> writes:
> Thank you for suggestion.
>
> Is it interfaced to SLEPc?
Hello,
I'm currently using CISS via SLEPc.
I would like to use parallelization by partitioning over quadrature points,
using the option -eps_ciss_partitions.
I have done the following:
1. created a matrix on each MPI rank
MatCreateDense(MPI_COMM_SELF, matrix_size, matrix_size,
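For step 1 and the run itself, a hedged sketch of how such a partitioned CISS run might be launched (the executable name and contour parameters are assumptions, not taken from the original mail):

```shell
# Sketch only: program name and region parameters are hypothetical.
# -eps_ciss_partitions splits the contour's quadrature points among
# sub-communicators, so 4 ranks with 4 partitions handle one point each.
mpiexec -n 4 ./my_ciss_solver \
    -eps_type ciss \
    -eps_ciss_partitions 4 \
    -rg_type ellipse -rg_ellipse_center 0.0 -rg_ellipse_radius 1.0
```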
ly recommended. The ring region is used
> for special situations, such as the one in our joint paper
> https://doi.org/10.1007/978-3-319-62426-6_2 where the eigenvalues lie on the
> unit circle but we want to avoid eigenvalues close to the origin.
>
> Jose
>
>
>> On 6 Sept
Dear PETSc developers,
I found that MatCreateRedundantMatrix does not support dense matrices.
This causes the following problem: I cannot use the CISS eigensolver from
SLEPc with dense matrices with parallelization over quadrature points.
Is it possible for you to add this support?
Thank you,
Hello,
I'm upgrading PETSc from 3.8 to 3.11.
In doing so, I see an error message:
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
---
Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
.
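One possible way around the conflict (a sketch under the assumption that SuperLU_DIST is still wanted): keep 64-bit PetscInt, but let configure download a BLAS/LAPACK with 32-bit indices, since SuperLU_DIST cannot be combined with 64-bit BLAS/LAPACK indices.

```shell
# Hedged sketch: any site-specific compiler/path options are omitted.
# 64-bit PetscInt is fine; only the BLAS/LAPACK index width conflicts
# with SuperLU_DIST, so use a 32-bit-index BLAS/LAPACK build.
./configure --with-64-bit-indices=1 \
            --download-fblaslapack \
            --download-superlu_dist
```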
On 09/20/2019 03:53 PM, Balay, Satish wrote:
> --with-64-bit-indices=1 => PetscInt = int64_t
> --known-64-bit-blas-indices=1 => blas specified uses 64bit indices.
>
> What is your requirement (use case)?
>
> Satish
>
> On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>>
>> error disappeared, but it still exists (I had to
>> wait longer).
>>
>> The configuration log can be accessed here:
>>
>> https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0
>>
>> Sorry for the last e-mail.
>>
>> Michael.
>>
>>
Does it mean I have to configure PETSc with --with-64-bit-indices=1?
On 09/20/2019 03:41 PM, Matthew Knepley wrote:
On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Hello,
I'm upgrading PETSc from 3.8 to 3.11.
In doing so,
>> On 19 Sept 2019, at 6:20, hong--- via petsc-users
>> wrote:
>>
>> Michael,
>> We have support of MatCreateRedundantMatrix for dense matrices. For example,
>> petsc/src/mat/examples/tests/ex9.c:
>> mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
>>
>> Hong
>>
>> On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users
>>
k=0
> <<<<<<<
>
> And configure completed successfully. What issue are you encountering? Why do
> you think its activating MPI?
>
> Satish
>
>
> On Tue, 19 Nov 2019, Balay, Satish via petsc-users wrote:
>
>> On Tue, 19 Nov 2019, Povolotskyi, Mykh
On 11/19/2019 2:47 PM, Balay, Satish wrote:
> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Hello,
>>
>> I'm trying to build PETSc without MPI.
>>
>> Even if I specify --with_mpi=0, the configuration script still activates
>
> configure completed successfully. What issue are you encountering? Why do
> you think its activating MPI?
>
> Satish
>
>
> On Tue, 19 Nov 2019, Balay, Satish via petsc-users wrote:
>
>> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>>
>>> Hello,
Thanks,
Matt
On Tue, Nov 19, 2019 at 2:53 PM Povolotskyi, Mykhailo via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Why did it not work then?
On 11/19/2019 2:51 PM, Balay, Satish wrote:
> And I see from configure.log - you are using the correct option.
>
> Co
in configure is
> enabled]
>
> Satish
>
> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Let me explain the problem.
>>
>> This log file has
>>
>> #ifndef PETSC_HAVE_MPI
>> #define PETSC_HAVE_MPI 1
>> #endif
>>
Dear PETSc developers,
in my application I have to solve millions of linear and non-linear
systems with small matrices (2x2, 3x3, ..., 10x10).
I treat them as dense, and use SNES with the KSP method PREONLY and an LU
preconditioner.
I found that when KSPSolve is called, only 25% of the time is spent
Hi Matthew,
is it possible to do in principle what I would like to do?
On 9/25/2019 3:12 AM, Matthew Knepley wrote:
On Wed, Sep 25, 2019 at 1:27 AM Povolotskyi, Mykhailo via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Dear Petsc developers,
in my application I have to solve mi