ices=1
>
> Satish
>
> On Tue, 2 Jun 2020, Evan Um wrote:
>
> > Dear PETSC users,
> >
> > Using C++, I tried to build a very large sparse matrix and had a problem
> > shown below. Indices of the matrix should be "long int" rather than "int"
Dear PETSC users,
Using C++, I tried to build a very large sparse matrix and ran into the problem
shown below. The indices of the matrix need to be "long int" rather than "int",
but PETSC does not allow this. How can I work around this problem? Thanks for
your comments.
Regards,
Evan
fe.cpp(2009): error:
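A minimal sketch of the usual fix, assuming PETSc is reconfigured with
--with-64-bit-indices=1 (the option Satish's reply points to) so that PetscInt
becomes 64-bit; the global size below is a placeholder:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      A;
  PetscInt n = 3000000000; /* hypothetical global size beyond 2^31-1 */

  PetscInitialize(&argc, &argv, NULL, NULL);
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetType(A, MATMPIAIJ);
  MatSetUp(A);
  /* pass PetscInt (not int or long int) index arrays to MatSetValues */
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

The key point is to declare every index as PetscInt, whose width follows the
configure option, rather than hard-coding int or long int in application code.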
Dear PETSC users,
My colleague and I found a complex-number-related error (in the arguments
passed to PetscMax and PetscMin) during PETSC installation. If you have had a
similar experience, could you suggest a troubleshooting approach?
Thanks,
Evan
Installation configuration:
d=${HOME}/usr/lib/petsc-3.9.3
Configuring incomplete, errors occurred!
See also
"/home/evan/CLionProjects/hellopetsc/cmake-build-debug/CMakeFiles/CMakeOutput.log".
See also
"/home/evan/CLionProjects/hellopetsc/cmake-build-debug/CMakeFiles/CMakeError.log".
[Finished]
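If the failure is the usual one with complex scalars, a hedged sketch of the
workaround: PetscMax and PetscMin are plain comparison macros, and complex
numbers have no ordering, so arguments must first be projected to a real value.
This assumes a complex-scalar PETSc build; both helper names are illustrative:

#include <petscsys.h>

/* With PETSC_USE_COMPLEX, a < b does not compile for PetscScalar,
   so compare a real projection instead. */
static PetscReal max_abs(PetscScalar a, PetscScalar b)
{
  return PetscMax(PetscAbsScalar(a), PetscAbsScalar(b));
}

static PetscReal max_real(PetscScalar a, PetscScalar b)
{
  return PetscMax(PetscRealPart(a), PetscRealPart(b));
}

Whether magnitude or real part is the right projection depends on what the
comparison means in context.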
On Fri, May 11, 2018 at 3:10 PM, Je
Matthew Knepley <knep...@gmail.com> wrote:
> On Fri, May 11, 2018 at 5:06 PM, Evan Um <eva...@gmail.com> wrote:
>
>> Hi Jed,
>>
>> Thanks for the comment. I added the module but still saw errors (before I
>> arrived here, I had to do cp -r /home/e
--
On Fri, May 11, 2018 at 12:08 PM, Jed Brown <j...@jedbrown.org> wrote:
> Yes, it depends on this module from the same repository.
>
> Note that you can use pkg-config to find PETSc these days.
>
> Evan Um <eva...@gmail.com> writes:
>
> > Hi Stefa
e/evan/CLionProjects/hellopetsc/cmake-build-debug/CMakeFiles/CMakeError.log".
[Finished]
On Fri, May 11, 2018 at 11:34 AM, Stefano Zampini <stefano.zamp...@gmail.com
> wrote:
> CMake is case-sensitive here. You should use find_package(PETSc ….)
>
>
> > On May 11, 2018, a
Hi,
I would like to ask a question about FindPETSc.cmake. I placed the cmake
file in the same directory as main.cpp. I also placed the file
in /usr/share/cmake_xx/Modules. Where should I put the file? What else
should I do to use the file in cmake? Do I need any other lines in my
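A minimal CMakeLists.txt sketch of one common arrangement, assuming
FindPETSc.cmake (plus the companion modules it depends on from the same
repository) sits in a cmake/ subdirectory next to main.cpp; all names here are
illustrative:

cmake_minimum_required(VERSION 3.1)
project(hellopetsc CXX)

# Make find_package aware of the directory holding FindPETSc.cmake
list(APPEND CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR}/cmake)

find_package(PETSc REQUIRED)   # case-sensitive: PETSc, not PETSC

add_executable(main main.cpp)
target_include_directories(main PRIVATE ${PETSC_INCLUDES})
target_link_libraries(main ${PETSC_LIBRARIES})

Per Jed's note above, recent PETSc installs also ship a pkg-config file
(lib/pkgconfig/PETSc.pc), so CMake's FindPkgConfig is an alternative to the
module.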
Dear PETSC users,
I was wondering if anyone has already tried/developed an induced dimension
reduction (IDR) solver for PETSC? I think it would be a useful one, but I
couldn't find an example of it with PETSC. If you have any idea about IDR
routines for PETSC, please let me know. Thanks!
Best,
Evan
Hi Hong,
Thanks for your reply. When I use standalone MUMPS for a symmetric matrix,
I pass an upper/lower triangular part to MUMPS. To do this, I use id.sym=2.
Under the PETSC environment,
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, A, A);
KSPSetType(ksp, KSPPREONLY);
KSPGetPC(ksp, &pc);
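A hedged sketch of how this setup typically continues when the goal is the
analogue of id.sym=2, i.e. a symmetric MUMPS factorization: request Cholesky
rather than LU and let PETSc's MUMPS interface pick the symmetric driver.
PCFactorSetMatSolverType is the current name; older releases spell it
PCFactorSetMatSolverPackage. With pc obtained from KSPGetPC above:

PCSetType(pc, PCCHOLESKY);                    /* symmetric factorization */
PCFactorSetMatSolverType(pc, MATSOLVERMUMPS); /* factor with MUMPS */
KSPSetFromOptions(ksp);
KSPSolve(ksp, b, x);

Storing A as MATSBAIJ keeps only the upper triangle in memory; PETSc derives
MUMPS's sym setting from the factorization type and the matrix options.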
Dear PETSC-users,
I use the parallel direct solver MUMPS inside PETSC and need to control some
MUMPS parameters from within PETSC. For example, I want to set the MUMPS
parameters shown below.
ZMUMPS_STRUC_C id;
id.job=-1; /* Initialize mumps instance*/
id.par=1; /* 0: host is not involved in
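For completeness, a sketch of the supported way to reach these controls through
PETSc, which never exposes ZMUMPS_STRUC_C directly: every ICNTL/CNTL is
available as a runtime option of the form -mat_mumps_icntl_<k> <v>, or
programmatically on the factored matrix. Function names follow current PETSc
(older releases use the ...MatSolverPackage spellings), and the specific
ICNTL/CNTL values are placeholders:

Mat F;  /* the factored matrix wrapping the MUMPS instance */
PC  pc;
KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);
PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
PCFactorSetUpMatSolverType(pc);  /* create F so it can be queried */
PCFactorGetMatrix(pc, &F);
MatMumpsSetIcntl(F, 7, 2);       /* e.g. ICNTL(7): ordering choice */
MatMumpsSetCntl(F, 1, 0.01);     /* e.g. CNTL(1): pivot threshold */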
wrote:
>
>
> On Wed, Oct 11, 2017 at 11:14 AM, Evan Um <eva...@gmail.com> wrote:
>
>> Hi Hong,
>>
>> Thanks for your kind email. I write another email to make sure I
>> understand it correctly. Does the zipped file from
>> https://www.mcs.anl.gov/petsc/deve
>>
>> Evan
>>
>>
>>
>>
>> On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley <knep...@gmail.com>
>> wrote:
>>
>>> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um <eva...@gmail.com> wrote:
>>>
>>>> Dear Hong,
m the site a little bit in detail? Thank
you very much for your help.
Evan
On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley <knep...@gmail.com> wrote:
> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um <eva...@gmail.com> wrote:
>
>> Dear Hong,
>>
>> I just tried to ch
.1-cntl
>
> You may give it a try. Once it passes our regression tests, I'll merge it
> to petsc master branch.
>
> Hong
>
>
> On Sun, Sep 24, 2017 at 8:08 PM, Hong <hzh...@mcs.anl.gov> wrote:
>
>> I'll check it.
>> Hong
>>
>> On Sun, Se
was
wondering if we can still use the BLR approximation as a preconditioner for
Krylov solvers.
Best,
Evan
On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith <bsm...@mcs.anl.gov> wrote:
>
> > On Sep 23, 2017, at 8:38 PM, Evan Um <eva...@gmail.com> wrote:
> >
> > Dear PETS
Dear PETSC Users,
My system matrix comes from finite element modeling and is complex and
unstructured. Its typical size is a few million by a few million. I was
wondering if I can use the MUMPS parallel direct solver as a preconditioner in
PETSC. For example, I want to pass factored matrices to Krylov
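A sketch of one way this is commonly wired up, assuming a code that already
calls KSPSetFromOptions: wrap a Krylov method around a full MUMPS factorization
used as the preconditioner:

PC pc;
KSPSetType(ksp, KSPGMRES);                    /* outer Krylov method */
KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);                          /* full factorization as the PC */
PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
KSPSetFromOptions(ksp);

The same effect is available purely from the command line with
-ksp_type gmres -pc_type lu -pc_factor_mat_solver_type mumps. To precondition
with a factorization of a cheaper approximation of A instead, pass that matrix
(Aprec here, an illustrative name) as the second operator in
KSPSetOperators(ksp, A, Aprec).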
#include <cmumps_c.h> // new
#endif
and replacing 'd' with 's' for real double precision:
#if defined(PETSC_USE_REAL_SINGLE)
#include <smumps_c.h>
#else
//#include <dmumps_c.h> // old
#include <smumps_c.h> // new
#endif
Hong
On Fri, Jun 5, 2015 at 6:26 PM, Evan Um <eva...@gmail.com> wrote:
Dear Barry and PETSC
support that.
src/mat/impls/aij/mpi/mumps/mumps.c
Barry
On Oct 22, 2014, at 3:29 PM, Evan Um <eva...@gmail.com> wrote:
Dear PETSC users,
When MUMPS is used inside PETSC, the default MUMPS driver seems to be
double-precision MUMPS (i.e. DMUMPS). To reduce memory costs, I want to
test
Dear PETSC Users,
I tried to use a Cholesky factor (MUMPS results) as a preconditioner for
KSPSolve(). An example code is pasted below. When the code runs, the log
file indicates that Job=3 (i.e. backward/forward substitution) of MUMPS is
called every time inside the loop. Is there any way to
Dear Matt,
Thanks for your quick reply. I meant avoiding MUMPS's internal back/forward
solve (JOB=3). Does KSPSolve() have its own back/forward substitution routines?
Evan
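For what it's worth, the forward/backward substitution (JOB=3) is inherent to
applying a direct factorization, so one JOB=3 call per right-hand side cannot
be skipped; what can be avoided is refactorizing. A sketch of the reuse
pattern, assuming the KSP/PC configured as above:

KSPSetUp(ksp);              /* MUMPS analysis + factorization run once here */
for (PetscInt i = 0; i < nrhs; i++) {
  /* ... assemble the i-th right-hand side into b ... */
  KSPSolve(ksp, b, x);      /* each call costs only a MUMPS solve (JOB=3) */
}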
On Fri, Dec 5, 2014 at 12:20 PM, Matthew Knepley <knep...@gmail.com> wrote:
On Fri, Dec 5, 2014 at 2:11 PM, Evan Um <eva...@gmail.com
2014 11:28:41 -0800
From: Evan Um <eva...@gmail.com>
To: petsc-users petsc-users@mcs.anl.gov
Subject: [petsc-users] Difference between PETSC 3.5.0.0 and 3.5.1.0
Dear PETSC users,
I would like to ask a question about using an external library with
MPI+OpenMP in PETSC. For example, within PETSC, I want to use MUMPS with
MPI+OpenMP. This means that if one node has 12 MPI processes and 24GB,
MUMPS uses 4 MPI processes with 6GB and each MPI process has 3
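A sketch of how that split is usually expressed at launch, assuming the intent
is 12 cores per node shared as 4 MPI ranks times 3 OpenMP threads; MUMPS picks
the thread count up through OpenMP and the threaded BLAS, and the exact flags
depend on the MPI launcher:

export OMP_NUM_THREADS=3   # 3 threads per MPI rank
mpirun -n 4 ./app          # 4 ranks on one 12-core node: 4 x 3 = 12 cores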
...@gmail.com wrote:
On Sat, Nov 15, 2014 at 9:24 PM, Evan Um <eva...@gmail.com> wrote:
Dear PETSC users,
I would like to show you a performance issue that arises when a Cholesky factor
is re-used as a direct solver or preconditioner many times with many
right-hand side vectors. Can anyone suggest a solution to this issue?
In advance, thanks for your help.
Regards,
Evan
Example 1: I used
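One remedy that usually helps in the many-right-hand-side case, sketched under
the assumption that the factored matrix is reachable through the PC and that
the PETSc/MUMPS combination implements MatMatSolve for it: batch the
right-hand sides into a dense matrix and solve them in a single call, so MUMPS
handles all columns at once instead of one KSPSolve per vector (n and nrhs are
placeholders):

Mat F, B, X;  /* factor, dense RHS block, dense solution block */
PCFactorGetMatrix(pc, &F);
MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, nrhs, NULL, &B);
MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X);
/* ... fill the columns of B with the right-hand sides ... */
MatMatSolve(F, B, X);  /* one factored solve for all nrhs columns */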
Dear PETSC users,
I hope I can get some comments about the errors I got during sparse symmetric
matrix construction. In this example, I used three processes. The size of a
test matrix is 52105-by-52105. The length of array d_nnz and o_nnz is 17461
at rank 0, 17111 at rank 1 and 17535 at rank 2. The
() is wrong.
On Thu, Nov 6, 2014 at 12:23 PM, Jed Brown <j...@jedbrown.org> wrote:
Evan Um <eva...@gmail.com> writes:
Dear PETSC users,
I hope that I can have a comment about errors I got during sparse symmetric
matrix construction. In this example, I used three processes. The size of a
test
unstructured matrices?
Evan
On Thu, Nov 6, 2014 at 4:48 PM, Jed Brown <j...@jedbrown.org> wrote:
Evan Um <eva...@gmail.com> writes:
Dear Jed,
Thanks for all your help. These numbers are the total number of
elements to be added using MatSetValues(). For example, at process 0,
148767+5821
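For reference, this is the usual source of such preallocation errors: d_nnz and
o_nnz are per-row counts, not element totals. d_nnz[i] is the number of
nonzeros of local row i falling inside the diagonal block, o_nnz[i] the number
falling outside it, and each array must have exactly as many entries as locally
owned rows. A sketch with placeholder counts:

Mat      A;
PetscInt M = 52105, mlocal = PETSC_DECIDE, i;
PetscInt *d_nnz, *o_nnz;

PetscSplitOwnership(PETSC_COMM_WORLD, &mlocal, &M);
PetscMalloc2(mlocal, &d_nnz, mlocal, &o_nnz);
for (i = 0; i < mlocal; i++) {
  d_nnz[i] = 10;  /* placeholder: nonzeros of row i in the diagonal block */
  o_nnz[i] = 5;   /* placeholder: nonzeros of row i off the diagonal block */
}
MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, mlocal, mlocal, M, M);
MatSetType(A, MATMPIAIJ);
MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);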
PETSC users,
As memory is at a premium in my problem, I want to compute only the upper
triangular elements of a matrix and then construct a parallel sparse
symmetric matrix. In this case, do I have to use MatSetOption
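A sketch of the memory-saving route, assuming the matrix is genuinely
symmetric: the SBAIJ format stores only the upper triangle, and with the option
below any lower-triangular insertions are silently dropped instead of raising
errors:

Mat A;
MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
MatSetType(A, MATMPISBAIJ);  /* symmetric block AIJ: upper triangle only */
MatSetOption(A, MAT_IGNORE_LOWER_TRIANGULAR, PETSC_TRUE);
MatSetUp(A);
/* call MatSetValues only for entries with column >= row */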
Dear PETSC users,
When MUMPS is used inside PETSC, the default MUMPS driver seems to be
double-precision MUMPS (i.e. DMUMPS). To reduce memory costs, I want to
test a single-precision MUMPS (SMUMPS) from PETSC. Does anyone know how to
switch from double to single-precision MUMPS inside PETSC? In
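A guess at the cleaner route, since the precision of PETSc's scalars decides
which MUMPS driver its interface calls: build a separate single-precision
PETSc and let it pull in MUMPS, assuming a PETSc version whose MUMPS wrapper
has the single-precision bindings discussed in this thread:

./configure --with-precision=single --download-mumps --download-scalapack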
Dear users,
Is there any way to control the precision (i.e. decimal places) of matrices
and vectors when they are written into files? Otherwise, do I need to
access the elements of a matrix myself and then write them in my own
format? In advance, thanks for your advice.
Regards,
Evan
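A sketch of the viewer-based route, which avoids writing elements by hand:
push a higher-precision ASCII format (the MATLAB format prints full double
precision), or use the binary viewer for exact values. PetscViewerPushFormat
is the current name; older releases use PetscViewerSetFormat. The filename is
a placeholder:

PetscViewer viewer;
PetscViewerASCIIOpen(PETSC_COMM_WORLD, "A.m", &viewer);
PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB); /* full precision */
MatView(A, viewer);
PetscViewerPopFormat(viewer);
PetscViewerDestroy(&viewer);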
Dear PETSC Users,
I am trying to solve a diffusion problem in the time domain. Its system matrix
is theoretically SPD. I use the KSPCG solver, with the direct solver MUMPS
generating the preconditioner. Thus, a solution is expected to converge within
1-2 iterations. Such convergence is observed in most test problems.
Dear PETSC and MUMPS users,
I am trying to use the iterative refinement option of MUMPS (ICNTL(10) = max
number of iterative refinement steps) in my PETSC application. The MUMPS manual
says that if the solution is kept distributed (ICNTL(21)=1), the iterative
refinement option is disabled. When a problem is solved using
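For the record, the refinement count is reachable through PETSc's generic
MUMPS options without touching the MUMPS struct; the value is a placeholder:

-mat_mumps_icntl_10 5   # allow up to 5 iterative refinement steps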
Dear PETSC users,
I tried to use SCOTCH 5.1.12b in my PETSC codes since MUMPS has
compatibility issues with the latest SCOTCH library.
I was told that the SCOTCH 6.0.0 that comes with PETSC 3.5.0 is automatically
downloaded and installed. Is it still possible to use the old SCOTCH library
5.1.2b in
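If the goal is an older tarball, PETSc's configure accepts an explicit archive
for its --download-<package> options, though it may refuse versions it was not
tested against; the path below is a placeholder:

./configure --download-ptscotch=/path/to/scotch-5.1.12b.tar.gz --download-mumps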
Dear PETSC users,
I am trying to solve a large problem (about 9,000,000 unknowns) with a large
number of processes (about 400 processes and 1TB of memory). I assume that this
is a reasonably large resource for solving this problem because I was able to
solve the same problem using serial MUMPS with 500GB. Of
Dear PETSC users,
I recently downgraded my PETSC library from version 3.5.0 to version 3.4.4.
For my application, most differences between the two were changes in PETSC
function interfaces. For example, I got errors such as too few arguments
in a function call, but some errors were about the new