For large problems, preconditioners have to take advantage of some
underlying mathematical structure of the operator to perform well (that is,
to require few iterations). Simply treating the system as a black box with
generic preconditioners will not be effective.
So, one needs to look at the Liouvillian.
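A minimal petsc4py sketch of the contrast (my illustration, not from the
thread: the 1-D tridiagonal operator below is just a stand-in, and which
structure-aware PC is appropriate depends on the actual Liouvillian):

    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    n = 64
    A = PETSc.Mat().createAIJ([n, n], nnz=3)  # placeholder tridiagonal operator
    for i in range(n):
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    b = A.createVecLeft(); b.set(1.0)
    x = A.createVecRight()

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType("gmres")
    ksp.getPC().setType("jacobi")  # generic, black-box choice
    ksp.setFromOptions()           # allows e.g. -pc_type fieldsplit at run time
    ksp.solve(b, x)
    print("iterations:", ksp.getIterationNumber())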
Dear PETSc devs,
I'm encountering an error running hypre on a single node with multiple GPUs.
The issue is in the setup phase. I'm trying to troubleshoot, but don't know
where to start.
Are the system routines PetscCUDAInitialize and PetscCUDAInitializeCheck
available in Python?
How do I verify
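For the petsc4py side, device setup is usually driven by options passed to
petsc4py.init() rather than by calling those C-level routines directly; a
hedged sketch, assuming a CUDA-enabled PETSc/petsc4py build:

    import sys
    import petsc4py
    # Standard PETSc option names, stored in the options database at init.
    petsc4py.init(sys.argv + ["-vec_type", "cuda", "-mat_type", "aijcusparse"])
    from petsc4py import PETSc

    v = PETSc.Vec().create()
    v.setSizes(8)
    v.setType(PETSc.Vec.Type.CUDA)  # raises if the build lacks CUDA support,
    v.set(1.0)                      # which doubles as a quick sanity check
    print(v.getType())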
Hi Anna,
Since you said "The code works with pc-type hypre on a single GPU.", I
was wondering if this is a problem with how CUDA devices are bound to MPI ranks.
You can search the TACC documentation to find out how its job scheduler binds
GPUs to MPI ranks (usually by manipulating the CUDA_VISIBLE_DEVICES
environment variable).
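One common binding pattern, sketched in Python (the SLURM_LOCALID variable
is an assumption that holds for Slurm-based schedulers such as TACC's;
check the site documentation for the exact variable on your system):

    import os

    # Pin each MPI rank to one GPU before PETSc/CUDA initializes.
    local_rank = int(os.environ.get("SLURM_LOCALID", "0"))
    ngpus_per_node = 4  # placeholder: match the actual GPU count per node
    os.environ["CUDA_VISIBLE_DEVICES"] = str(local_rank % ngpus_per_node)

    # Import and initialize PETSc only after the binding is set.
    import sys
    import petsc4py
    petsc4py.init(sys.argv)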
Hi all,
I've been trying for the last couple of days to solve a linear system
using iterative methods. The system size scales exponentially
(64^N) with the number of components, so I get system sizes of
* (64, 64) for one component
* (4096, 4096) for two components
* (262144, 262144) for three components
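That 64^N growth is what a tensor-product (Kronecker) structure produces,
which is typical for Liouvillians; if that assumption holds here (it is my
assumption, not stated in the thread), the full operator never needs to be
assembled. A sketch of a matrix-free apply for a Kronecker sum, with
placeholder blocks:

    import numpy as np

    d = 64
    A = np.random.rand(d, d)  # placeholder single-component blocks
    B = np.random.rand(d, d)

    def kron_sum_apply(x):
        # y = (A (x) I + I (x) B) x without forming the d^2-by-d^2 matrix,
        # using the row-major identity kron(A, B) @ vec(X) = vec(A @ X @ B.T).
        X = x.reshape(d, d)
        return (A @ X + X @ B.T).ravel()

    x = np.random.rand(d * d)  # the N = 2 case: size 4096
    y = kron_sum_apply(x)      # O(d^3) flops instead of O(d^4) storage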
Iterative solvers have to be designed for your particular operator.
You want to look in your field to see how people solve these problems. (eg,
zeros on the diagonal will need something like a block solver or maybe ILU
with a particular ordering)
I don't personally know anything about this.
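Both suggestions translate directly into PETSc options; a hedged sketch via
petsc4py (the option names are standard PETSc, but whether either variant
helps depends entirely on the operator):

    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    opts = PETSc.Options()
    # Variant 1: a block solver (block Jacobi with ILU on each block).
    opts["pc_type"] = "bjacobi"
    opts["sub_pc_type"] = "ilu"
    # Variant 2 (alternative): ILU with a reordering, to avoid zero pivots.
    # opts["pc_type"] = "ilu"
    # opts["pc_factor_mat_ordering_type"] = "rcm"

    ksp = PETSc.KSP().create()
    ksp.setFromOptions()  # picks up the options set above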
On Wed, Jan 31, 2024 at 8:21 AM Mark Adams wrote:
> Iterative solvers have to be designed for your particular operator.
> You want to look in your field to see how people solve these problems.
> (eg, zeros on the diagonal will need something like a block solver or maybe
> ILU with a particular ordering)
> On 31 Jan 2024, at 11:31 AM, Alain O' Miniussi wrote:
>
> Hi,
>
> It is indicated in:
> https://petsc.org/release/manualpages/Sys/PetscInitialize/
> that the init function will call MPI_Init.
>
> What if MPI_Init was already called (as is the case in my application) and
> what about MPI_Init_thread?
On Wed, Jan 31, 2024 at 10:10 AM Alain O' Miniussi wrote:
> Hi,
>
> It is indicated in:
> https://petsc.org/release/manualpages/Sys/PetscInitialize/
> that the init function will call MPI_Init.
>
> What if MPI_Init was already called (as is the case in my application)
From the page: "
Hi,
It is indicated in:
https://petsc.org/release/manualpages/Sys/PetscInitialize/
that the init function will call MPI_Init.
What if MPI_Init was already called (as is the case in my application) and
what about MPI_Init_thread?
Wouldn't it be more convenient to have a Petsc init function
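To my understanding, PetscInitialize() checks whether MPI is already up and
skips MPI_Init() in that case; the same holds from Python, where importing
mpi4py first initializes MPI before petsc4py does. A sketch of that
ordering (a known mpi4py + petsc4py pattern, analogous to the C case):

    from mpi4py import MPI           # calls MPI_Init_thread under the hood
    assert MPI.Is_initialized()

    import sys
    import petsc4py
    petsc4py.init(sys.argv)          # reuses the already-initialized MPI
    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    print("rank", comm.getRank(), "of", comm.getSize())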