Re: [petsc-users] petsc4py - Spike in memory usage when loading a matrix in parallel

2021-10-08 Thread Michael Werner
esses have the same peak memory usage. If it were only process 0 then it wouldn't matter, because with enough processes the overhead would be negligible. Best regards, Michael On 07.10.21 18:32, Matthew Knepley wrote: > On Thu, Oct 7, 2021 at 11:59 AM Michael Werner ...

Re: [petsc-users] petsc4py - Spike in memory usage when loading a matrix in parallel

2021-10-07 Thread Michael Werner
(with 4 processes) each process shows a peak memory usage of 10.8 GB Best regards, Michael On 07.10.21 17:55, Barry Smith wrote: > > >> On Oct 7, 2021, at 11:35 AM, Michael Werner <michael.wer...@dlr.de> wrote: >> >> Currently I'm using psutil to query e
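The numbers above come from querying the operating system per process. The thread does not show the exact psutil call, so the following is only an assumed sketch of the typical pattern: each MPI rank reports its own resident set size (RSS).

```python
# Minimal sketch (assumed pattern): report the resident set size (RSS) of the
# calling process, e.g. one MPI rank, in gigabytes using psutil.
import os
import psutil

proc = psutil.Process(os.getpid())
rss_gb = proc.memory_info().rss / 1024**3
print(f"current RSS of this rank: {rss_gb:.2f} GB")
```

Note that RSS only reflects what the OS currently accounts to the process, which is why the question in the next message, about memory being freed but not yet returned to the OS, matters for interpreting such a peak.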

Re: [petsc-users] petsc4py - Spike in memory usage when loading a matrix in parallel

2021-10-07 Thread Michael Werner
internal MPI buffers might explain some blip. > > > Is it possible that we free the memory, but the OS has just not given > back that memory for use yet? How are you measuring memory usage? > > Thanks, > > Matt > > Barry > > > > On Oct 7, 2021,

[petsc-users] petsc4py - Spike in memory usage when loading a matrix in parallel

2021-10-07 Thread Michael Werner
the matrix and to explicitly preallocate the necessary NNZ (with A.setSizes(dim) and A.setPreallocationNNZ(nnz), respectively) before loading, but that didn't help. As mentioned above, I'm using petsc4py together with PETSc-3.16 on a Linux workstation. Best regards, Mich
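For reference, a minimal petsc4py sketch of the loading pattern described above, with sizes and preallocation set explicitly before the load; the file name, dimension and NNZ count are placeholders, not values from the thread.

```python
# Minimal sketch of loading a binary matrix with explicit sizes/preallocation.
# 'matrix.dat', n and nnz are placeholders for the actual problem data.
from petsc4py import PETSc

n, nnz = 1000, 50                                  # placeholder global size / nonzeros per row
viewer = PETSc.Viewer().createBinary('matrix.dat', 'r', comm=PETSc.COMM_WORLD)

A = PETSc.Mat().create(comm=PETSc.COMM_WORLD)
A.setType(PETSc.Mat.Type.MPIAIJ)
A.setSizes([n, n])              # global sizes known beforehand
A.setPreallocationNNZ(nnz)      # explicit preallocation before the load
A.load(viewer)                  # collective read of the binary file
viewer.destroy()
```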

Re: [petsc-users] SLEPc - st_type cayley choice of shift and antishift

2019-09-27 Thread Michael Werner via petsc-users
manual): > > (A-sigma*B)^{-1}*(A+nu*B)x = \theta x > > So nu=-sigma is a forbidden value, otherwise both factors cancel out (I will > fix the interface so that this is caught). > > In your case you should do -eps_target -1 -st_cayley_antishift -1 > > Jose
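A minimal slepc4py sketch of the suggested Cayley setup (target -1, antishift -1). The small diagonal test matrices are placeholders; only the option names and values come from the reply above.

```python
# Minimal sketch: Cayley spectral transform with target -1 and antishift -1.
# The diagonal matrices A and B below are placeholders, not the thread's problem.
from petsc4py import PETSc
from slepc4py import SLEPc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=1)
B = PETSc.Mat().createAIJ([n, n], nnz=1)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A[i, i] = -1.5 - 0.01 * i      # placeholder spectrum, kept away from the shift
    B[i, i] = 1.0
A.assemble(); B.assemble()

opts = PETSc.Options()
opts['st_type'] = 'cayley'         # mirrors -st_type cayley
opts['st_cayley_antishift'] = -1   # mirrors -st_cayley_antishift -1

eps = SLEPc.EPS().create()
eps.setOperators(A, B)
eps.setProblemType(SLEPc.EPS.ProblemType.GNHEP)
eps.setTarget(-1.0)                # mirrors -eps_target -1
eps.setWhichEigenpairs(SLEPc.EPS.Which.TARGET_MAGNITUDE)
eps.setFromOptions()
eps.solve()
```

With these values the Cayley factors (A - sigma*B) and (A + nu*B) do not cancel, consistent with the advice that nu = -sigma is forbidden.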

Re: [petsc-users] SLEPc - st_type cayley choice of shift and antishift

2019-09-27 Thread Michael Werner via petsc-users
t work with target -1? > Can you send me the matrices so that I can reproduce the issue? > > Jose > > >> On 27 Sept 2019, at 13:11, Michael Werner >> wrote: >> >> Thank you for the link to the paper, it's quite interesting and pretty >> close to wh

Re: [petsc-users] SLEPc - st_type cayley choice of shift and antishift

2019-09-27 Thread Michael Werner via petsc-users
o, it doesn't matter if I'm using exact or inexact solves. Changing the values of shift and antishift also doesn't change the behaviour. Do I need to make additional adjustments to get cayley to work? Best regards, Michael On 25.09.19 at 17:21, Jose E. Roman wrote: > >> On 25 Sept 2019

[petsc-users] SLEPc - st_type cayley choice of shift and antishift

2019-09-25 Thread Michael Werner via petsc-users
-eps_target_real With sinvert, it is easy to understand how to choose the target, but for Cayley I'm not sure how to set shift and antishift. What is the mathematical meaning of the antishift? Best regards, Michael Werner

Re: [petsc-users] Usage of AMG as preconditioner

2018-09-28 Thread Michael Werner
Jed Brown writes: Michael Werner writes: >> > It uses unpreconditioned GMRES to estimate spectral >> > bounds for >> > the operator before using a Chebyshev smoother. It is GMRES preconditioned by the diagonal. Moreover, in the incompressible limit, the co
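For context, the eigenvalue-estimation behaviour described above can be steered from the options database. The option names below are the standard PETSc ones for the multigrid level smoothers, shown as an assumed illustration rather than options taken from the thread.

```python
# Illustrative (assumed) smoother options for GAMG levels: Chebyshev smoothing
# with a Jacobi (diagonal) preconditioner, and GMRES for the spectral-bound estimate.
from petsc4py import PETSc

opts = PETSc.Options()
opts['mg_levels_ksp_type'] = 'chebyshev'      # Chebyshev smoother on each level
opts['mg_levels_pc_type'] = 'jacobi'          # diagonal preconditioning of the smoother
opts['mg_levels_esteig_ksp_type'] = 'gmres'   # KSP used to estimate the spectral bounds
opts['mg_levels_esteig_ksp_max_it'] = 10      # limit the estimation iterations
```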

Re: [petsc-users] Usage of AMG as preconditioner

2018-09-28 Thread Michael Werner
Matthew Knepley writes: On Fri, Sep 28, 2018 at 8:13 AM Michael Werner wrote: Matthew Knepley writes: > On Fri, Sep 28, 2018 at 7:43 AM Michael Werner > > wrote: > >> >> Matthew Knepley writes: >> >> > On Fri, Sep 28, 2018 at 3:23 AM Michael We

Re: [petsc-users] Usage of AMG as preconditioner

2018-09-28 Thread Michael Werner
Matthew Knepley writes: On Fri, Sep 28, 2018 at 7:43 AM Michael Werner wrote: Matthew Knepley writes: > On Fri, Sep 28, 2018 at 3:23 AM Michael Werner > > wrote: > >> Hello, >> >> I'm having trouble with getting the AMG preconditioners >> worki

Re: [petsc-users] Usage of AMG as preconditioner

2018-09-28 Thread Michael Werner
Matthew Knepley writes: On Fri, Sep 28, 2018 at 3:23 AM Michael Werner wrote: Hello, I'm having trouble with getting the AMG preconditioners working. I tried all of them (gamg, ml, hypre-boomeramg), with varying degrees of "success": - GAMG: CMD options: -ksp

[petsc-users] Usage of AMG as preconditioner

2018-09-28 Thread Michael Werner
Hello, I'm having trouble with getting the AMG preconditioners working. I tried all of them (gamg, ml, hypre-boomeramg), with varying degrees of "success": - GAMG: CMD options: -ksp_rtol 1e-8 -ksp_monitor_true_residual -ksp_max_it 20 -ksp_type fgmres -pc_type gamg -pc_gamg_sym_graph TRUE
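A minimal petsc4py sketch reproducing the FGMRES+GAMG combination and options listed above; the 1D Laplacian-like test matrix is a placeholder for the actual CFD Jacobian.

```python
# Sketch: FGMRES outer solver with GAMG preconditioning, mirroring the CMD options
# quoted above. The tridiagonal test matrix is only a placeholder.
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < n - 1:
        A[i, i + 1] = -1.0
A.assemble()

b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

opts = PETSc.Options()
opts['pc_gamg_sym_graph'] = 'true'                  # -pc_gamg_sym_graph TRUE, as in the thread

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType(PETSc.KSP.Type.FGMRES)                  # -ksp_type fgmres
ksp.setTolerances(rtol=1e-8, max_it=20)             # -ksp_rtol 1e-8 -ksp_max_it 20
ksp.getPC().setType(PETSc.PC.Type.GAMG)             # -pc_type gamg
ksp.setFromOptions()                                # picks up -pc_gamg_sym_graph etc.
ksp.solve(b, x)
```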

Re: [petsc-users] SLEPc - Davidson-type solvers options

2018-08-07 Thread Michael Werner
convergence for the gd solver? As far as I know it doesn't use a ksp, so the only way I can think of to improve convergence would be using a higher-quality preconditioner, right? Kind regards, Michael Jose E. Roman writes: On 6 Aug 2018, at 14:44, Michael Werner wrote: Michael Werner writes

Re: [petsc-users] SLEPc - Davidson-type solvers options

2018-08-06 Thread Michael Werner
Michael Werner writes: Hello, I want to use a Davidson-type solver (probably jd) to find the eigenvalues with the smallest real part, but so far I'm struggling to get them to converge. So I was hoping to get some advice on the various options available for those solvers. For my test

[petsc-users] SLEPc - Davidson-type solvers options

2018-08-06 Thread Michael Werner
xplanation for -- Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR) Institut für Aerodynamik und Strömungstechnik | Bunsenstr. 10 | 37073 Göttingen Michael Werner Telefon 0551 709-2627 | Telefax 0551 709-2811 | michael.wer...@dlr.de DLR.de

Re: [petsc-users] slepc4py - Using shell matrix and explicit preconditioning matrix to solve EPS

2018-08-01 Thread Michael Werner
are problem-dependent. Also, you can try GD instead of JD, which is simpler and often gives better performance. See a detailed explanation here: https://doi.org/10.1145/2543696 Jose On 1 Aug 2018, at 10:43, Michael Werner wrote: Thanks for the quick reply, your suggestion worked perfect

Re: [petsc-users] slepc4py - Using shell matrix and explicit preconditioning matrix to solve EPS

2018-08-01 Thread Michael Werner
onditioner matrix should be exactly the same as A-sigma*B, otherwise you may get unexpected results. Davidson-type methods allow using a different preconditioner. Jose On 1 Aug 2018, at 10:11, Michael Werner wrote: Hello, I'm trying to find the smallest eigenvalues of a linear syste

[petsc-users] slepc4py - Using shell matrix and explicit preconditioning matrix to solve EPS

2018-08-01 Thread Michael Werner
Hello, I'm trying to find the smallest eigenvalues of a linear system created by CFD simulations. To reduce memory requirements, I want to use a shell matrix (A_Shell) to provide the matrix-vector product, and a lower-order explicit matrix (P) as preconditioner. As I'm solving a generalized
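A slepc4py sketch of the approach described above, simplified to a standard (non-generalized) problem for brevity: a Python shell matrix stands in for A_Shell and a lower-order explicit matrix P supplies the preconditioner. The diagonal action replaces the CFD product, and st.setPreconditionerMat is an assumption (present in recent SLEPc/slepc4py releases; older versions need a different route to pass P).

```python
# Simplified sketch (standard problem instead of the generalized one in the thread):
# shell matrix for the operator, explicit matrix P only for preconditioning.
from petsc4py import PETSc
from slepc4py import SLEPc

n = 100

class ShellCtx:
    """Placeholder for the CFD-based matrix-vector product behind A_Shell."""
    def mult(self, mat, x, y):
        x.copy(y)
        y.scale(-1.0)          # stand-in action; the real code calls the CFD solver here

A_shell = PETSc.Mat().createPython([n, n], context=ShellCtx(), comm=PETSc.COMM_WORLD)
A_shell.setUp()

# lower-order explicit matrix used only to build the preconditioner
P = PETSc.Mat().createAIJ([n, n], nnz=1)
rs, re = P.getOwnershipRange()
for i in range(rs, re):
    P[i, i] = -1.0
P.assemble()

eps = SLEPc.EPS().create()
eps.setOperators(A_shell)
eps.setProblemType(SLEPc.EPS.ProblemType.NHEP)
eps.setType(SLEPc.EPS.Type.GD)                        # GD, as suggested in the reply
eps.setWhichEigenpairs(SLEPc.EPS.Which.SMALLEST_REAL)
st = eps.getST()
st.setPreconditionerMat(P)   # assumption: available in recent SLEPc/slepc4py versions
eps.setFromOptions()
eps.solve()
```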

Re: [petsc-users] Parallelizing a matrix-free code

2017-10-18 Thread Michael Werner
necessary. So now it's possible to simply gather the correct values by their global ID, pass them to the external code and then scatter the result back to the parallel vector. Now my code is working as intended. Thanks for your help! Kind regards, Michael Werner On 18.10.2017 at 12:01,
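A petsc4py sketch of the gather/scatter round trip described above; for brevity it gathers the whole vector on every rank with Scatter.toAll rather than selecting individual entries by global ID, and the call to the external code is indicated only by a placeholder copy.

```python
# Sketch: gather the distributed vector, hand it to the external code (placeholder),
# and scatter the result back into the parallel vector.
from petsc4py import PETSc

n = 100
v = PETSc.Vec().createMPI(n)
v.set(1.0)

scatter, v_full = PETSc.Scatter.toAll(v)                     # sequential copy on every rank
scatter.scatter(v, v_full, mode=PETSc.ScatterMode.FORWARD)   # gather current values

# ... pass v_full.getArray() to the external code and fill result_full with its output ...
result_full = v_full.duplicate()
v_full.copy(result_full)                                     # placeholder for the external result

scatter.scatter(result_full, v, mode=PETSc.ScatterMode.REVERSE)  # back to the parallel layout
```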

Re: [petsc-users] Parallelizing a matrix-free code

2017-10-18 Thread Michael Werner
. But this would create a lot of communication between the different processes and seems quite clunky. Is there a more elegant way? Is it maybe possible to manually assign the size of the PETSc subdomains? Kind regards, Michael Werner On 17.10.2017 at 12:31, Matthew Knepley wrote: On Tue, Oct 17

Re: [petsc-users] Parallelizing a matrix-free code

2017-10-17 Thread Michael Werner
several competing instances of the computation on the whole domain. But maybe that's only because I haven't completely understood how MPI really works in such cases... Kind regards, Michael On 17.10.2017 at 11:50, Matthew Knepley wrote: On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner

Re: [petsc-users] Parallelizing a matrix-free code

2017-10-17 Thread Michael Werner
I'm not sure what you mean by this question? The external CFD code, if that was what you referred to, can be run in parallel. On 17.10.2017 at 11:11, Matthew Knepley wrote: On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner <michael.wer...@dlr.de>

Re: [petsc-users] Parallelizing a matrix-free code

2017-10-17 Thread Michael Werner
Zampini <stefano.zamp...@gmail.com> wrote: 2017-10-16 10:26 GMT+03:00 Michael Werner <michael.wer...@dlr.de>: Hello, I'm having trouble with parallelizing a matrix-free code with

[petsc-users] Parallelizing a matrix-free code

2017-10-16 Thread Michael Werner
Hello, I'm having trouble with parallelizing a matrix-free code with PETSc. In this code, I use an external CFD code to provide the matrix-vector product for an iterative solver in PETSc. To increase convergence rate, I'm using an explicitly stored Jacobian matrix to precondition the solver.
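A minimal petsc4py sketch of the configuration described above: a Python shell matrix whose mult would call the external CFD code (a diagonal placeholder here), and an explicitly stored approximate Jacobian P passed as the preconditioning matrix.

```python
# Sketch: shell matrix for the operator (external CFD matvec), explicit matrix P
# for the preconditioner. The diagonal action and P below are placeholders.
from petsc4py import PETSc

n = 100

class CFDMatVec:
    """Placeholder for the external CFD code's matrix-vector product."""
    def mult(self, mat, x, y):
        x.copy(y)
        y.scale(2.0)

A = PETSc.Mat().createPython([n, n], context=CFDMatVec(), comm=PETSc.COMM_WORLD)
A.setUp()

P = PETSc.Mat().createAIJ([n, n], nnz=1)    # explicitly stored approximate Jacobian
rstart, rend = P.getOwnershipRange()
for i in range(rstart, rend):
    P[i, i] = 2.0
P.assemble()

b = PETSc.Vec().createMPI(n); b.set(1.0)
x = b.duplicate()

ksp = PETSc.KSP().create()
ksp.setOperators(A, P)                      # A provides the matvec, P builds the PC
ksp.setType(PETSc.KSP.Type.GMRES)
ksp.getPC().setType(PETSc.PC.Type.BJACOBI)  # assumed PC choice; any PC built from P works
ksp.setFromOptions()
ksp.solve(b, x)
```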