Re: [petsc-users] Combine PETSc with CVode example

2018-11-08 Thread Smith, Barry F. via petsc-users
> On Nov 8, 2018, at 9:14 PM, Jed Brown via petsc-users wrote: > All PETSc TS examples can be run with -ts_type sundials. Note that in general PETSc uses its integrators across the entire grid, not a separate integrator at each point, though that is possible. You need to create a

Re: [petsc-users] Combine PETSc with CVode example

2018-11-08 Thread Jed Brown via petsc-users
All PETSc TS examples can be run with -ts_type sundials. Aroli Marcellinus via petsc-users writes: > Hi, > Is there any simple example of using CVode in PETSc properly? Like solving an ODE at each node of some 3D mesh? > Thank you. > Aroli Marcellinus > Kumoh Institute of
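A minimal sketch of what that looks like, assuming PETSc was configured with SUNDIALS (e.g. --download-sundials); the toy ODE y' = -y and everything except the PETSc calls are illustrative:

    #include <petscts.h>

    /* right-hand side f(t,y) = -y of the toy ODE y' = -y */
    static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec y, Vec f, void *ctx)
    {
      VecCopy(y, f);
      VecScale(f, -1.0);
      return 0;
    }

    int main(int argc, char **argv)
    {
      TS  ts;
      Vec y;

      PetscInitialize(&argc, &argv, NULL, NULL);
      VecCreate(PETSC_COMM_WORLD, &y);
      VecSetSizes(y, PETSC_DECIDE, 1);
      VecSetFromOptions(y);
      VecSet(y, 1.0);                                  /* y(0) = 1 */
      TSCreate(PETSC_COMM_WORLD, &ts);
      TSSetRHSFunction(ts, NULL, RHSFunction, NULL);
      TSSetType(ts, TSSUNDIALS);                       /* CVode; same as -ts_type sundials */
      TSSetTimeStep(ts, 0.01);
      TSSetMaxTime(ts, 1.0);
      TSSetExactFinalTime(ts, TS_EXACTFINALTIME_STEPOVER);
      TSSetFromOptions(ts);                            /* lets -ts_type override at run time */
      TSSolve(ts, y);
      VecDestroy(&y);
      TSDestroy(&ts);
      PetscFinalize();
      return 0;
    }

The same binary can then be switched between integrators purely from the command line, e.g. -ts_type sundials versus -ts_type rk.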

Re: [petsc-users] Vec, Mat and binary files.

2018-11-08 Thread Jed Brown via petsc-users
Sal Am writes: > Yes, I was just hoping there'd be more than that. > I have tried using one of them as a basis: > import PetscBinaryIO > import numpy as np > import scipy.sparse > b_vector = np.array(np.fromfile('Vector_b.bin', dtype=np.dtype((np.float64,2 > A_matrix =
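For reference, a minimal sketch of the writing side, i.e. how a Vec and Mat end up in the PETSc binary format that PetscBinaryIO.py (shipped in $PETSC_DIR/lib/petsc/bin) reads back; the file name and the values are illustrative:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Vec         b;
      Mat         A;
      PetscViewer viewer;
      PetscInt    i, n = 4;

      PetscInitialize(&argc, &argv, NULL, NULL);
      VecCreate(PETSC_COMM_WORLD, &b);
      VecSetSizes(b, PETSC_DECIDE, n);
      VecSetFromOptions(b);
      VecSet(b, 1.0);
      MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n,
                   1, NULL, 1, NULL, &A);
      for (i = 0; i < n; i++) MatSetValue(A, i, i, 2.0, INSERT_VALUES);
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
      /* one file holding the Mat followed by the Vec */
      PetscViewerBinaryOpen(PETSC_COMM_WORLD, "system.bin", FILE_MODE_WRITE, &viewer);
      MatView(A, viewer);
      VecView(b, viewer);
      PetscViewerDestroy(&viewer);
      VecDestroy(&b);
      MatDestroy(&A);
      PetscFinalize();
      return 0;
    }

On the Python side, the reader class in that module (PetscBinaryIO.PetscBinaryIO().readBinaryFile('system.bin'), to the best of my reading of the module) decodes the header for you, which avoids hand-parsing with raw np.fromfile as in the quoted attempt.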

Re: [petsc-users] How to fix max linear steps in SNES

2018-11-08 Thread Smith, Barry F. via petsc-users
> On Nov 8, 2018, at 9:06 AM, Yingjie Wu via petsc-users wrote: > Dear PETSc developers: > Hi, I recently debugged my program, which solves a two-dimensional nonlinear PDE problem with SNES. I find that the residual in KSP drops slowly. I want to fix the number of steps in

Re: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType

2018-11-08 Thread Mark Adams via petsc-users
I am not that familiar with hypre's options. AMG is complicated and I barely keep my own options straight. OpenFOAM seems to have highly specialized solvers, so being within 50% of them is decent. On Thu, Nov 8, 2018 at 12:03 PM Edoardo alinovi wrote: > Yes, it is like you are saying. This is mostly

Re: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType

2018-11-08 Thread Mark Adams via petsc-users
To repeat: You seem to be saying that OpenFOAM solves the problem in 10 seconds and PETSc solves it in 14 seconds. Is that correct? On Thu, Nov 8, 2018 at 3:42 AM Edoardo alinovi via petsc-users <petsc-users@mcs.anl.gov> wrote: > Hello Mark, > Yes, there are 5 KSP calls within a time-step

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-08 Thread Mark Adams via petsc-users
> I did not configure PETSc with ParMetis support. Should I? > I figured it out when I tried to use "-pc_gamg_repartition". PETSc complained that it was not compiled with ParMetis support. You need ParMetis, or some other parallel mesh partitioner, configured in order to use repartitioning. I would
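A minimal sketch of enabling this programmatically, assuming a PETSc build configured with --download-metis --download-parmetis; the helper name is illustrative:

    #include <petscksp.h>

    /* turn on repartitioning of GAMG's coarse grids; equivalent to the
       command-line option -pc_gamg_repartition */
    static PetscErrorCode EnableGAMGRepartition(KSP ksp)
    {
      PC             pc;
      PetscErrorCode ierr;

      ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
      ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);
      ierr = PCGAMGSetRepartition(pc, PETSC_TRUE); CHKERRQ(ierr);
      return 0;
    }

Repartitioning adds setup cost but can markedly improve coarse-grid load balance at scale, which is the trade-off being discussed in this thread.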

[petsc-users] How to fix max linear steps in SNES

2018-11-08 Thread Yingjie Wu via petsc-users
Dear PETSc developers: Hi, I recently debugged my program, which solves a two-dimensional nonlinear PDE problem with SNES. I find that the residual in KSP drops slowly. I want to fix the number of linear (KSP) iterations, because I cannot choose a suitable ksp_rtol. I use the command:
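A minimal sketch of capping the inner KSP iteration count instead of relying on ksp_rtol; the function name is illustrative:

    #include <petscsnes.h>

    /* fix the maximum number of linear iterations per Newton step, leaving
       rtol/atol/dtol at their defaults; the same effect comes from the
       command-line option -ksp_max_it <maxits> */
    static PetscErrorCode CapLinearIterations(SNES snes, PetscInt maxits)
    {
      KSP            ksp;
      PetscErrorCode ierr;

      ierr = SNESGetKSP(snes, &ksp); CHKERRQ(ierr);
      ierr = KSPSetTolerances(ksp, PETSC_DEFAULT, PETSC_DEFAULT,
                              PETSC_DEFAULT, maxits); CHKERRQ(ierr);
      return 0;
    }

Adding -ksp_converged_reason (or -ksp_monitor) on the command line shows whether each linear solve stopped on the iteration cap or on a tolerance.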

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-08 Thread Alberto F. Martín
On 08/11/18 13:01, Matthew Knepley wrote: On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Martín" via petsc-users <petsc-users@mcs.anl.gov> wrote: Dear Mark, thanks for your quick and comprehensive reply. Before moving to the results of the experiments that you suggested,

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-08 Thread Matthew Knepley via petsc-users
On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Martín" via petsc-users <petsc-users@mcs.anl.gov> wrote: > Dear Mark, > thanks for your quick and comprehensive reply. > Before moving to the results of the experiments that you suggested, let me clarify two points about my original e-mail and your

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-08 Thread Alberto F. Martín
Dear Mark, thanks for your quick and comprehensive reply. Before moving to the results of the experiments that you suggested, let me clarify two points about my original e-mail and your answer: (1) The raw timings and #iters. provided in my first e-mail were actually obtained with

Re: [petsc-users] need help with vector interpolation on nonuniform DMDA grids

2018-11-08 Thread Francesco Magaletti via petsc-users
Dear Matt & Dave, I really appreciate your support! I was not aware of the existence of the DMSwarm object, and it has been a really exciting discovery, since in our research group we have people working with PIC-type methods, and I suppose they will be grateful for your work, Dave!
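For anyone finding this thread later, a minimal sketch of creating a PIC-style DMSwarm bound to a background cell DM; the field name and particle counts are illustrative:

    #include <petscdmswarm.h>

    /* create a particle swarm in PIC mode attached to an existing cell DM */
    static PetscErrorCode CreatePICSwarm(DM celldm, DM *swarm)
    {
      PetscInt       dim;
      PetscErrorCode ierr;

      ierr = DMCreate(PetscObjectComm((PetscObject)celldm), swarm); CHKERRQ(ierr);
      ierr = DMSetType(*swarm, DMSWARM); CHKERRQ(ierr);
      ierr = DMGetDimension(celldm, &dim); CHKERRQ(ierr);
      ierr = DMSetDimension(*swarm, dim); CHKERRQ(ierr);
      ierr = DMSwarmSetType(*swarm, DMSWARM_PIC); CHKERRQ(ierr);
      ierr = DMSwarmSetCellDM(*swarm, celldm); CHKERRQ(ierr);
      /* register a per-particle field, then lock the field layout */
      ierr = DMSwarmRegisterPetscDatatypeField(*swarm, "mass", 1, PETSC_REAL); CHKERRQ(ierr);
      ierr = DMSwarmFinalizeFieldRegister(*swarm); CHKERRQ(ierr);
      /* 100 local particles, buffer of 10 for migration */
      ierr = DMSwarmSetLocalSizes(*swarm, 100, 10); CHKERRQ(ierr);
      return 0;
    }

In PIC mode the swarm carries its own coordinate field and can migrate particles between ranks as they move through the cell DM.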

Re: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType

2018-11-08 Thread Edoardo alinovi via petsc-users
Hello Mark, Yes, there are 5 KSP calls within a time-step (3 for the solution of the momentum equation + 2 for the solution of pressure); this is the classical non-iterative PISO by Issa (the exact sequence of operations is: solve momentum implicitly, solve pressure-correction, momentum explicitly,
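Spelled out, one PISO step as described could be sketched like this; every function name is a hypothetical placeholder, not from the poster's code, and the second corrector pass is inferred from the stated count of 5 KSP calls:

    #include <petscksp.h>

    /* hypothetical stubs standing in for the actual solver calls */
    extern PetscErrorCode SolveMomentumImplicit(void);   /* 3 KSP solves: u, v, w */
    extern PetscErrorCode SolvePressureCorrection(void); /* 1 KSP solve */
    extern PetscErrorCode CorrectFieldsExplicit(void);   /* explicit update, no solve */

    /* one non-iterative PISO time step: 5 KSP solves in total */
    PetscErrorCode PISOTimeStep(void)
    {
      PetscErrorCode ierr;

      ierr = SolveMomentumImplicit();   CHKERRQ(ierr); /* solves 1-3 */
      ierr = SolvePressureCorrection(); CHKERRQ(ierr); /* solve 4 */
      ierr = CorrectFieldsExplicit();   CHKERRQ(ierr);
      ierr = SolvePressureCorrection(); CHKERRQ(ierr); /* solve 5 */
      ierr = CorrectFieldsExplicit();   CHKERRQ(ierr);
      return 0;
    }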