> On Nov 8, 2018, at 9:14 PM, Jed Brown via petsc-users
> wrote:
>
> All PETSc TS examples can be run with -ts_type sundials.
Note that in general PETSc uses its integrators across the entire grid, not
a separate integrator at each point, though that is possible. You need to
create a
All PETSc TS examples can be run with -ts_type sundials.
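The switch is purely a runtime option; a sketch of running one of the compiled TS tutorials this way (the executable name here is illustrative, and the PETSc build must have been configured with SUNDIALS support):

```shell
# Run a compiled TS tutorial with the SUNDIALS (CVODE) integrator
./ex1 -ts_type sundials -ts_monitor
```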
Aroli Marcellinus via petsc-users writes:
> Hi,
>
>
> Is there any simple example of using CVode in PETSc properly? For
> example, solving an ODE at each node of some 3D mesh?
>
> Thank you.
>
> Aroli Marcellinus
>
> Kumoh Institute of
Sal Am writes:
> Yes I was just hoping there'd be more than that.
>
> I have tried using one of them as basis:
>
> import PetscBinaryIO
> import numpy as np
> import scipy.sparse
>
> b_vector = np.array(np.fromfile('Vector_b.bin',
> dtype=np.dtype((np.float64, 2))))
> A_matrix =
>
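A plain `np.fromfile` with a native-endian dtype will misread a PETSc binary file, because PETSc writes big-endian data behind a small header. A minimal sketch of the Vec layout, round-tripping a demo file (the filename is made up; the classid 1211214 is PETSc's Vec file classid, and for real files the `PetscBinaryIO` module shipped with PETSc is the safer route):

```python
import numpy as np

# PETSc binary Vec layout: int32 classid (1211214), int32 length,
# then the values as float64 -- all big-endian.
VEC_CLASSID = 1211214

def write_petsc_vec(path, values):
    """Write a numpy array in PETSc's binary Vec format (for the demo)."""
    values = np.asarray(values, dtype=np.float64)
    header = np.array([VEC_CLASSID, values.size], dtype='>i4')
    with open(path, 'wb') as f:
        header.tofile(f)
        values.astype('>f8').tofile(f)

def read_petsc_vec(path):
    """Read a PETSc binary Vec; note the big-endian dtypes."""
    with open(path, 'rb') as f:
        classid, n = np.fromfile(f, dtype='>i4', count=2)
        assert classid == VEC_CLASSID, "not a PETSc Vec file"
        return np.fromfile(f, dtype='>f8', count=n)

# Round-trip check on a demo file
write_petsc_vec('Vector_b_demo.bin', [1.0, 2.5, -3.0])
b = read_petsc_vec('Vector_b_demo.bin')
assert b.tolist() == [1.0, 2.5, -3.0]
```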
> On Nov 8, 2018, at 9:06 AM, Yingjie Wu via petsc-users
> wrote:
>
> Dear Petsc developer:
> Hi,
> I recently debugged my program, which is a two-dimensional nonlinear PDE
> problem solved by SNES. I find that the residual drop in KSP is slow. I
> want to fix the number of steps in
I am not that familiar with hypre's options. AMG is complicated and I
barely keep my options straight.
OpenFOAM seems to have highly specialized solvers, so being within 50% of
them is decent.
On Thu, Nov 8, 2018 at 12:03 PM Edoardo alinovi
wrote:
> Yes, it is like you are saying. This is mostly
To repeat:
You seem to be saying that OpenFOAM solves the problem in 10 seconds and
PETSc solves it in 14 seconds. Is that correct?
On Thu, Nov 8, 2018 at 3:42 AM Edoardo alinovi via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hello Mark,
>
> Yes, there are 5 KSP calls within a time-step
>
>
> I did not configure PETSc with ParMetis support. Should I?
>
> I figured it out when I tried to use "-pc_gamg_repartition". PETSc
> complained that it was not compiled with ParMetis support.
>
You need ParMetis, or some other parallel mesh partitioner, configured in
order to use repartitioning. I would
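For reference, PETSc can download and build METIS/ParMetis itself at configure time; a typical invocation (the `--download-*` flags are standard configure options, but the exact set depends on the install, and the application name below is hypothetical):

```shell
# Rebuild PETSc with a parallel partitioner, then re-run with repartitioning
./configure --download-metis --download-parmetis
make all
mpiexec -n 4 ./app -pc_gamg_repartition
```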
Dear Petsc developer:
Hi,
I recently debugged my program, which is a two-dimensional nonlinear PDE
problem solved by SNES. I find that the residual drop in KSP is slow.
I want to fix the number of steps in the linear solve, because I cannot
choose a suitable ksp_rtol. I use the command:
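Capping the inner (linear) iterations inside an outer Newton loop, which is what fixing the KSP iteration count amounts to, can be sketched in plain numpy/scipy. This is a toy 1-D nonlinear system, not the poster's PDE, and the iteration caps are arbitrary illustrative choices:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Toy nonlinear system F(u) = A u + u^3 - f = 0 on n points,
# with A the standard 1-D Laplacian stencil (SPD).
n = 50
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()
u_exact = np.sin(np.linspace(0.0, np.pi, n))
f = A @ u_exact + u_exact**3   # manufactured right-hand side

def F(u):
    return A @ u + u**3 - f

u = np.zeros(n)
for it in range(20):
    r = F(u)
    if np.linalg.norm(r) < 1e-10:
        break
    # Jacobian of F at u: J = A + 3 diag(u^2)
    J = A + diags(3.0 * u**2)
    # Inexact inner solve: cap CG at a fixed iteration count,
    # analogous to fixing the number of KSP iterations.
    du, info = cg(J, -r, maxiter=25)
    u += du
```

Even with the inner solve truncated, the outer Newton iteration still converges; it just does so more slowly, which is the usual trade-off behind inexact Newton methods.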
On 08/11/18 13:01, Matthew Knepley wrote:
On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Martín" via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Dear Mark,
thanks for your quick and comprehensive reply.
Before moving to the results of the experiments that you suggested,
Dear Mark,
thanks for your quick and comprehensive reply.
Before moving to the results of the experiments that you suggested, let me
clarify two points
on my original e-mail and your answer:
(1) The raw timings and #iters. provided in my first e-mail were actually
obtained with
Dear Matt & Dave,
I really appreciate your support!
I was not aware of the existence of the DMSwarm object and it has been a
really exciting discovery, since in our research group we have people
working with PIC-type methods, and I suppose they will be grateful for your
work, Dave!
Hello Mark,
Yes, there are 5 KSP calls within a time-step (3 for the solution of the
momentum equation + 2 for the solution of pressure); this is the classical
non-iterative PISO by Issa (the exact sequence of operations is: solve
momentum implicitly, solve pressure-correction, solve momentum explicitly,