On Mon, Nov 5, 2018 at 4:11 PM Appel, Thibaut
wrote:
> "Local" as in serial?
>
Block Jacobi with ILU as the solver on each block. Each block corresponds
to an MPI process by default. So it is completely parallel; it is just not a
true ILU. In the limit of one equation per processor it is just
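The setup Mark describes (block Jacobi with an ILU sub-solver, one block per MPI rank) can be selected entirely from the command line with PETSc's standard options. A sketch; the block-count option is only needed if you want something other than the default of one block per process:

```shell
# Block Jacobi preconditioner with ILU on each local block
-pc_type bjacobi
-sub_pc_type ilu
# optional: request a specific number of blocks instead of one per rank
# -pc_bjacobi_blocks <n>
```

The `-sub_` prefix addresses the inner solver on each block, so e.g. `-sub_pc_factor_levels 1` would request ILU(1) on the blocks.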
"Local" as in serial?
Thibaut
On 5 Nov 2018, at 20:12, Mark Adams <mfad...@lbl.gov>
wrote:
On Mon, Nov 5, 2018 at 12:50 PM Thibaut Appel
<t.appe...@imperial.ac.uk> wrote:
Hi Mark,
Yes it doesn't seem to be usable. Unfortunately we're aiming to do 3D so direct
solvers are
On Mon, Nov 5, 2018 at 3:23 PM Justin Chang via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hi everyone,
>
> I am working on a generic AC optimal power flow solver, and I hope to use
> DMNetwork's data structure and TAO's optimization solvers for this purpose.
> Last time I inquired about IPM
On Mon, Nov 5, 2018 at 12:50 PM Thibaut Appel
wrote:
> Hi Mark,
>
> Yes it doesn't seem to be usable. Unfortunately we're aiming to do 3D, so
> direct solvers are not a viable solution, PETSc's ILU is not parallel, and
> we can't use HYPRE (complex arithmetic).
>
I think SuperLU has a parallel
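If the parallel direct solver being suggested here is SuperLU_DIST (which, unlike hypre, does work with complex scalars when PETSc is configured with `--with-scalar-type=complex` and `--download-superlu_dist`), it can be selected at run time roughly like this; this is a sketch, not a confirmation of which package the reply goes on to recommend:

```shell
# Use a parallel direct solve as the "preconditioner" with no outer Krylov iteration
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_type superlu_dist
```

In PETSc releases before 3.9 the last option was spelled `-pc_factor_mat_solver_package`. MUMPS (`-pc_factor_mat_solver_type mumps`) is another parallel direct solver that supports complex arithmetic.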
Hi Mark,
Yes it doesn't seem to be usable. Unfortunately we're aiming to do 3D, so
direct solvers are not a viable solution, PETSc's ILU is not parallel,
and we can't use HYPRE (complex arithmetic).
Thibaut
On 01/11/2018 20:42, Mark Adams wrote:
On Wed, Oct 31, 2018 at 8:11 PM Smith,
Fixed in the branch barry/fix-fortran-petscfileviewersetname soon to be
fixed in maint and then the next patch release of PETSc.
Thanks for the report
Barry
> On Nov 5, 2018, at 7:29 AM, Tim Steinhoff via petsc-users
> wrote:
>
> Dear PETSc Team,
>
> I am having the issue that
Sal Am via petsc-users writes:
> Hi,
>
> I am trying to solve an Ax=b complex system. The vector b and "matrix" A are
> both binary and NOT created by PETSc, so I keep getting error messages that
> they are not in the correct format when I read the files with PetscViewerBinaryOpen,
> after some digging it
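The usual cause of this error is that `PetscViewerBinaryOpen` expects PETSc's own binary layout, not raw arrays. As a sketch (the class-id constant below is taken from PETSc's headers and should be verified against your installation), a real-valued vector in PETSc binary format is a big-endian int32 class id, an int32 length, then float64 values:

```python
# Sketch of PETSc's binary vector layout (big-endian): int32 class id,
# int32 length, then float64 entries. VEC_FILE_CLASSID matches the value
# in PETSc's headers; verify it against your PETSc version.
import struct

VEC_FILE_CLASSID = 1211214  # PETSc's vector file magic number (assumption)

def write_petsc_vec(path, values):
    """Write a real-valued vector in PETSc binary format."""
    with open(path, "wb") as f:
        f.write(struct.pack(">ii", VEC_FILE_CLASSID, len(values)))
        f.write(struct.pack(">{}d".format(len(values)), *values))

def read_petsc_vec(path):
    """Read a vector back; fails if the header magic does not match."""
    with open(path, "rb") as f:
        classid, n = struct.unpack(">ii", f.read(8))
        assert classid == VEC_FILE_CLASSID, "not a PETSc binary vector"
        return list(struct.unpack(">{}d".format(n), f.read(8 * n)))
```

For a PETSc built with complex scalars, each entry is stored as a consecutive (real, imaginary) pair of doubles; matrices have a richer header (class id, rows, columns, nonzero count, row lengths, column indices, values).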
On Mon, Nov 5, 2018 at 8:41 AM Karol Lewandowski via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hi,
>
> I am solving a highly nonlinear problem using SNES solver. Under certain
> conditions during the iterations I already know that the step will diverge
> (in the next few iterations). Is
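One built-in way to cut off a run that is heading toward divergence is SNES's divergence tolerance (the factor by which the residual norm may grow over its initial value before SNES declares divergence), settable from the options database; a fully custom criterion can be installed with `SNESSetConvergenceTest`. A sketch of the option-based route, with illustrative values:

```shell
# Declare divergence once ||F|| exceeds this factor times the initial ||F||
-snes_divergence_tolerance 1e4
# Also cap the number of nonlinear iterations
-snes_max_it 20
```

After the solve, `SNESGetConvergedReason` reports whether the iteration stopped due to divergence, so the application can react (e.g. cut the load step) instead of wasting further iterations.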
Yingjie Wu via petsc-users writes:
> Thank you very much for your reply.
> My equation is a neutron diffusion equation with eigenvalues, which is why
> I use DMComposite, because there is a single non-physical field variable,
> the eigenvalue. I am not very familiar with FieldSplit. I can understand
On Mon, Nov 5, 2018 at 10:37 AM Yingjie Wu wrote:
> Thank you very much for your reply.
> My equation is a neutron diffusion equation with eigenvalues, which is why
> I use DMComposite, because there is a single non-physical field variable,
> the eigenvalue.
>
OK, DMComposite might be your best
Thank you very much for your reply.
My equation is a neutron diffusion equation with eigenvalues, which is why
I use DMComposite, because there is a single non-physical field variable,
the eigenvalue. I am not very familiar with FieldSplit; I will study it
first.
It seems not the problem of
DMComposite is not very mature, the last time I checked, and I don't know of
anyone having worked on it recently; it is probably not what you want
anyway. FieldSplit is most likely what you want.
What are your equations and discretization? E.g., Stokes with cell-centered
pressure? There are probably
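For a saddle-point system like cell-centered Stokes, the usual FieldSplit starting point is a Schur-complement split. A sketch of the options, assuming the fields themselves have been registered in the code (via a DM or `PCFieldSplitSetIS`):

```shell
# Schur-complement FieldSplit preconditioner (fields defined in the application)
-pc_type fieldsplit
-pc_fieldsplit_type schur
# approximate the Schur complement with the "selfp" pressure operator
-pc_fieldsplit_schur_precondition selfp
```

The inner solvers on each split are then controlled with the `-fieldsplit_0_` and `-fieldsplit_1_` option prefixes.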