Assuming you preallocate/assemble an MPIAIJ matrix “by hand” using an
ISLocalToGlobalMapping object obtained from a DMDA: what’s the easiest way to get
global column indices that do not belong to a process?
Say, if you have a finite-difference stencil and the stencil exceeds the
portion owned by the process.
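Something like this is what I have in mind; a minimal sketch (all names
hypothetical), assuming the DMDA’s local-to-global mapping is used to translate
local (ghosted) stencil indices into global column indices:

      ! Sketch: the DMDA mapping covers the owned portion PLUS the ghost
      ! points, so off-process stencil neighbours get valid global columns.
      DM                     :: da
      ISLocalToGlobalMapping :: ltog
      PetscInt               :: nst, idx_local(5), idx_global(5)
      PetscErrorCode         :: ierr

      CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
      ! idx_local: local indices of a 5-point stencil, ghosts included
      nst = 5
      CALL ISLocalToGlobalMappingApply(ltog,nst,idx_local,idx_global,ierr)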
Hi Matthew,
I’ve just attached to that issue page https://bitbucket.org/petsc/petsc/issues/262
the same source code written in C; it runs just fine as far as I can see.
That means there’s either a bug in ISLocalToGlobalMappingGetIndicesF90 or
something missing in the documentation.
Best,
Thibaut
Hi Matthew,
Yes, I need F90, and the syntax in the file / yours
PetscInt, pointer :: id_ltog(:)
is exactly the same as
PetscInt, dimension(:), pointer :: id_ltog
Both are correct modern Fortran.
Anyway, I tried both and I still get the same error message.
Thibaut
On 24 Feb 2019, at 23:38,
Is that for the id_ltog argument? I tried declaring it as IS, and you can’t
declare a deferred-shape array as a target. Same message. XXX(:) is syntactic
sugar for dimension(:) :: XXX.
I’m really confused because this
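To illustrate the equivalence, a tiny self-contained program (plain integer in
place of PetscInt) showing the two declaration forms behave identically:

      program decl_equiv
        implicit none
        integer, pointer               :: a(:)  ! attribute on the name
        integer, dimension(:), pointer :: b     ! dimension as an attribute
        allocate(a(3))
        a = [1, 2, 3]
        b => a          ! both are rank-1 deferred-shape pointers
        print *, b      ! prints 1 2 3
        deallocate(a)
      end program decl_equiv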
Hi Matthew,
With the following data declaration and routines calls:
DM :: da
ISLocalToGlobalMapping :: ltog
PetscInt, DIMENSION(:), POINTER :: id_ltog
CALL DMDACreate2D(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, &
     DMDA_STENCIL_STAR, (nx+1), (ny+1), &
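The rest of the sequence looks roughly like this; a hedged sketch, since the
trailing DMDACreate2D arguments are cut off above (the m/n/dof/s and lx/ly
values below, and the ierr declaration, are my assumptions):

      ! Assumed tail of the call: PETSC_DECIDE process grid, 1 dof,
      ! stencil width 1. With 64-bit indices use PetscInt variables
      ! instead of the literal 1s.
      CALL DMDACreate2D(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE, &
                        DMDA_STENCIL_STAR,(nx+1),(ny+1), &
                        PETSC_DECIDE,PETSC_DECIDE,1,1, &
                        PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr)
      CALL DMSetUp(da,ierr)
      CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
      ! this is the call that fails with the F90 interface:
      CALL ISLocalToGlobalMappingGetIndicesF90(ltog,id_ltog,ierr)
      ! ... use id_ltog(:) ...
      CALL ISLocalToGlobalMappingRestoreIndicesF90(ltog,id_ltog,ierr)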
Hi Matthew,
I think I need the whole G2L mapping, not just the sizes.
The plan: build my own matrix, create a “fake” DMDA, extract the G2L mapping,
and apply it to my preallocation/assembly routines (where I would basically
replace my natural ordering with the DMDA ordering from the G2L mapping)
(cf the mail I
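Roughly what I mean, as a sketch (the matrix A and the index/value arrays are
hypothetical; da, ltog, ierr as before):

      ! Sketch: attach the DMDA's G2L mapping to my own MPIAIJ matrix so
      ! assembly can stay in local (ghosted) indices.
      Mat         :: A
      PetscInt    :: ione, row_l(1), col_l(1)
      PetscScalar :: val(1)

      CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
      CALL MatSetLocalToGlobalMapping(A,ltog,ltog,ierr)
      ione = 1
      ! one hypothetical stencil entry, given in local indices
      CALL MatSetValuesLocal(A,ione,row_l,ione,col_l,val,INSERT_VALUES,ierr)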
Hi Barry,
> On 15 Nov 2018, at 18:16, Smith, Barry F. wrote:
>
>> On Nov 15, 2018, at 4:48 AM, Appel, Thibaut via petsc-users
>> wrote:
>>
>> Good morning,
>>
>> I would like to ask about the importance of the initial choice of
Good morning,
I would like to ask about the importance of the initial choice of ordering the
unknowns when feeding a matrix to PETSc.
I have a regular grid, use high-order finite differences, and I simply divide
the rows of the matrix with PetscSplitOwnership using vertex-major, natural
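Concretely, the splitting step looks like this; a sketch (the global size
formula is my assumption):

      ! Sketch: let PETSc choose this rank's share of the rows.
      PetscInt       :: nloc, nglob
      PetscErrorCode :: ierr

      nglob = (nx+1)*(ny+1)   ! total number of unknowns (assumed)
      nloc  = PETSC_DECIDE
      CALL PetscSplitOwnership(PETSC_COMM_WORLD,nloc,nglob,ierr)
      ! nloc now holds the local row count for this process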
>> maximum iterations=1, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=1.
>> left preconditioning
>> using DEFAULT norm type for convergence test
>> PC Object: (fieldsplit_2_) 1 MPI processes
>> type not yet set
> On Oct 31, 2018, at 5:39 PM, Appel, Thibaut via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
>
> Well, yes, naturally for the residual, but adding -ksp_true_residual just gives
>
> 0 KSP unpreconditioned resid norm 3.583290589961e+00 true resid norm
> 3.5
…what you meant? If you could, let me know what should be
corrected.
Thanks for your support,
Thibaut
On 31/10/2018 16:43, Mark Adams wrote:
On Tue, Oct 30, 2018 at 5:23 PM Appel, Thibaut via petsc-users
wrote:
Dear users,
Following a suggestion from Matthew Knepley I’ve been trying to apply
fieldsplit/gamg to my set of PDEs, but I’m still encountering issues despite
various tests: pc_gamg simply won’t start.
Note that direct solvers always yield the correct, physical result.
Removing the fieldsplit
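For reference, the kind of setup being attempted, as a rough sketch (the split
name 'u', the index set is_u, and the ksp/pc objects are placeholders, not the
actual configuration):

      ! Rough sketch of a fieldsplit PC with GAMG chosen on one split at
      ! run time; 'u' and is_u are placeholder names.
      CALL KSPGetPC(ksp,pc,ierr)
      CALL PCSetType(pc,PCFIELDSPLIT,ierr)
      CALL PCFieldSplitSetIS(pc,'u',is_u,ierr)
      ! then, e.g., on the command line:  -fieldsplit_u_pc_type gamg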