Hi Pierre,

I do not discount the possibility that I am doing something wrong.

However, this code has not changed, and it worked fine with all versions up 
to and including 3.16.

Can you please explain what changed in 3.17 that is causing this?

Are you saying that you now have to explicitly set each 3x3 dense block, 
even the unused ones, and that this was not the case before?
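
If it helps to be concrete, here is my reading of that suggestion as a
sketch (untested; i1 is a PetscInt equal to 1, ib is the block row/column
of an unused location, and vzero holds the 9 entries of one 3x3 block):

      PetscScalar :: vzero(9)
      PetscInt    :: ibrow(1), ibcol(1)

      vzero(:) = 0.0        ! explicit zeros for the whole 3x3 block
      ibrow(1) = ib         ! block row of the unused location
      ibcol(1) = ib         ! block column of the unused location
      call MatSetValuesBlocked(A,i1,ibrow,i1,ibcol,vzero,INSERT_VALUES,ierr)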

Randy

> On May 3, 2022, at 10:09 AM, Pierre Jolivet <[email protected]> wrote:
> 
>> On 3 May 2022, at 6:54 PM, Randall Mackie <[email protected]> wrote:
>> 
>> Hi Pierre,
>> 
>> Here is how I create the matrix, and how I’ve done it for many years:
>> 
>>       ! Create global matrix to hold system of equations resulting from
>>       ! finite discretization of the Maxwell equations.
>>       ngrow3=ginfo%nx*ginfo%ny*ginfo%nz*3
>>       call MatCreate(comm,A,ierr)
>>       call MatSetSizes(A,mloc3,mloc3,ngrow3,ngrow3,ierr)
>>       call MatSetBlockSize(A,i3,ierr)
> 
> I don’t know enough about your discretization stencil, but again, since a 
> simple call to MatConvert(A,MATBAIJ,MAT_INITIAL_MATRIX,C,ierr) fails in your 
> MWE, I doubt that this line is correct.
> 
>>       call MatSetType(A,MATAIJ,ierr)
>>       call MatSeqAIJSetPreallocation(A,i15,PETSC_NULL_INTEGER,ierr)
>>       call MatMPIAIJSetPreallocation(A,i15,PETSC_NULL_INTEGER,i7,PETSC_NULL_INTEGER,ierr)
>>       call MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE,ierr)
>> 
>> This is a staggered grid formulation (but derived many years before PETSc 
>> had a DMStag class), so I use a regular DMDA. For those locations that are 
>> not involved, I put a 1 on the diagonal.
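>> 
>> For the unused locations that is just, schematically (where one is a
>> PetscScalar set to 1 and row is the global row of an unused point):
>> 
>>       call MatSetValue(A,row,row,one,INSERT_VALUES,ierr)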
> 
> If you want to use a block size of dimension 3, you also need to explicitly 
> set zeros (with MatSetValue, MatSetValues, or MatSetValuesBlocked) for all 
> 3x3 dense blocks.
> 
> Thanks,
> Pierre 
> 
>> Randy
>> 
>>> On May 3, 2022, at 9:37 AM, Pierre Jolivet <[email protected]> wrote:
>>> 
>>> Thanks for the reproducer.
>>> My guess is that your AIJ matrix does not have a block size of 3.
>>> A simple call such as: call MatConvert(A,MATBAIJ,MAT_INITIAL_MATRIX,C,ierr) 
>>> is also failing, while it shouldn’t if your AIJ Mat is truly made of 3x3 
>>> dense blocks.
>>> How did you determine the block size of your Mat?
>>> Are you allocating 3x3 dense blocks everywhere or are you skipping zero 
>>> coefficients in your AIJ Mat?
>>> In the meantime, you can bypass the issue by not setting a block size of 3 
>>> on your Mat, or by setting different block sizes for the row and column 
>>> distributions, see MatSetBlockSizes().
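>>> 
>>> For example, instead of MatSetBlockSize(A,i3,ierr), that could be
>>> something along these lines (an untested sketch, where i1 would be a
>>> PetscInt equal to 1):
>>> 
>>>       call MatSetBlockSizes(A,i3,i1,ierr)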
>>> 
>>> Thanks,
>>> Pierre
>>> 
>>>> On 3 May 2022, at 5:39 PM, Randall Mackie <[email protected]> wrote:
>>>> 
>>>> Dear PETSc team:
>>>> 
>>>> A part of our code that has worked for years with previous versions is 
>>>> now failing with the latest version, 3.17.1, on the KSP solve, with the 
>>>> following error:
>>>> 
>>>> [0]PETSC ERROR: --------------------- Error Message 
>>>> --------------------------------------------------------------
>>>> [0]PETSC ERROR: Invalid argument
>>>> [0]PETSC ERROR: Block size 3 is incompatible with the indices: non 
>>>> consecutive indices 153055 153124
>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Release Version 3.17.1, Apr 28, 2022 
>>>> [0]PETSC ERROR: ./test on a linux-gfortran-complex-debug named 
>>>> rmackie-VirtualBox by rmackie Tue May  3 08:12:15 2022
>>>> [0]PETSC ERROR: Configure options --with-clean=1 
>>>> --with-scalar-type=complex --with-debugging=1 --with-fortran=1 
>>>> --download-mpich=../external/mpich-4.0.1.tar.gz
>>>> [0]PETSC ERROR: #1 ISSetBlockSize() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/vec/is/is/interface/index.c:1898
>>>> [0]PETSC ERROR: #2 MatIncreaseOverlap() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/mat/interface/matrix.c:7086
>>>> [0]PETSC ERROR: #3 PCSetUp_ASM() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/ksp/pc/impls/asm/asm.c:238
>>>> [0]PETSC ERROR: #4 PCSetUp() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/ksp/pc/interface/precon.c:990
>>>> [0]PETSC ERROR: #5 KSPSetUp() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:407
>>>> [0]PETSC ERROR: #6 KSPSolve_Private() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:843
>>>> [0]PETSC ERROR: #7 KSPSolve() at 
>>>> /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:1078
>>>> 
>>>> 
>>>> I have a small test program, with a binary matrix and right-hand side, 
>>>> that reproduces the problem. I can send it as a zip file; please advise 
>>>> what email address to use or where to send it.
>>>> 
>>>> Thanks, 
>>>> 
>>>> Randy M.
>>> 
>> 
> 
