Dear Mark,

Thanks for your response. I found the bug: I passed the wrong M and N to MatCreateShell(). Now it works in both the uniprocessor and mpirun cases.
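
For reference, a minimal sketch of the corrected layout of that call (not the exact code; it reuses the variable names quoted further down in this thread): MatCreateShell() takes the local sizes m and n before the global sizes M and N, so the per-rank size belongs in the local arguments and the global sizes can be left to PETSc:

  ! Hedged sketch: pass the per-rank (local) sizes explicitly and let PETSc
  ! determine the global sizes M and N.
  call MatCreateShell(PETSC_COMM_WORLD, &
                      int(9*product(cells(1:2))*cells3,pPETSCINT), &  ! local rows m
                      int(9*product(cells(1:2))*cells3,pPETSCINT), &  ! local cols n
                      PETSC_DETERMINE, PETSC_DETERMINE, &             ! global M, N
                      F_PETSc,Jac_PETSc,err_PETSc)
  CHKERRQ(err_PETSc)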

Best,
Yi

On 3/11/24 13:10, Mark Adams wrote:
I think you misunderstand the "global" vector. The global solution is never held entirely on one processor. A global vector in this case has 81 global values and 27 local values on each rank. It looks like you created a "global" vector with 81 local values, which should never be created other than for debugging. GlobalToLocal refers to a "local" vector with ghost cells, so it would have more than 27 values in this case, and you use it for local operations only (its communicator is PETSC_COMM_SELF).
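
A quick way to see the two kinds of vectors (a hedged sketch, not code from your application; dm stands for your DMDA) is to compare the sizes PETSc reports:

! Hedged sketch: global vs. ghosted local vector sizes for a DMDA dm.
! With the sizes above, nGlobal is 81 and nOwned is 27 on each of 3 ranks;
! the ghosted local vector has more than 27 entries and lives on PETSC_COMM_SELF.
Vec            :: v_global, v_local
PetscInt       :: nGlobal, nOwned, nGhosted
PetscErrorCode :: ierr

call DMGetGlobalVector(dm,v_global,ierr)
call DMGetLocalVector(dm,v_local,ierr)
call VecGetSize(v_global,nGlobal,ierr)        ! 81
call VecGetLocalSize(v_global,nOwned,ierr)    ! 27 per rank
call VecGetSize(v_local,nGhosted,ierr)        ! > 27 (owned + ghost cells)
call DMRestoreLocalVector(dm,v_local,ierr)
call DMRestoreGlobalVector(dm,v_global,ierr)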
Hope this helps,
Mark

On Mon, Mar 11, 2024 at 4:48 AM Yi Hu <[email protected]> wrote:

    Dear Barry,

    Thanks for your response. I am now doing some simple debugging of the
    customized mat_mult_op of my shell Jacobian matrix. As far as I
    understand, because the input and output of the shell Jacobian are
    global vectors, I need to do the global-to-local mapping of my input
    vector (dF) myself. Before debugging the mapping, I first tried to
    verify that the input and output sizes match.

    Since the input and output of a mat_mult_op of my shell matrix should
    have the same size, I tried simply equating them in my customized
    mat_mult_op, basically like this:

    subroutine GK_op(Jac,dF_global,output_global,err_PETSc)

      Mat                                   :: Jac
      DM                                    :: dm_local ! Yi: later for is, ie
      Vec                                   :: dF_global
      Vec                                   :: output_global
      PetscErrorCode                        :: err_PETSc

      real(pREAL), pointer, dimension(:,:,:,:) :: dF_scal, output_scal

      call SNESGetDM(SNES_mech,dm_local,err_PETSc)
      CHKERRQ(err_PETSc)

      output_global = dF_global

    end subroutine GK_op

    When I run with mpirun -np 3, it gives me an error similar to the
    previous one, "Preconditioner number of local rows 27 does not equal
    input vector size 81" (I changed my problem size, so the numbers are
    different).

    Maybe simply equating input and output is not valid (because each rank
    owns only a portion of a global vector), in which case I would expect
    a different error message. In fact, my global dF has size 9*3*3*3;
    when running on 3 processors, the local dF has size 9*3*3*1 (I split
    my domain in the z direction). The error message seems to suggest I am
    using a local dF rather than a global dF, and the output and input
    vector sizes seem to differ. Am I missing something here?
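
    Perhaps I should be using VecCopy() instead of a plain assignment,
    since assigning one Vec handle to another presumably does not copy any
    values into the vector that MatMult passes in. Something like this
    (just a sketch of what I have in mind):

    ! copy the input values into the output vector of the same layout
    call VecCopy(dF_global,output_global,err_PETSc)
    CHKERRQ(err_PETSc)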

    Best regards,

    Yi

    *From:* Barry Smith <[email protected]>
    *Sent:* Sunday, March 10, 2024 6:45 PM
    *To:* Yi Hu <[email protected]>
    *Cc:* Mark Adams <[email protected]>; petsc-users
    <[email protected]>
    *Subject:* Re: [petsc-users] snes matrix-free jacobian fails with
    preconditioner shape mismatch



        On Mar 10, 2024, at 10:16 AM, Yi Hu <[email protected]> wrote:


        Dear Mark,

        Thanks for your reply. I see this mismatch. In fact, my global DoF
        count is 324, and it seems I always get local size = global DoF /
        np^2, where np is my number of processors. By the way, I used
        DMDASNESSetFunctionLocal() to set my form function. Is it valid to
        mix DMDASNESSetFunctionLocal() with a native SNESSetJacobian()?

    Yes
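
    A minimal sketch of that combination, using the names from this thread
    (the local residual callback formResidual is a placeholder):

    call SNESSetDM(SNES_mech,DM_mech,err_PETSc)
    call DMDASNESSetFunctionLocal(DM_mech,INSERT_VALUES,formResidual,PETSC_NULL_SNES,err_PETSc)
    call SNESSetJacobian(SNES_mech,Jac_PETSc,Jac_PETSc,PETSC_NULL_FUNCTION,0,err_PETSc)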



        Best,

        Yi

        On 3/10/24 13:55, Mark Adams wrote:

            It looks like your input vector is the global vector, size
            162, and the local matrix size is 81.

            Mark

            [1]PETSC ERROR: Preconditioner number of local rows 81 does not equal input vector size 162

            On Sun, Mar 10, 2024 at 7:21 AM Yi Hu <[email protected]> wrote:


                Dear PETSc team,

                I implemented a matrix-free Jacobian, and it runs fine sequentially. But when running in parallel I get the PC error below (running with mpirun -np 2; only the error from rank 1 is shown here).

                [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
                [1]PETSC ERROR: Nonconforming object sizes
                [1]PETSC ERROR: Preconditioner number of local rows 81 does not equal input vector size 162
                [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
                [1]PETSC ERROR: Petsc Release Version 3.17.3, Jun 29, 2022
                [1]PETSC ERROR: /home/yi/workspace/DAMASK_yi/bin/DAMASK_grid on a arch-linux-c-opt named carbon-x1 by yi Sun Mar 10 12:01:46 2024
                [1]PETSC ERROR: Configure options --download-fftw --download-hdf5 --with-hdf5-fortran-bindings --download-fblaslapack --download-chaco --download-hypre --download-metis --download-mumps --download-parmetis --download-scalapack --download-suitesparse --download-superlu --download-superlu_dist --download-triangle --download-zlib --download-cmake --with-cxx-dialect=C++11 --with-c2html=0 --with-debugging=0 --with-ssl=0 --with-x=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3
                [1]PETSC ERROR: #1 PCApply() at /home/yi/App/petsc-3.17.3/src/ksp/pc/interface/precon.c:424
                [1]PETSC ERROR: #2 KSP_PCApply() at /home/yi/App/petsc-3.17.3/include/petsc/private/kspimpl.h:376
                [1]PETSC ERROR: #3 KSPInitialResidual() at /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itres.c:64
                [1]PETSC ERROR: #4 KSPSolve_GMRES() at /home/yi/App/petsc-3.17.3/src/ksp/ksp/impls/gmres/gmres.c:242
                [1]PETSC ERROR: #5 KSPSolve_Private() at /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itfunc.c:902
                [1]PETSC ERROR: #6 KSPSolve() at /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itfunc.c:1078
                [1]PETSC ERROR: #7 SNESSolve_NEWTONLS() at /home/yi/App/petsc-3.17.3/src/snes/impls/ls/ls.c:222
                [1]PETSC ERROR: #8 SNESSolve() at /home/yi/App/petsc-3.17.3/src/snes/interface/snes.c:4756
                [1]PETSC ERROR: #9 User provided function() at User file:0

                However, the SNES matrix-free documentation (https://petsc.org/release/manual/snes/#matrix-free-methods) says matrix-free methods are used with PCNONE. So I assumed no preconditioner would be applied, but preconditioning is applied anyway, probably using my matrix-free shell matrix itself. Here is how I initialize my shell matrix and the corresponding customized multiplication:

                  call MatCreateShell(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,&
                                      int(9*product(cells(1:2))*cells3,pPETSCINT),&
                                      int(9*product(cells(1:2))*cells3,pPETSCINT),&
                                      F_PETSc,Jac_PETSc,err_PETSc)
                  call MatShellSetOperation(Jac_PETSc,MATOP_MULT,GK_op,err_PETSc)
                  call SNESSetDM(SNES_mech,DM_mech,err_PETSc)
                  call SNESSetJacobian(SNES_mech,Jac_PETSc,Jac_PETSc,PETSC_NULL_FUNCTION,0,err_PETSc)
                  call SNESGetKSP(SNES_mech,ksp,err_PETSc)
                  call PCSetType(pc,PCNONE,err_PETSc)
                And my GK_op looks like this:

                subroutine GK_op(Jac,dF_global,output_local,err_PETSc)

                  DM                                   :: dm_local
                  Vec                                  :: dF_global, dF_local, output_local
                  Mat                                  :: Jac
                  PetscErrorCode                       :: err_PETSc

                  real(pREAL), pointer,dimension(:,:,:,:) :: dF_scal, output_scal

                  real(pREAL), dimension(3,3,cells(1),cells(2),cells3) :: &
                    dF
                  real(pREAL), dimension(3,3,cells(1),cells(2),cells3) :: &
                    output

                  call SNESGetDM(SNES_mech,dm_local,err_PETSc)
                  call DMGetLocalVector(dm_local,dF_local,err_PETSc)
                  call DMGlobalToLocalBegin(dm_local,dF_global,INSERT_VALUES,dF_local,err_PETSc)
                  call DMGlobalToLocalEnd(dm_local,dF_global,INSERT_VALUES,dF_local,err_PETSc)
                  call DMDAVecGetArrayReadF90(dm_local,dF_local,dF_scal,err_PETSc)

                  dF = reshape(dF_scal, [3,3,cells(1),cells(2),cells3])

                  .......

                  call DMDAVecRestoreArrayF90(dm_local,output_local,output_scal,err_PETSc)
                  CHKERRQ(err_PETSc)
                  call DMDAVecRestoreArrayF90(dm_local,dF_local,dF_scal,err_PETSc)
                  CHKERRQ(err_PETSc)

                end subroutine GK_op
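
                (For the output side, my understanding is that the result vector passed to GK_op is a global vector despite its name, so its owned entries could be filled directly, roughly like this:)

                  ! sketch: fill the owned block of the global result vector;
                  ! no LocalToGlobal is needed if nothing is written to ghost points
                  call DMDAVecGetArrayF90(dm_local,output_local,output_scal,err_PETSc)
                  output_scal = reshape(output, shape(output_scal))
                  call DMDAVecRestoreArrayF90(dm_local,output_local,output_scal,err_PETSc)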

                I checked my cells3: it corresponds to my local size, and the local size of dF_local seems to be fine.

                I am a bit lost trying to find the reason for this preconditioner error. Could you help me with it? Thanks.

                Best regards,

                Yi



-------------------------------------------------
Stay up to date and follow us on LinkedIn, Twitter and YouTube.

Max-Planck-Institut für Eisenforschung GmbH
Max-Planck-Straße 1
D-40237 Düsseldorf

Handelsregister B 2533 Amtsgericht Düsseldorf

Geschäftsführung
Prof. Dr. Gerhard Dehm
Prof. Dr. Jörg Neugebauer
Prof. Dr. Dierk Raabe
Dr. Kai de Weldige

Ust.-Id.-Nr.: DE 11 93 58 514 Steuernummer: 105 5891 1000


Please consider that invitations and e-mails of our institute are only valid if they end with …@mpie.de. If you are not sure of the validity please contact [email protected]

Bitte beachten Sie, dass Einladungen zu Veranstaltungen und E-Mails
aus unserem Haus nur mit der Endung …@mpie.de gültig sind. In Zweifelsfällen wenden Sie sich bitte an [email protected]
-------------------------------------------------
