On 21/2/2018 9:12 AM, Matthew Knepley wrote:
On Tue, Feb 20, 2018 at 8:08 PM, TAY wee-beng <zon...@gmail.com> wrote:


    On 21/2/2018 9:00 AM, Matthew Knepley wrote:
    On Tue, Feb 20, 2018 at 7:54 PM, TAY wee-beng <zon...@gmail.com> wrote:

        Hi,

        When I run my CFD code with a grid size of 1119x1119x499
        (total grid size = 624828339), I got an error saying I
        need to compile PETSc with 64-bit indices.

        So I compiled PETSc again and then compiled my CFD
        code with the newly compiled PETSc. However, now I get a
        segmentation error:

        rm: cannot remove `log': No such file or directory
        [409]PETSC ERROR:
        ------------------------------------------------------------------------
        [409]PETSC ERROR: [535]PETSC ERROR: [410]PETSC ERROR:
        ------------------------------------------------------------------------
        [410]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
        Violation, probably memory access out of range
        [410]PETSC ERROR: Try option -start_in_debugger or
        -on_error_attach_debugger
        [410]PETSC ERROR: [536]PETSC ERROR:
        ------------------------------------------------------------------------
        [536]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
        Violation, probably memory access out of range
        [536]PETSC ERROR: Try option -start_in_debugger or
        -on_error_attach_debugger
        [536]PETSC ERROR: or see
        http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
        [536]PETSC ERROR: or try http://valgrind.org on GNU/linux and
        Apple Mac OS X to find memory corruption errors
        [536]PETSC ERROR: likely location of problem given in stack below
        [536]PETSC ERROR: ---------------------  Stack Frames
        ------------------------------------
        [536]PETSC ERROR: Note: The EXACT line numbers in the stack
        are not available,
        [536]PETSC ERROR:       INSTEAD the line number of the start
        of the function
        [536]PETSC ERROR:       is given.
        [536]PETSC ERROR: [536] DMDACheckOwnershipRanges_Private line
        581
        /home/users/nus/tsltaywb/source/petsc-3.7.6/src/dm/impls/da/da.c
        [536]PETSC ERROR: or see
        http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
        [410]PETSC ERROR: or try http://valgrind.org on GNU/linux and
        Apple Mac OS X to find memory corruption errors
        [410]PETSC ERROR: likely location of problem given in stack below
        [410]PETSC ERROR: ---------------------  Stack Frames
        ------------------------------------
        [410]PETSC ERROR: Note: The EXACT line numbers in the stack
        are not available,
        [897]PETSC ERROR: [536] DMDASetOwnershipRanges line 613
        /home/users/nus/tsltaywb/source/petsc-3.7.6/src/dm/impls/da/da.c
        [536]PETSC ERROR: [536] DMDACreate3d line 1434
        /home/users/nus/tsltaywb/source/petsc-3.7.6/src/dm/impls/da/da3.c
        [536]PETSC ERROR: --------------------- Error Message
        --------------------------------------------------------------

        The CFD code worked previously, but increasing the problem
        size results in a segmentation error. It seems to be related to
        DMDACreate3d and DMDASetOwnershipRanges. Any idea where the
        problem lies?

        Besides, I want to know when and why I have to use PETSc
        with 64-bit indices.


    1) A signed 32-bit integer can hold numbers only up to 2^31 - 1 ≈ 2.1e9,
    so if you have a 3D velocity, pressure, and energy, you already have
    about 3.1e9 unknowns, before you even start to count nonzero entries
    in the matrix. 64-bit integers allow you to handle these big sizes.
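    For the grid above, a quick check (taking 5 unknowns per point:
    three velocity components, pressure, and energy):

        1119 x 1119 x 499 = 624,828,339 grid points
        624,828,339 x 5   ≈ 3.12e9 unknowns  >  2^31 - 1 = 2,147,483,647

    So the unknown count alone already overflows a signed 32-bit index
    before the matrix is assembled; that is what the --with-64-bit-indices
    configure option (visible in the logs below) is for.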

        Also, can I use the 64-bit-indices version with smaller-sized
        problems?


    2) Yes

        And is there a speed difference between using the 32-bit and
        64-bit indices versions?


    3) I have seen no evidence of this

    4) My guess is that you have declared regular integers in your
    code and passed them to PETSc, rather than using PetscInt as the
    type.

    Oh, that seems probable. So I am still using integer(4) where it
    should be integer(8) for some values, is that right? If I use
    PetscInt, is it the same as integer(8)? Or does it depend on the
    actual number?


PetscInt will be integer(4) if you configure with 32-bit ints, and integer(8) if you configure with 64-bit ints. If you use it consistently, you can avoid problems
with matching the PETSc API.
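
As a concrete example, here is a minimal sketch of consistent declarations
in the PETSc 3.8 Fortran style (the grid sizes are from the first message;
the dof and stencil-width values are placeholders, not your code's; compile
as a .F90 file so the preprocessor handles the #include):

    ! Minimal sketch: everything passed to PETSc is declared PetscInt,
    ! so the same source builds against 32-bit and 64-bit PETSc.
    program dda_test
    #include <petsc/finclude/petscdmda.h>
          use petscdmda
          implicit none
          DM             :: da
          PetscErrorCode :: ierr
          PetscInt       :: nx, ny, nz, dof, sw

          call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
          nx = 1119; ny = 1119; nz = 499
          dof = 5                 ! e.g. u,v,w,p,E (placeholder)
          sw  = 1                 ! stencil width (placeholder)
          call DMDACreate3d(PETSC_COMM_WORLD,                          &
               DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,   &
               DMDA_STENCIL_STAR, nx, ny, nz,                          &
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, dof, sw,      &
               PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,                 &
               PETSC_NULL_INTEGER, da, ierr)
          call DMSetUp(da, ierr)  ! required since PETSc 3.8
          call DMDestroy(da, ierr)
          call PetscFinalize(ierr)
    end program dda_test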

    I wonder, if I replace all my integers with PetscInt, will there be a
    large increase in memory usage, because every integer(4) becomes an
    integer(8)?


Only if you have large integer storage. Most codes do not.
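For example (rough numbers): an integer work array with one entry per grid
point of the mesh above (6.2e8 entries) grows from about 2.5 GB to about
5 GB when integer(4) becomes integer(8), while scalar counters and loop
indices contribute essentially nothing.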
Hi,

What do you mean by "large integer storage"?

Btw, I got the following error when I ran a simple small test case with my CFD code:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Out of memory. This could be due to allocating
[0]PETSC ERROR: too large an object or bleeding by not properly
[0]PETSC ERROR: destroying unneeded objects.
[0]PETSC ERROR: Memory allocated 0 Memory used by process 52858880
[0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
[0]PETSC ERROR: Memory requested 6917565139726106624
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
[0]PETSC ERROR: ./a.out on a petsc-3.8.3_intel_64_rel named nus02 by tsltaywb Thu Feb 22 10:34:29 2018
[0]PETSC ERROR: Configure options --with-mpi-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64 --with-blaslapack-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64 --download-hypre=/home/users/nus/tsltaywb/source/git.hypre.tar.gz --with-debugging=0 --prefix=/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel_64_rel --with-shared-libraries=0 --known-mpi-shared-libraries=0 --with-fortran-interfaces=1 --CFLAGS="-xHost -g -O3" --CXXFLAGS="-xHost -g -O3" --FFLAGS="-xHost -g -O3" --with-64-bit-indices
[0]PETSC ERROR: #105 DMSetUp_DA() line 18 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/impls/da/dareg.c
[0]PETSC ERROR: #106 DMSetUp_DA() line 18 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/impls/da/dareg.c
[0]PETSC ERROR: #107 DMSetUp() line 720 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/interface/dm.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Arguments are incompatible
[0]PETSC ERROR: Ownership ranges sum to 4294967337 but global dimension is 41
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
[0]PETSC ERROR: ./a.out on a petsc-3.8.3_intel_64_rel named nus02 by tsltaywb Thu Feb 22 10:34:29 2018
[0]PETSC ERROR: Configure options --with-mpi-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64 --with-blaslapack-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64 --download-hypre=/home/users/nus/tsltaywb/source/git.hypre.tar.gz --with-debugging=0 --prefix=/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel_64_rel --with-shared-libraries=0 --known-mpi-shared-libraries=0 --with-fortran-interfaces=1 --CFLAGS="-xHost -g -O3" --CXXFLAGS="-xHost -g -O3" --FFLAGS="-xHost -g -O3" --with-64-bit-indices
[0]PETSC ERROR: #108 DMDACheckOwnershipRanges_Private() line 548 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/impls/da/da.c
[0]PETSC ERROR: #109 DMDASetOwnershipRanges() line 580 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/impls/da/da.c
[0]PETSC ERROR: #110 DMDACreate3d() line 1444 in /home/users/nus/tsltaywb/source/petsc-3.8.3/src/dm/impls/da/da3.c

What's the problem?

Thanks.
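
A note on the numbers above: 4294967337 = 2^32 + 41, which matches the
"global dimension is 41" in the same message. That pattern is what you
get when PETSc, built with 8-byte indices, reads an array the caller
declared with 4-byte integers: two adjacent 4-byte words (here likely a
41 with a neighboring 1 in the high word) are fused into one 8-byte
value, and garbage high words likewise explain the absurd "Memory
requested 6917565139726106624". A minimal sketch of that kind of
mismatch (the array names and px, py, pz are only illustrative):

    ! WRONG under --with-64-bit-indices: PETSc reads 8 bytes per
    ! entry, fusing adjacent 4-byte values into huge numbers
    integer(4) :: lx(px), ly(py), lz(pz)

    ! RIGHT: the kind always matches the PETSc build
    PetscInt   :: lx(px), ly(py), lz(pz)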


  Thanks,

    Matt

    Thanks.

      Thanks,

         Matt


--
Thank you very much.

        Yours sincerely,

        ================================================
        TAY Wee-Beng (Zheng Weiming) 郑伟明
        Personal research webpage: http://tayweebeng.wixsite.com/website
        Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
        linkedin: www.linkedin.com/in/tay-weebeng
        ================================================








--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
