I have it set up as:
DMCompositeCreate(PETSC_COMM_WORLD, &user.packer);
/* 3 redundant scalars (the nonlinear eigenvalue parameters), owned by rank 0 */
DMRedundantCreate(PETSC_COMM_WORLD, 0, 3, &user.p_dm);
DMCompositeAddDM(user.packer, user.p_dm);
/* 1d mesh, 4 degrees of freedom per point, stencil width 1 */
DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_GHOSTED, nx, 4, 1, NULL, &user.Q_dm);
DMCompositeAddDM(user.packer, user.Q_dm);
DMCreateGlobalVector(user.packer, &U);
where the user structure has
DM packer;
DM p_dm, Q_dm;
Q_dm holds the field variables and p_dm holds the scalar values (the nonlinear
eigenvalues).
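For reference, the two sub-vectors are then pulled out of the packed vector
along these lines (a sketch; error checking omitted):
Vec p_vec, Q_vec;
DMCompositeGetAccess(user.packer, U, &p_vec, &Q_vec);
/* read or write the scalar parameters (p_vec) and field variables (Q_vec) */
DMCompositeRestoreAccess(user.packer, U, &p_vec, &Q_vec);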
Here are some of the errors that are generated:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: New nonzero at (0,3) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.3, unknown
[0]PETSC ERROR: ./blowup_batch2 on a arch-macports named gs_air by gideon Thu Aug 27 22:40:54 2015
[0]PETSC ERROR: Configure options --prefix=/opt/local
--prefix=/opt/local/lib/petsc --with-valgrind=0 --with-shared-libraries
--with-debugging=0 --with-c2html-dir=/opt/local --with-x=0
--with-blas-lapack-lib=/System/Library/Frameworks/Accelerate.framework/Versions/Current/Accelerate
--with-hwloc-dir=/opt/local --with-suitesparse-dir=/opt/local
--with-superlu-dir=/opt/local --with-metis-dir=/opt/local
--with-parmetis-dir=/opt/local --with-scalapack-dir=/opt/local
--with-mumps-dir=/opt/local --with-superlu_dist-dir=/opt/local
CC=/opt/local/bin/mpicc-mpich-mp CXX=/opt/local/bin/mpicxx-mpich-mp
FC=/opt/local/bin/mpif90-mpich-mp F77=/opt/local/bin/mpif90-mpich-mp
F90=/opt/local/bin/mpif90-mpich-mp COPTFLAGS=-Os CXXOPTFLAGS=-Os FOPTFLAGS=-Os
LDFLAGS="-L/opt/local/lib -Wl,-headerpad_max_install_names"
CPPFLAGS=-I/opt/local/include CFLAGS="-Os -arch x86_64" CXXFLAGS=-Os FFLAGS=-Os
FCFLAGS=-Os F90FLAGS=-Os PETSC_ARCH=arch-macports
--with-mpiexec=mpiexec-mpich-mp
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 530 in
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.3/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Inserting a new nonzero (40003, 0) into matrix
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.3, unknown
[1]PETSC ERROR: ./blowup_batch2 on a arch-macports named gs_air by gideon Thu Aug 27 22:40:54 2015
[1]PETSC ERROR: Configure options (identical to those shown in the rank 0 message above)
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 561 in
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.3/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #2 MatSetValues() line 1135 in
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.3/src/mat/interface/matrix.c
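As the rank 0 message suggests, the allocation check can be disabled while the
preallocation is being debugged (a sketch; J stands for whatever the Jacobian
matrix is named in the code):
MatSetOption(J, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);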
-gideon
> On Aug 27, 2015, at 10:37 PM, Barry Smith <[email protected]> wrote:
>
>
> We need the full error message.
>
> But are you using a DMDA for the scalars? You should not be; you should be
> using a DMRedundant for the scalars.
>
> Barry
>
> Though you should not get this error even if you are using a DMDA there.
>
>> On Aug 27, 2015, at 9:32 PM, Gideon Simpson <[email protected]> wrote:
>>
>> I’m getting the following errors:
>>
>> [1]PETSC ERROR: Argument out of range
>> [1]PETSC ERROR: Inserting a new nonzero (40003, 0) into matrix
>>
>> Could this have to do with my using the DMComposite, with one da holding the
>> scalar parameters and the other holding the field variables?
>>
>> -gideon
>>
>>> On Aug 27, 2015, at 10:15 PM, Matthew Knepley <[email protected]> wrote:
>>>
>>> On Thu, Aug 27, 2015 at 9:11 PM, Gideon Simpson <[email protected]>
>>> wrote:
>>> HI Barry,
>>>
>>> Nope, I’m not doing any grid sequencing. Clearly that makes a lot of sense:
>>> solve on a spatially coarse mesh for the field variables, interpolate onto
>>> the finer mesh, and then solve again. I’m not entirely clear on the
>>> practical implementation.
>>>
>>> SNES should do this automatically using -snes_grid_sequence <k>. If this
>>> does not work, complain. Loudly.
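>>>
>>> For example (a sketch; 3 refinement levels is arbitrary, and the executable
>>> name is taken from the error log above):
>>>
>>> ./blowup_batch2 -snes_grid_sequence 3
>>>
>>> or, equivalently in code (given a SNES object snes), SNESSetGridSequence(snes, 3);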
>>>
>>> Matt
>>>
>>> -gideon
>>>
>>>> On Aug 27, 2015, at 10:02 PM, Barry Smith <[email protected]> wrote:
>>>>
>>>>
>>>> Gideon,
>>>>
>>>> Are you using grid sequencing? Simply solve on a coarse grid,
>>>> interpolate u1 and u2 to a once-refined version of the grid, and use
>>>> that plus the mu and lam as the initial guess for the next level. Repeat
>>>> to as fine a grid as you want. You can use DMRefine() and
>>>> DMGetInterpolation() to get the interpolation needed to interpolate from
>>>> the coarse to the finer mesh.
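>>>>
>>>> A minimal sketch of one refine-and-interpolate step (illustrative
>>>> variable names; note that in PETSc 3.5 DMGetInterpolation() is named
>>>> DMCreateInterpolation()):
>>>>
>>>> DM  dm_fine;
>>>> Mat interp;
>>>> Vec u_fine;
>>>> DMRefine(dm_coarse, PETSC_COMM_WORLD, &dm_fine);
>>>> DMCreateInterpolation(dm_coarse, dm_fine, &interp, NULL);
>>>> DMCreateGlobalVector(dm_fine, &u_fine);
>>>> MatInterpolate(interp, u_coarse, u_fine); /* coarse solution -> fine initial guess */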
>>>>
>>>> Then and only then can you use multigrid (with or without fieldsplit) to
>>>> solve the linear problems on the finer meshes. Once you have the grid
>>>> sequencing working, we can help you with this.
>>>>
>>>> Barry
>>>>
>>>>> On Aug 27, 2015, at 7:00 PM, Gideon Simpson <[email protected]>
>>>>> wrote:
>>>>>
>>>>> I’m working on a problem which, morally, can be posed as a system of
>>>>> coupled semilinear elliptic PDEs together with unknown nonlinear
>>>>> eigenvalue parameters, loosely of the form
>>>>>
>>>>> -\Delta u_1 + f(u_1, u_2) = lam * u_1 - mu * du_2/dx
>>>>> -\Delta u_2 + g(u_1, u_2) = lam * u_2 + mu * du_1/dx
>>>>>
>>>>> Currently, I have it set up with a DMComposite with two sub da’s, one for
>>>>> the parameters (lam, mu), and one for the vector field (u_1, u_2) on the
>>>>> mesh. I have had success in solving this as a fully coupled system with
>>>>> SNES + sparse direct solvers (MUMPS, SuperLU).
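>>>>> (With options along the lines of "-ksp_type preonly -pc_type lu
>>>>> -pc_factor_mat_solver_package mumps"; a sketch of one way a direct
>>>>> solver is selected in PETSc 3.5, not necessarily what is used here.)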
>>>>>
>>>>> Lately, I am finding that, when the mesh resolution gets fine enough
>>>>> (i.e. 10^6-10^8 lattice points), my SNES gets stuck with the function
>>>>> norm = O(10^{-4}), eventually returning reason -6 (failed line search).
>>>>>
>>>>> Perhaps there is another way around the above problem, but one thing I
>>>>> was thinking of trying is to get away from direct solvers, and I was
>>>>> hoping to use fieldsplit for this. However, it’s a bit beyond the
>>>>> examples I’ve seen, because it has two types of variables: scalar
>>>>> parameters, which appear globally in the system, and vector-valued
>>>>> field variables. Any suggestions on how to get started?
>>>>>
>>>>> -gideon
>>>>>
>>>>
>>>
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>
>