Never mind, it was my fault for assuming the error was with u_abs rather than u. When converting the uniprocessor example to an MPI program, I switched from setting the initial conditions through a local array to using VecSetValues. While I removed VecRestoreArray and replaced the u_local[*ptr] assignments with VecSetValues, I forgot to add the corresponding VecAssemblyBegin/VecAssemblyEnd calls. Thanks for pointing out that the error was with u and not u_abs.
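For the archives, a minimal sketch of the pattern I should have used (the variable names u, istart, and iend here are illustrative, not copied from ex_modify.c):

```c
PetscInt    istart, iend, i;
PetscScalar val;

/* Each rank sets only the entries it owns. */
ierr = VecGetOwnershipRange(u, &istart, &iend);CHKERRQ(ierr);
for (i = istart; i < iend; i++) {
  val  = (PetscScalar)i;  /* placeholder for the actual initial condition */
  ierr = VecSetValues(u, 1, &i, &val, INSERT_VALUES);CHKERRQ(ierr);
}
/* Required after VecSetValues; without these the vector stays
   unassembled and VecCopy() in the monitor aborts with
   "Object is in wrong state / Not for unassembled vector". */
ierr = VecAssemblyBegin(u);CHKERRQ(ierr);
ierr = VecAssemblyEnd(u);CHKERRQ(ierr);
```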
On Thu, Jan 17, 2019 at 1:29 PM Sajid Ali <sajidsyed2...@u.northwestern.edu> wrote:

> As requested :
>
> [sajid@xrm free_space]$ ./ex_modify
> Solving a linear TS problem on 1 processor
> m : 256, slices : 1000.000000, lambda : 1.239800e-10
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Not for unassembled vector
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: 0cd88d33dca7e1f18a10cbb6fcb08f83d068c5f4 GIT Date: 2019-01-06 13:27:26 -0600
> [0]PETSC ERROR: ./ex_modify on a named xrm by sajid Thu Jan 17 13:29:12 2019
> [0]PETSC ERROR: Configure options
> --prefix=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/petsc-develop-2u6vuwagkoczyvnpsubzrubmtmpfhhkj
> --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0
> CFLAGS= FFLAGS= CXXFLAGS=
> --with-cc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpicc
> --with-cxx=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpic++
> --with-fc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpif90
> --with-precision=double --with-scalar-type=complex
> --with-shared-libraries=1 --with-debugging=1 --with-64-bit-indices=0
> --with-debugging=%s
> --with-blaslapack-lib="/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_intel_lp64.so
> /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_sequential.so
> /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_core.so
> /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so" --with-x=1
> --with-clanguage=C --with-scalapack=0 --with-metis=1
> --with-metis-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/metis-5.1.0-nhgzn4kjskctzmzv35mstvd34nj2ugek
> --with-hdf5=1
> --with-hdf5-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/hdf5-1.10.4-ltstvsxvyjue2gxfegi4nvr6c5xg3zww
> --with-hypre=0 --with-parmetis=1
> --with-parmetis-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/parmetis-4.0.3-hw3j2ss7mjsc5x5f2gaflirnuufzptil
> --with-mumps=0 --with-trilinos=0 --with-cxx-dialect=C++11
> --with-superlu_dist-include=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/superlu-dist-develop-cpspq4ca2hnyvhx4mz7zsupbj3do6md3/include
> --with-superlu_dist-lib=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/superlu-dist-develop-cpspq4ca2hnyvhx4mz7zsupbj3do6md3/lib/libsuperlu_dist.a
> --with-superlu_dist=1 --with-suitesparse=0
> --with-zlib-include=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/zlib-1.2.11-ldu43taplg2nbkxtem346zq4ibhad64i/include
> --with-zlib-lib="-L/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/zlib-1.2.11-ldu43taplg2nbkxtem346zq4ibhad64i/lib
> -lz" --with-zlib=1
> [0]PETSC ERROR: #1 VecCopy() line 1571 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #2 Monitor() line 296 in /raid/home/sajid/packages/xwp_petsc/1d/free_space/ex_modify.c
> [0]PETSC ERROR: #3 TSMonitor() line 3929 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/ts/interface/ts.c
> [0]PETSC ERROR: #4 TSSolve() line 3843 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/ts/interface/ts.c
> [0]PETSC ERROR: #5 main() line 188 in /raid/home/sajid/packages/xwp_petsc/1d/free_space/ex_modify.c
> [0]PETSC ERROR: No PETSc Option Table entries
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=73
> :
> system msg for write_line failure : Bad file descriptor
>
> --
> Sajid Ali
> Applied Physics
> Northwestern University