Hi guys,

I tried to run my code in parallel but got an error. I traced it to the
following code. I define a vector with:
    NumericVector<Number> &U11 = system.add_vector("U11");
and initialize it:
    U11.init(system.n_dofs(), system.n_local_dofs());
After the solve, I store the solution in the vector:
    U11 = *system.solution;
Later, I want to read solution values from the vector:
    a = U11(dof_indices[i]);
But I found that U11 only contains the local dofs, so when I access a dof
that is not stored locally, I get the error below.
Where am I wrong?
Thank you for your help!

Error:

 --------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: --------------------- Error Message
------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27
11:51:54 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: ./main-opt on a linux-gnu named ubuntu by cyw Sat Aug
25 18:40:36 2012
[0]PETSC ERROR: Libraries linked from
/build/buildd/petsc-3.1.dfsg/linux-gnu-c-opt/lib
[0]PETSC ERROR: Configure run at Mon Mar  7 18:34:33 2011
[0]PETSC ERROR: Configure options --with-shared --with-debugging=0
--useThreads 0 --with-clanguage=C++ --with-c-support
--with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi
--with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack
--with-blacs=1 --with-blacs-include=/usr/include
--with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]"
--with-scalapack=1 --with-scalapack-include=/usr/include
--with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1
--with-mumps-include=/usr/include
--with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]"
--with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse
--with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]"
--with-spooles=1 --with-spooles-include=/usr/include/spooles
--with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1
--with-hypre-dir=/usr --with-scotch=1
--with-scotch-include=/usr/include/scotch
--with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1
--with-hdf5-dir=/usr
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory
unknown file
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 2734 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
------------------------------
Cai Yuanwu  蔡园武
Dept. of Engineering Mechanics,
Dalian University of Technology,
Dalian 116024, China
_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users