You need to run in the debugger; one of your arguments to the function call is not right for some reason.
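For example, something like this (the launcher and executable names are just what appear in your output; adjust them to whatever you actually use):

    mpiexec -n 2 PAtreju.exe -start_in_debugger

or pass -on_error_attach_debugger instead, so a debugger is attached only when the error occurs. A sketch of the whole scatter sequence, with the likely culprits fixed, follows your quoted message at the bottom.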
Barry

On Oct 28, 2011, at 5:24 AM, PEREZ CERQUERA MANUEL RICARDO wrote:

> Hi everybody,
>
> I would like to know how to add values from a sequential PETSc vector
> on each process into a parallel vector. I'm doing this:
>
> PetscScatter ctx
> PetscInt NOfTotalBEMFunctions
> idx = (/(i, i=0, NOfTotalBEMFunctions-1)/)
>
> I create the vectors LocalZNearNOfNonZeros and GlobalZNearNOfNonZeros
> with VecCreateSeq(...) and VecCreateMPI(...) respectively.
>
> CALL ISCreateGeneral(PETSC_COMM_SELF,NOfTotalBEMFunctions,idx,from,ierr);
> CALL ISCreateGeneral(PETSC_COMM_WORLD,NOfTotalBEMFunctions,idx,towards,ierr);
> CALL VecScatterCreate(LocalZNearNOfNonZeros,from,GlobalZNearNOfNonZeros,towards,ctx,ierr);
> CALL VecScatterBegin(ctx,LocalZNearNOfNonZeros,GlobalZNearNOfNonZeros,ADD_VALUES,SCATTER_FORWARD,ierr)
> CALL VecScatterEnd(ctx,LocalZNearNOfNonZeros,GlobalZNearNOfNonZeros,ADD_VALUES,SCATTER_FORWARD,ierr)
> CALL VecScatterDestroy(ctx,ierr);
>
> So when I run on two processes, it crashes in ISCreateGeneral and I
> get this error:
>
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> [1]PETSC ERROR:       is given.
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Signal received!
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.2.0, Patch 2, Fri Sep 16 10:10:45 CDT 2011
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: C:\Documents and Settings\d022117\Desktop\MPIRunsPatrju\PAtreju.exe on a arch-mswi named GVSRV by d022117 Fri Oct 28 12:18:28 2011
> [1]PETSC ERROR: Libraries linked from /home/d022117/petsc-3.2-p2/arch-mswin-cxx-debug/lib
> [1]PETSC ERROR: Configure run at Fri Sep 30 18:13:15 2011
> [1]PETSC ERROR: Configure options --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-cxx="win32fe cl" --download-f-blas-lapack=1 --with-scalar-type=complex --with-clanguage=cxx --useThreads=0
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 2, Fri Sep 16 10:10:45 CDT 2011
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: C:\Documents and Settings\d022117\Desktop\MPIRunsPatrju\PAtreju.exe on a arch-mswi named GVSRV by d022117 Fri Oct 28 12:18:28 2011
> [0]PETSC ERROR: Libraries linked from /home/d022117/petsc-3.2-p2/arch-mswin-cxx-debug/lib
> [0]PETSC ERROR: Configure run at Fri Sep 30 18:13:15 2011
> [0]PETSC ERROR: Configure options --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-cxx="win32fe cl" --download-f-blas-lapack=1 --with-scalar-type=complex --with-clanguage=cxx --useThreads=0
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
> job aborted:
> rank: node: exit code[: error message]
> 0: gvsrv.delen.polito.it: 59: process 0 exited without calling finalize
> 1: gvsrv.delen.polito.it: 59: process 1 exited without calling finalize
>
> I don't know how to solve it, and I would like to know whether I'm
> really doing the gather operation correctly.
>
> Thanks, Manuel.
>
> Eng. Manuel Ricardo Perez Cerquera. MSc. Ph.D student
> Antenna and EMC Lab (LACE)
> Istituto Superiore Mario Boella (ISMB)
> Politecnico di Torino
> Via Pier Carlo Boggio 61, Torino 10138, Italy
> Email: manuel.perezcerquera at polito.it
> Phone: +39 0112276704
> Fax: +39 011 2276 299
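For reference, here is a minimal sketch (untested, free-form Fortran, reusing the names from your message) of the whole sequence. Three things in your fragment look suspect and are changed here: ctx must be declared VecScatter (there is no PetscScatter type); idx must be a PetscInt array with storage allocated before the implied-do fills it; and in PETSc 3.2 ISCreateGeneral() takes a PetscCopyMode argument (e.g. PETSC_COPY_VALUES), which your calls omit; a missing argument in a Fortran call is exactly the kind of thing that segfaults inside ISCreateGeneral().

    #include "finclude/petscsys.h"
    #include "finclude/petscvec.h"
    #include "finclude/petscis.h"

          Vec            LocalZNearNOfNonZeros, GlobalZNearNOfNonZeros
          VecScatter     ctx            ! VecScatter, not PetscScatter
          IS             from, towards
          PetscInt       i, NOfTotalBEMFunctions
          PetscInt, allocatable :: idx(:)
          PetscErrorCode ierr

          ! assumes PetscInitialize() has been called and
          ! NOfTotalBEMFunctions holds the global length on every process

          call VecCreateSeq(PETSC_COMM_SELF,NOfTotalBEMFunctions,LocalZNearNOfNonZeros,ierr)
          call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,NOfTotalBEMFunctions,GlobalZNearNOfNonZeros,ierr)

          ! ... fill LocalZNearNOfNonZeros on each process ...

          ! idx needs storage before the implied-do fills it
          allocate(idx(NOfTotalBEMFunctions))
          idx = (/ (i, i = 0, NOfTotalBEMFunctions-1) /)   ! 0-based indices

          ! PETSc 3.2: ISCreateGeneral() requires a PetscCopyMode argument
          call ISCreateGeneral(PETSC_COMM_SELF,NOfTotalBEMFunctions,idx,PETSC_COPY_VALUES,from,ierr)
          call ISCreateGeneral(PETSC_COMM_WORLD,NOfTotalBEMFunctions,idx,PETSC_COPY_VALUES,towards,ierr)

          ! scatter each process's sequential copy into the MPI vector,
          ! summing the contributions from all processes
          call VecScatterCreate(LocalZNearNOfNonZeros,from,GlobalZNearNOfNonZeros,towards,ctx,ierr)
          call VecScatterBegin(ctx,LocalZNearNOfNonZeros,GlobalZNearNOfNonZeros,ADD_VALUES,SCATTER_FORWARD,ierr)
          call VecScatterEnd(ctx,LocalZNearNOfNonZeros,GlobalZNearNOfNonZeros,ADD_VALUES,SCATTER_FORWARD,ierr)

          call VecScatterDestroy(ctx,ierr)
          call ISDestroy(from,ierr)
          call ISDestroy(towards,ierr)
          deallocate(idx)

If it still crashes after those changes, running under the debugger as above will show which argument ISCreateGeneral() is unhappy with.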
