On Fri, 21 Oct 2016, Barry Smith wrote:

> > On Oct 21, 2016, at 5:16 PM, Satish Balay <ba...@mcs.anl.gov> wrote:
> >
> > The issue with this test code is using MatLoad() twice [with the
> > same object, without destroying it in between]. Not sure if that's
> > supposed to work..
>
>    If the file has two matrices in it, then yes, a second call to MatLoad()
> with the same matrix should just load in the second matrix from the file
> correctly. Perhaps we need a test in our test suite just to make sure that
> works.
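[Editor's note: the reuse pattern Barry describes could be exercised by a test-suite sketch along these lines. This is untested and hypothetical — it assumes a PETSc build and a binary file containing two matrices passed via `-f`; it uses only API calls already present in the thread's ex16.c.]

```c
static char help[] = "Loads two matrices from one binary file, reusing the Mat.\n\n";

#include <petscksp.h>

int main(int argc, char **args)
{
  Mat            A;
  PetscViewer    fd;
  char           file[PETSC_MAX_PATH_LEN];
  PetscErrorCode ierr;
  PetscBool      flg;

  PetscInitialize(&argc, &args, (char *)0, help);
  ierr = PetscOptionsGetString(NULL, NULL, "-f", file, PETSC_MAX_PATH_LEN, &flg);CHKERRQ(ierr);
  if (!flg) SETERRQ(PETSC_COMM_WORLD, 1, "Must indicate binary file with the -f option");

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, file, FILE_MODE_READ, &fd);CHKERRQ(ierr);

  /* First load: fresh Mat object */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatLoad(A, fd);CHKERRQ(ierr);
  ierr = MatView(A, 0);CHKERRQ(ierr);

  /* Second load: reuse the SAME Mat object to read the next matrix
     from the same viewer -- the path that should work per Barry */
  ierr = MatLoad(A, fd);CHKERRQ(ierr);
  ierr = MatView(A, 0);CHKERRQ(ierr);

  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

Until the reuse path is confirmed to work, calling MatDestroy() and MatCreate() between the two MatLoad() calls should sidestep the crash reported below.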
This test code crashes with:

MatLoad()
MatView()
MatLoad()
MatView()

Satish

--------

balay@asterix /home/balay/download-pine/x/superlu_dist_test
$ cat ex16.c
static char help[] = "Reads matrix and debug solver\n\n";

#include <petscksp.h>

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **args)
{
  Mat            A;
  PetscViewer    fd;                        /* viewer */
  char           file[PETSC_MAX_PATH_LEN]; /* input file name */
  PetscErrorCode ierr;
  PetscBool      flg;

  PetscInitialize(&argc,&args,(char*)0,help);
  ierr = PetscOptionsGetString(NULL,NULL,"-f",file,PETSC_MAX_PATH_LEN,&flg);CHKERRQ(ierr);
  if (!flg) SETERRQ(PETSC_COMM_WORLD,1,"Must indicate binary file with the -f option");

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);

  ierr = PetscPrintf(PETSC_COMM_WORLD, "First MatLoad! \n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatLoad(A,fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
  ierr = MatView(A,0);CHKERRQ(ierr);

  ierr = PetscPrintf(PETSC_COMM_WORLD, "Second MatLoad! \n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatLoad(A,fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
  ierr = MatView(A,0);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

balay@asterix /home/balay/download-pine/x/superlu_dist_test
$ make ex16
mpicc -o ex16.o -c -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -I/home/balay/petsc/include -I/home/balay/petsc/arch-idx64-slu/include `pwd`/ex16.c
mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -o ex16 ex16.o -Wl,-rpath,/home/balay/petsc/arch-idx64-slu/lib -L/home/balay/petsc/arch-idx64-slu/lib -lpetsc -Wl,-rpath,/home/balay/petsc/arch-idx64-slu/lib -lsuperlu_dist -llapack -lblas -lparmetis -lmetis -lX11 -lpthread -lm -Wl,-rpath,/home/balay/soft/mpich-3.1.4/lib -L/home/balay/soft/mpich-3.1.4/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/6.2.1 -L/usr/lib/gcc/x86_64-redhat-linux/6.2.1 -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/balay/soft/mpich-3.1.4/lib -L/home/balay/soft/mpich-3.1.4/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/6.2.1 -L/usr/lib/gcc/x86_64-redhat-linux/6.2.1 -ldl -Wl,-rpath,/home/balay/soft/mpich-3.1.4/lib -lmpi -lgcc_s -ldl
/usr/bin/rm -f ex16.o

balay@asterix /home/balay/download-pine/x/superlu_dist_test
$ mpiexec -n 2 ./ex16 -f ~/datafiles/matrices/small
First MatLoad!
Mat Object: 2 MPI processes
  type: mpiaij
row 0: (0, 4.) (1, -1.) (6, -1.)
row 1: (0, -1.) (1, 4.) (2, -1.) (7, -1.)
row 2: (1, -1.) (2, 4.) (3, -1.) (8, -1.)
row 3: (2, -1.) (3, 4.) (4, -1.) (9, -1.)
row 4: (3, -1.) (4, 4.) (5, -1.) (10, -1.)
row 5: (4, -1.) (5, 4.) (11, -1.)
row 6: (0, -1.) (6, 4.) (7, -1.) (12, -1.)
row 7: (1, -1.) (6, -1.) (7, 4.) (8, -1.) (13, -1.)
row 8: (2, -1.) (7, -1.) (8, 4.) (9, -1.) (14, -1.)
row 9: (3, -1.) (8, -1.) (9, 4.) (10, -1.) (15, -1.)
row 10: (4, -1.) (9, -1.) (10, 4.) (11, -1.) (16, -1.)
row 11: (5, -1.) (10, -1.) (11, 4.) (17, -1.)
row 12: (6, -1.) (12, 4.) (13, -1.) (18, -1.)
row 13: (7, -1.) (12, -1.) (13, 4.) (14, -1.) (19, -1.)
row 14: (8, -1.) (13, -1.) (14, 4.) (15, -1.) (20, -1.)
row 15: (9, -1.) (14, -1.) (15, 4.) (16, -1.) (21, -1.)
row 16: (10, -1.) (15, -1.) (16, 4.) (17, -1.) (22, -1.)
row 17: (11, -1.) (16, -1.) (17, 4.) (23, -1.)
row 18: (12, -1.) (18, 4.) (19, -1.) (24, -1.)
row 19: (13, -1.) (18, -1.) (19, 4.) (20, -1.) (25, -1.)
row 20: (14, -1.) (19, -1.) (20, 4.) (21, -1.) (26, -1.)
row 21: (15, -1.) (20, -1.) (21, 4.) (22, -1.) (27, -1.)
row 22: (16, -1.) (21, -1.) (22, 4.) (23, -1.) (28, -1.)
row 23: (17, -1.) (22, -1.) (23, 4.) (29, -1.)
row 24: (18, -1.) (24, 4.) (25, -1.) (30, -1.)
row 25: (19, -1.) (24, -1.) (25, 4.) (26, -1.) (31, -1.)
row 26: (20, -1.) (25, -1.) (26, 4.) (27, -1.) (32, -1.)
row 27: (21, -1.) (26, -1.) (27, 4.) (28, -1.) (33, -1.)
row 28: (22, -1.) (27, -1.) (28, 4.) (29, -1.) (34, -1.)
row 29: (23, -1.) (28, -1.) (29, 4.) (35, -1.)
row 30: (24, -1.) (30, 4.) (31, -1.)
row 31: (25, -1.) (30, -1.) (31, 4.) (32, -1.)
row 32: (26, -1.) (31, -1.) (32, 4.) (33, -1.)
row 33: (27, -1.) (32, -1.) (33, 4.) (34, -1.)
row 34: (28, -1.) (33, -1.) (34, 4.) (35, -1.)
row 35: (29, -1.) (34, -1.) (35, 4.)
Second MatLoad!
Mat Object: 2 MPI processes
  type: mpiaij
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Column too large: col 32628 max 35
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.4-1729-g4c4de23  GIT Date: 2016-10-20 22:22:58 +0000
[0]PETSC ERROR: ./ex16 on a arch-idx64-slu named asterix by balay Fri Oct 21 18:31:45 2016
[0]PETSC ERROR: Configure options --download-metis --download-parmetis --download-superlu_dist PETSC_ARCH=arch-idx64-slu
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 585 in /home/balay/petsc/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #2 MatSetValues() line 1278 in /home/balay/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 MatView_MPIAIJ_ASCIIorDraworSocket() line 1404 in /home/balay/petsc/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #4 MatView_MPIAIJ() line 1440 in /home/balay/petsc/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #5 MatView() line 989 in /home/balay/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #6 main() line 30 in /home/balay/download-pine/x/superlu_dist_test/ex16.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -display :0.0
[0]PETSC ERROR: -f /home/balay/datafiles/matrices/small
[0]PETSC ERROR: -malloc_dump
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 4434 RUNNING AT asterix
=   EXIT CODE: 63
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================

balay@asterix /home/balay/download-pine/x/superlu_dist_test
$