Gerard Gorman emailed the following on 21/02/12 16:22:
> Hi
>
> I would like to harvest a selection of typical use cases for
> benchmarking PETSc/OpenMP. Ideally they would have features such as
> preallocating matrices rather than adding one at a time, be easy to
> configure for different problem sizes etc. Has anyone already put
> together such a list - what would be considered good/best practice here?
>
> Cheers
> Gerard

Further to this - do I recall someone saying they had unpushed bug fixes to the test cases? For example, I've tried:
ggorman at cynic:~/projects/petsc/petsc-dev/src/ksp/ksp/examples/tests$ ./ex10 -m 4
m = 2, N=375
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Argument out of range!
[0]PETSC ERROR: New nonzero at (6,119) caused a malloc!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development HG revision: c7afeaa80089c71502abd0b0aca07e7ea2f3d431  HG Date: Tue Feb 21 09:48:53 2012 -0600
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./ex10 on a arch-linu named cynic by ggorman Tue Feb 21 16:28:33 2012
[0]PETSC ERROR: Libraries linked from /home/ggorman/projects/petsc/petsc-dev/arch-linux2-c-debug/lib
[0]PETSC ERROR: Configure run at Tue Feb 21 16:23:30 2012
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatSetValues_SeqAIJ() line 331 in /home/ggorman/projects/petsc/petsc-dev/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: MatSetValues() line 1142 in /home/ggorman/projects/petsc/petsc-dev/src/mat/interface/matrix.c
[0]PETSC ERROR: AddElement() line 198 in src/ksp/ksp/examples/tests/ex10.c
[0]PETSC ERROR: GetElasticityMatrix() line 130 in src/ksp/ksp/examples/tests/ex10.c
[0]PETSC ERROR: main() line 41 in src/ksp/ksp/examples/tests/ex10.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Pretty much the only value for the problem size (i.e. the -m option) that works is the default.

Cheers
Gerard
