Hi, I am trying to configure my PETSc install with an MPI installation to make use of a dual quad-core desktop system running Ubuntu. But even though the configure/make process went through without problems, the scalability of the programs doesn't seem to match what I expected. My configure options are
--download-f-blas-lapack=1 --with-mpi-dir=/usr/lib/ --download-mpich=1 --with-mpi-shared=0 --with-shared=0 --COPTFLAGS=-g --download-parmetis=1 --download-superlu_dist=1 --download-hypre=1 --download-blacs=1 --download-scalapack=1 --with-clanguage=C++ --download-plapack=1 --download-mumps=1 --download-umfpack=yes --with-debugging=1 --with-errorchecking=yes

Is there something else that needs to be done as part of the configure process to enable decent scaling? I am only comparing programs run with mpiexec -n 1 and mpiexec -n 2, but they seem to take approximately the same time, as noted from -log_summary. If it helps, I have been testing with snes/examples/tutorials/ex20.c for all purposes, with a custom -grid parameter on the command line to control the number of unknowns; the exact runs I am comparing are shown below.
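Roughly, the two runs look like this (the binary name and the grid size here are only placeholders; -grid is the option I added to ex20, and -log_summary is where I read the timings from):

# single-process run with PETSc's performance summary
mpiexec -n 1 ./ex20 -grid 20 -log_summary
# two-process run on the same problem size
mpiexec -n 2 ./ex20 -grid 20 -log_summary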
If this is something you've witnessed before with this configuration, or if you need anything else to analyze the problem, do let me know.

Thanks,
Vijay