Hello,

As was suggested on this list, I configured PETSc with --download-plapack:

  ./config/configure.py PETSC_ARCH=intel_R_mpich2 --download-plapack \
      --with-debugging=0 \
      --with-blas-lapack-dir=/opt/intel/mkl/10.1.1.019/lib/em64t/ \
      --with-mpi-dir=/opt/mpich2/mpich2-1.1-intel-11.1 \
      --with-gnu-compilers=0

Trying to run the PLAPACK tests on our cluster fails. I am in contact with Robert van de Geijn (the author of PLAPACK), and he thinks these issues are related to compiling the code on a 64-bit machine. So I was wondering whether anyone in the PETSc community is using PLAPACK in their projects and whether they have encountered such problems.

For example, after configuring and building PETSc I ran the test in 'externalpackages/PLAPACKR32-hg/EXAMPLES/LU' and got this:

  Fatal error in MPI_Scatterv: Other MPI error, error stack:
  MPI_Scatterv(344)..................: MPI_Scatterv(sbuf=0x10f8680, scnts=0x10f02c0, displs=0x10a8fc0, MPI_DOUBLE_COMPLEX, rbuf=0x1526770, rcount=5760, MPI_DOUBLE_COMPLEX, root=0, comm=0xc4000002) failed
  MPIR_Scatterv(133).................:
  MPIC_Recv(83)......................:
  MPIC_Wait(405).....................:
  MPIDI_CH3I_Progress(150)...........:
  MPID_nem_mpich2_blocking_recv(1074):
  MPID_nem_tcp_connpoll(1667)........:
  state_commrdy_handler(1517)........:
  MPID_nem_tcp_recv_handler(1413)....: socket closed

thanks in advance,
Hamid
