Hi, I think you should set the environment variable LD_LIBRARY_PATH on your system.
Something like this (in your .bashrc or .profile):

    LD_LIBRARY_PATH=$LD_LIBRARY_PATH:path_to_libmpichcxx
    export LD_LIBRARY_PATH

or, if you use the tcsh shell, in your .tcshrc:

    setenv LD_PATH $LD_LIBRARY_PATH
    setenv LD_LIBRARY_PATH $LD_PATH:path_to_libmpichcxx

where "path_to_libmpichcxx" is something like
/ima/home/hsun/petsc/externalpackages/mpich2-1.0.8/lib

Thomas

-------------------------------------------------

On Mon, 30 Nov 2009, Huan Sun wrote:

> Hi all,
>
> Thanks for the help.
>
> Somehow I ran into a new problem -- I am new to the MPI area.
> I enabled the MPI download when configuring PETSc, following the
> instructions in the FAQ on the deal.II webpage. Everything worked
> fine, and I configured deal.II's compilers as Toby and Thomas
> suggested:
>
> ./configure --disable-threads --enable-shared
>   CC=/ima/home/hsun/petsc/externalpackages/mpich2-1.0.8/bin/mpicc
>   CXX=/ima/home/hsun/petsc/externalpackages/mpich2-1.0.8/bin/mpicxx
>
> but deal.II complained that it cannot find -lmpichcxx:
>
> ======================optimized=========
> Linking library: libpetscall.so
> ======================debug=============
> Linking library: libpetscall.g.so
> /usr/bin/ld: cannot find -lmpichcxx
> collect2: ld returned 1 exit status
> make[1]: *** [libpetscall.so] Error 1
> make[1]: *** Waiting for unfinished jobs....
> /usr/bin/ld: cannot find -lmpichcxx
> collect2: ld returned 1 exit status
> make[1]: *** [libpetscall.g.so] Error 1
>
> I wonder if there are any more options I need to specify when
> configuring deal.II.
>
> Thanks a lot!
>
> Huan
>
> On Sun, 2009-11-29 at 14:55 +0100, Toby D. Young wrote:
>
> > > [0]PETSC ERROR: Configure run at Fri Nov 27 23:13:43 2009
> > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran
> > >   --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1
> > >   --with-shared=1 --with-dynamic=1 --with-clanguage=C++ --with-x=0
> >
> > In any case, to run step-17 on multiple processors (which I think you
> > are trying to do here), you will need to compile with MPI. This means
> > configuring PETSc with:
> >
> >   --with-mpi=1 --with-mpi-dir=path/to/mpi
> >
> > and configuring deal.II with the same compilers by:
> >
> >   CC=mpicc CXX=mpicxx
> >
> > As Thomas pointed out, there is some weirdness using the DataOut
> > class with PETSc when threading is enabled. Try configuring deal.II
> > with:
> >
> >   --disable-threads
> >
> > to be sure to get step-17 to run. Step-18 does not use DataOut in the
> > same way and seems to somehow avoid these problems.
> >
> > There are other hints in the deal.II FAQ which can help you get
> > things working the way you really want them to. Maybe that helps?
> >
> > Best,
> > Toby
> >
> > -----
> >
> > Toby D. Young
> > Assistant Professor
> > Philosophy-Physics
> > Polish Academy of Sciences
> > Warszawa, Polska
> >
> > www: http://www.ippt.gov.pl/~tyoung
> > skype: stenografia

_______________________________________________
dealii mailing list
http://poisson.dealii.org/mailman/listinfo/dealii
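[As a quick sanity check of the advice above, a short bash sketch; the MPICH path is the one from Huan's PETSc tree, so adjust it to your own installation. The echo messages are illustrative, not from the thread.]

```shell
# Directory from the PETSc --download-mpich build; adjust to your installation.
MPICH_LIB=/ima/home/hsun/petsc/externalpackages/mpich2-1.0.8/lib

# Confirm the library the linker complained about is actually there.
if ls "$MPICH_LIB"/libmpichcxx.* >/dev/null 2>&1; then
    echo "found libmpichcxx in $MPICH_LIB"
else
    echo "libmpichcxx NOT found in $MPICH_LIB -- check the PETSc build"
fi

# Append the directory to the search path, handling the case where
# LD_LIBRARY_PATH is initially unset (avoids a leading ":").
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$MPICH_LIB"
echo "$LD_LIBRARY_PATH"
```

Putting the export line in .bashrc or .profile makes it persistent, as Thomas suggests; the `${VAR:+...}` expansion just avoids inserting a stray colon when the variable starts out empty.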
