Hi,

This is it:
> slax@master$ ssh  node1 'echo $PATH'
> 
> gives me the reduced path on the slave node.

I'm sorry, I was wrong. You typed it correctly. AFAIK, this command logs
in to your node, but the PATH variable stays just as it is on your
master. I had this issue and solved it by editing the .bashrc file on
the master, NOT on the node. That worked for me. Try editing PATH and
LD_LIBRARY_PATH on the master, i.e. on the computer where you run the
mpirun command.

So, for example, if the nodes have the MPI installation in /openMPI/,
with subfolders "bin" and "lib", try putting these lines into your
.bashrc file on the master:
export PATH=$PATH:/openMPI/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/openMPI/lib
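
A quick sanity check, assuming the installation really does live under
/openMPI on the nodes (that path is only my example): run this from the
master and see what the non-interactive shell on the node finds:
ssh node1 'echo $PATH; ls /openMPI/bin/mpirun /openMPI/lib'

If mpirun and the libraries show up there, the files on the node side
are in place and only the environment settings remain to sort out.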

It shouldn't matter where your MPI installation is on the master. The
paths on the nodes are what matter!
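
Another option that may help (just a sketch - the prefix, hostfile name
and program name are placeholders for whatever you actually use):
mpirun can be told explicitly where Open MPI lives on the remote nodes
with its --prefix option, so it does not have to rely on the
non-interactive PATH at all:
mpirun --prefix /openMPI -np 4 --hostfile myhosts ./myApplication

As far as I know, calling mpirun by its full path
(/openMPI/bin/mpirun ...) has the same effect as giving --prefix.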

Note: I am an Open MPI beginner, I am not involved in its development,
I'm just sharing my experience with the same problem and how I solved
it. No guarantees...

Dr. Eddy

Tomislav Maric wrote on Sun 02. 08. 2009 at 16:09 +0200:
> Dominik Táborský wrote:
> > Hi Tomislav,
> > 
> > I also had this issue. When you try to trace it, you'll find out that
> > when you manually connect to a machine and immediately execute a
> > command, it will inherit your environment, not the environment of the
> > node. See:
> > $ ssh node1 && echo $PATH
> > 
> > This will echo the PATH on your computer, the master one, not the node.
> > But if you do this:
> > $ ssh node1
> > node1$ echo $PATH
> > 
> > it will echo the PATH on your node.
> 
> I've tried it:
> 
> $ ssh node1 && echo $PATH
> 
> at first does nothing, leaving me logged in on the node, but when I exit,
> it writes out the $PATH on the master node.
> 
> ssh node1
> 
> slax@node1$ echo $PATH
> 
> gives me the path on the slave node1
> 
> and
> 
> slax@master$ ssh  node1 'echo $PATH'
> 
> gives me the reduced path on the slave node. I think that the problem is
> exactly the same as in the last line - when I execute a bash script, it
> is invoked in a non-interactive mode (login mode, because of the ssh),
> and maybe some other config file is read instead of .bash_profile or
> .bashrc? This reduced PATH and LD_LIBRARY_PATH make it hard for mpirun
> to find the right libraries and binaries.
> 
> > The solution to this is to add the path to the executables and the
> > path to the libraries to the variables you have set on your own
> > computer, the master.
> 
> The master computer already has everything set, because the Live DVD is
> configured properly (I ran a test case on a dual core - mpirun runs fine
> locally). I'm not sure I understand, could you please explain a bit more
> - this is all new to me.
> 
> Thank you very much for your advice and time!
> 
> Best regards,
> 
> Tomislav
> 
> 
> 
> > 
> > Let me know how that works for you!
> > 
> > Dr. Eddy
> > 
> > 
> > Tomislav Maric wrote on Sun 02. 08. 2009 at 13:09 +0200:
> >> Prasadcse Perera wrote:
> >>> Hi,
> >>> One workaround is that you can define PATH and LD_LIBRARY_PATH in your
> >>> common .bashrc and have matching installation paths on the two nodes.
> >>> This works nicely for me with my three-node installation :).
> >>>
> >> Thank you very much for the advice. Actually I'm running OpenFOAM (read:
> >> a program parallelized to run with Open MPI) from a SLAX Live DVD, so the
> >> installation paths are identical, as well as everything else.
> >>
> >> I've added commands that set environment variables in .bashrc on both
> >> nodes, but you mention a "common .bashrc". Common in what way? I'm
> >> sorry for the newbie question, again, I'm supposed to be a Mechanical
> >> Engineer. :))))
> >>
> >> The OpenFOAM toolkit carries a separate directory for third-party
> >> support software. In this directory there are programs for
> >> postprocessing simulation results and analyzing data, as well as Open
> >> MPI. Therefore, in my case, Open MPI is built in a separate directory
> >> and the build is automated.
> >>
> >> After the build of both programs, there is a special bashrc located in
> >>
> >> some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/
> >>
> >> that sets all the variables needed to use OpenFOAM, such as
> >> FOAM_TUTORIALS (where the tutorials are), FOAM_RUN (where the working
> >> dir is), WM_COMPILER (which compiler to use), etc. This bashrc also
> >> sets LD_LIBRARY_PATH and PATH so that the locally installed Open MPI
> >> can be found.
> >>
> >> I've tried this installation on the Live DVD on my laptop with two
> >> cores, decomposed the case and ran the simulation in parallel without a
> >> problem.
> >>
> >> I hope this information is more helpful.
> >>
> >> Best regards,
> >> Tomislav
> >>
> > 
> 
