Hi Samuel,

> > > Can you have a startup script set
> > > HWLOC_XMLFILE=/common/path/${hostname}.xml in the system-wide
> > > environment?
> > 
> > That's precisely what I'd been trying, but it proved exceedingly
> > difficult to figure out what environment file is read when mpirun is
> > run directly, when torque runs the job, etc. It seems that neither
> > bashrc nor profile do the job, so I'm now exploring possibilities
> > for ssh environment variable injection... I'll keep digging, thanks
> > a lot for your help and pointers.
> 
> Perhaps running something like
> 
> mpiexec sh -c "/usr/bin/env
> HWLOC_XMLFILE=/common/path/\${hostname}.xml theapplication"
> 
> ?

Thanks for the suggestion. I tried:

mpirun --prefix /opt/openmpi-1.10.0 --hostfile node1 -np 44 sh -c
"/usr/bin/env HWLOC_XMLFILE=/etc/hwloc_\${hostname}.xml python
testmpi.py"

and:

mpirun --prefix /opt/openmpi-1.10.0 --hostfile node1 -np 44 sh -c
"/usr/bin/env HWLOC_XMLFILE=/etc/hwloc_`hostname`.xml python testmpi.py"

and neither worked. It seems to me that the environment is set up by
ssh, and the shell it spawns for a remote command is neither a login
shell nor interactive — so it reads neither /etc/profile (which only
login shells source) nor the useful parts of ~/.bashrc (which Ubuntu's
stock version skips for non-interactive shells).

I then tried setting HWLOC_XMLFILE in ~/.ssh/environment, after adding
the PermitUserEnvironment option to sshd_config, but that didn't work
either.
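In case it helps anyone following along, here is roughly what the two files
should look like (the xml path is just an example). Two details that are easy
to miss: ~/.ssh/environment takes plain NAME=value lines with no "export"
keyword, and sshd has to be restarted after the config change.

```
# /etc/ssh/sshd_config  (then: service ssh restart)
PermitUserEnvironment yes

# ~/.ssh/environment  -- plain NAME=value lines, no "export"
HWLOC_XMLFILE=/etc/hwloc_node1.xml
```

Note also that ~/.ssh/environment is static, so on a cluster with a shared
home directory every node would read the same file — it can't do per-host
substitution by itself.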

Digging further...
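One check that might save time while digging: inspect what a freshly spawned
non-interactive shell actually inherits, locally and then over ssh (node name
below is a placeholder):

```shell
# Locally: env -i wipes the inherited environment, so this shows what a
# bare non-interactive shell sees on its own -- nothing, in this case:
env -i sh -c 'echo "HWLOC_XMLFILE=${HWLOC_XMLFILE:-<unset>}"'
# prints: HWLOC_XMLFILE=<unset>

# The equivalent probe over ssh (the path mpirun takes) would be:
#   ssh node1 'echo ${HWLOC_XMLFILE:-<unset>}'
```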

For the record, this is Ubuntu 14.04, with bash as the default shell.

Thanks,
Andrej
