If you're going to use this method, you might want to consider a second stanza created specifically for PBS.

if ( $?PBS_ENVIRONMENT ) then
  # do useful stuff
endif

The difference is that you will still get your useful environment when you run PBS jobs interactively.
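A minimal sketch of what a .tcshrc with both stanzas might look like. PBS sets PBS_ENVIRONMENT inside jobs (to PBS_BATCH or PBS_INTERACTIVE), so the second test fires in both cases; the setenv line is a placeholder for whatever your real settings are:

```shell
# ~/.tcshrc -- sketch, tcsh syntax; USEFUL_VAR is a placeholder

# Stanza for ordinary interactive logins
if ( $?prompt ) then
    setenv USEFUL_VAR value    # your real settings go here
endif

# Second stanza: also runs inside PBS jobs, batch or interactive
if ( $?PBS_ENVIRONMENT ) then
    setenv USEFUL_VAR value    # same settings, picked up by PBS-spawned shells
endif
```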

        Jeremy


At 11:17 AM 1/10/2003 -0500, Brian Williams wrote:
Turns out that part of my .tcshrc has a set of commands that are basically

if ( $?prompt ) then
  # set useful stuff
endif

So I think that if I move the 'useful stuff' block out to the rest of the
.tcshrc, those variables will get created when the shell is started to
run my scripts.
Or, I can just copy/paste those variable commands into the PBS scripts, which I
think will be easier to maintain anyway. Eventually I'm going to write a Perl
script to interactively generate these scripts for me and execute them.
Thanks for your help,
Brian
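The copy/paste approach would look something like this sketch of a qsub script; the job name, resource request, variable, and program name are all hypothetical stand-ins:

```shell
#!/bin/tcsh
#PBS -N myjob
#PBS -l nodes=1

# Variable commands copied in from .tcshrc (placeholders)
setenv USEFUL_VAR value

# PBS starts the job in $HOME; change to the submission directory
cd $PBS_O_WORKDIR
./my_program
```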

-----Original Message-----
From: Jeremy Enos [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 09, 2003 7:19 PM
To: Brian Williams
Cc: [EMAIL PROTECTED]
Subject: RE: [Oscar-users] Running a single job on multiple machines


Yes.. it starts new shells on each machine, and doesn't necessarily take
all the env vars with it.  I suggest setting them in the script you
feed to qsub, although that is also only run on a single compute node.  If
you want the environment propagated to every node in your job, it gets worse:
you have to make a script which is called by your parallel launcher
(pbsdsh, mpirun, or whatever you choose), which is called by your qsub
target script, which is called by qsub.  Hope you followed that; it's ugly,
but that's the reality.
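The chain described above might be sketched as the following pair of scripts. The filenames, variable, and program are hypothetical; pbsdsh runs its argument once per allocated node, and each wrapper invocation re-creates the environment locally:

```shell
#!/bin/tcsh
# job.csh -- submitted with: qsub job.csh
#PBS -l nodes=4

# Only set on the node where the qsub target script runs
setenv USEFUL_VAR value

# Launch the wrapper on every node of the job (full path required)
pbsdsh $PBS_O_WORKDIR/wrapper.csh

# --- wrapper.csh (separate file) ---
#!/bin/tcsh
# Runs on each node; re-creates the environment there, then runs the program
setenv USEFUL_VAR value
cd $PBS_O_WORKDIR
./my_program
```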

         Jeremy
