Of course it is possible, and the very first version of the script worked
exactly like that, but my goal is not to drain all of the resources, only
the specified amount.
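For reference, a minimal sketch of that idea, based on the generator quoted below: cap the batch job at exactly the requested task count so only that amount of resources is consumed (the `job.cmd` name and default count are illustrative, and the `sbatch` step is guarded so the generator can be tried outside a SLURM cluster):

```shell
#!/bin/bash
# Generate a batch script that launches exactly $count one-task
# sruns in the background, then waits for them all to finish.
count=${1:-4}   # number of tasks; illustrative default

{
  echo '#!/bin/bash'
  for (( i = 1; i <= count; i++ )); do
    # --exclusive keeps each task step on its own allocated CPU
    echo 'srun -N1 -n1 --exclusive hostname &'
  done
  echo 'wait'   # block until every background srun has finished
} > job.cmd

# Submit only if sbatch is actually available on this machine
command -v sbatch >/dev/null && sbatch -n "$count" --exclusive job.cmd
```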

On 8 Nov, 19:32, "Mark A. Grondona" <[email protected]> wrote:
> On Tue, 8 Nov 2011 10:17:19 -0800, "Lipari, Don" <[email protected]> wrote:
>
> > I agree with Carles.  Your job script generator should look something like 
> > this, where the first argument is the number of tasks you want to run.
>
> > #!/bin/bash
>
> > declare -i count=$1
>
> > echo '#!/bin/bash' > job.cmd
>
> > for (( i=1 ; i <= $count; i++ ))
> >   do
> >   echo 'srun -N1 -n1 --exclusive hostname &' >> job.cmd
> > done
>
> > echo 'wait' >> job.cmd
>
> > sbatch -n $count --exclusive job.cmd
>
> Just curious: since each srun only runs a single task, and
> none of the tasks seem dependent on each other, can you just
> run each task as a separate batch script? That way each of
> your tasks will be a separate job in the queue, and you can
> let SLURM run as many jobs as there are resources in the
> system while the others stay queued.
>
> mark
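[A sketch of the one-job-per-task approach mark describes above; the `task.cmd` name and default count are illustrative, and the `sbatch` call is guarded so the script can be tried without SLURM installed:]

```shell
#!/bin/bash
# Write a one-task batch script and submit it once per task.
# SLURM then runs as many of these jobs as resources allow
# and keeps the rest queued.
count=${1:-4}   # number of independent jobs; illustrative default

cat > task.cmd <<'EOF'
#!/bin/bash
srun -N1 -n1 hostname
EOF

for (( i = 1; i <= count; i++ )); do
  # Submit only if sbatch exists on this machine
  command -v sbatch >/dev/null && sbatch -n1 task.cmd
done
```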
>
> > From: [email protected] 
> > [mailto:[email protected]] On Behalf Of Carles Fenoy
> > Sent: Tuesday, November 08, 2011 9:19 AM
> > To: [email protected]
> > Subject: Re: [slurm-dev] Re: Running multiple tasks from a single batch 
> > script
>
> > Hi Chris,
>
> > try adding the --exclusive to the srun command. I've solved it this way
>
> > Carles fenoy
> > On Tue, Nov 8, 2011 at 3:24 PM, Chris Rataj 
> > <[email protected]<mailto:[email protected]>> wrote:
> > You won't be able to run this, since the script uses SCHROEDINGER
> > module, but I'm sending it anyway:
>
> >http://hotfile.com/dl/134508063/ad24a08/Multi-syfer_copy.sh.html
>
