Greetings, I have a user who submits many, many jobs at once in an array. Happily, he's a very nice user and doesn't often cause trouble.
The documentation for job arrays (https://slurm.schedmd.com/job_array.html) says: "A maximum number of simultaneously running tasks from the job array may be specified using a "%" separator. For example "--array=0-15%4" will limit the number of simultaneously running tasks from this job array to 4." Awesome. That's exactly what he is doing.

A big job recently finished, and looking at my queue I notice that no one bothered to load it to the brim this weekend, leaving me with several idle compute nodes. Meanwhile, this user has quite a few jobs still waiting to run with a reason of "JobArrayTaskLimit". :-/

I've been poking at it for the last 20-30 minutes, but I'm not seeing how I, with the power of root, can raise his "self imposed" array limit. It's late, my attempts have not worked, and my google-fu isn't returning any helpful results. I'm not really concerned by it, but I would like to know in case it happens again.

How can I increase a JobArrayTaskLimit? Using the documentation example, how would I "scontrol update" the array to be "--array=0-15%6" when tasks 0-3 are already running?

Or maybe there's a way to just say "grab X number of tasks and run them anyway"? Again with the documentation example: say 0-3 are done and 4-7 are running, and I just want to manually tell 8-10 to run anyway on the available resources, leaving 11-15 under the current constraints.

Thank you!
~Stack~
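For reference, here is roughly what the user's submission looks like, following the documented example above (the script name and sizes are placeholders, not his actual job):

```shell
# Submit a 16-task job array; the %4 throttle (the documented "%"
# separator) caps it at 4 tasks running simultaneously. Tasks beyond
# the cap sit pending with reason "JobArrayTaskLimit".
sbatch --array=0-15%4 myjob.sh   # "myjob.sh" is a placeholder script
```

What I'm after is the root-side `scontrol update` incantation that raises that %4 to %6 on the already-submitted array.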