Dear all,

Does Galaxy have any mechanisms to set the priority of jobs
submitted to the cluster? Specifically I am interested in SGE
via DRMAA, but this is a general issue. If there is some existing
code, I might be able to use it for the following situation:

The specific motivation is for balancing competing BLAST
searches from multiple users: I want to be able to prioritize
short jobs (e.g. under 100 queries) over large jobs (tens of
thousands of queries).

In the (experimental) task splitting code, I would like to be
able to give short jobs higher priority (e.g. normal) than large
jobs (e.g. low priority) based on the size of the split file.

One idea would be scaling based on the (split) input file's
size (in bytes), or perhaps a per-file-format size threshold,
e.g. when splitting FASTA files, more than 1000 sequences might
trigger lower priority.
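As a rough sketch of the threshold idea (none of this is existing Galaxy code; the helper names, the 1000-sequence default, and the specific SGE priority value are all assumptions for illustration), a split FASTA file could be sized by counting its header lines, and large splits could be handed to SGE with a lower priority via the DRMAA native specification:

```python
def count_fasta_sequences(path):
    """Count sequences in a FASTA file by counting '>' header lines."""
    count = 0
    with open(path) as handle:
        for line in handle:
            if line.startswith(">"):
                count += 1
    return count


def native_spec_for_split(path, threshold=1000):
    """Return an SGE native specification string for a DRMAA job template.

    Hypothetical policy: splits with more than `threshold` query
    sequences get a lower priority via SGE's -p flag (the default
    priority is 0; negative values are lower). Short jobs keep the
    default by returning an empty specification.
    """
    if count_fasta_sequences(path) > threshold:
        return "-p -100"  # lower priority for big jobs
    return ""  # default priority for short jobs
```

The returned string would then be set as the DRMAA job template's native specification before submission.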

The advantage of this kind of assessment is that it would work
with both of the current split mechanisms, "number_of_parts" and
"to_size", as well as any hybrid of the two:

<parallelism method="multi" split_mode="number_of_parts" split_size="8" ... />
<parallelism method="multi" split_mode="to_size" split_size="1000" ... />
http://lists.bx.psu.edu/pipermail/galaxy-dev/2012-May/009647.html
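To illustrate why the assessment covers both modes (again just a hedged sketch, with invented function names and a placeholder 1000-sequence threshold): either split mode yields an upper bound on sequences per part, which is what a threshold would be compared against.

```python
def sequences_per_part(total_sequences, split_mode, split_size):
    """Upper bound on query sequences per split, for either split mode."""
    if split_mode == "number_of_parts":
        # split_size is the number of parts; divide the queries up
        # (ceiling division, so the largest part is counted)
        return -(-total_sequences // split_size)
    elif split_mode == "to_size":
        # split_size is already the (maximum) sequences per part
        return split_size
    raise ValueError("unknown split_mode: %r" % split_mode)


def is_low_priority(total_sequences, split_mode, split_size, threshold=1000):
    """Hypothetical policy: demote splits larger than the threshold."""
    return sequences_per_part(total_sequences, split_mode, split_size) > threshold
```

For example, 40000 queries split into 8 parts gives 5000 sequences per part and would be demoted, while a "to_size" split of 1000 would not.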

So, is there any existing code in Galaxy that would be helpful
here - or existing plans in this area?

Thanks,

Peter
