I need this for something I'm implementing at the moment, and the way I'm
thinking about it is to make a tool that uses the API to call a workflow.
There are a few problems, though; correct me if I'm wrong:
1) In order to make an input history item available to the called
workflow, the tool needs to somehow know about history items, but the
tool XML passes parameters in as data files. This could probably be
remedied by adding a type="history_item" option to <param> that
would provide the id associated with the history item. In the interim,
just to test things, I'm passing the parameter in as a
"history:history_item" string (yeah, I know, ugly!).
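To make the interim workaround concrete, here is a minimal sketch of how the tool could split that string back into its two ids; the parameter format and the function name are my own invention, not anything Galaxy provides:

```python
def parse_history_param(value):
    """Split an (admittedly ugly) '<history_id>:<history_item_id>' string
    into its two parts, failing loudly on malformed input."""
    history_id, sep, item_id = value.partition(":")
    if not sep or not history_id or not item_id:
        raise ValueError("expected '<history_id>:<history_item_id>', got %r" % value)
    return history_id, item_id
```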
2) My particular tool needs to take a history item, split it into
partitions, and call a workflow with each of those partitions. For this
to work, each partition needs to be uploaded as a new history item, but
that is currently not possible. The other possibility is to create a
tool that does the split, put it in a single-tool workflow (because
workflows can be called from the API in such a way that their output
goes to a new history, whereas I don't see that option in the tool
interface), and then iterate through the history that contains the
split data, calling the analysis workflow on each item.
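The "iterate and call the workflow per item" step might look roughly like this. Heavy hedging: the payload shape (workflow_id, history, ds_map) is my reading of the Galaxy workflow API and should be checked against your Galaxy version's docs, and build_run_payload / run_workflow_per_item are names I made up; the actual HTTP POST to /api/workflows is left to whatever client you inject:

```python
GALAXY_URL = "http://localhost:8080"  # assumption: a local Galaxy instance
API_KEY = "YOUR_API_KEY"              # hypothetical placeholder

def build_run_payload(workflow_id, history_id, input_step_id, dataset_id):
    """Build the JSON body for one workflow invocation, mapping a single
    history dataset (hda) onto the workflow's input step."""
    return {
        "workflow_id": workflow_id,
        "history": "hist_id=%s" % history_id,  # run in the existing history
        "ds_map": {input_step_id: {"src": "hda", "id": dataset_id}},
    }

def run_workflow_per_item(datasets, workflow_id, history_id, input_step_id, post):
    """Invoke the workflow once per dataset in the split history.

    `datasets` is the list returned by listing the history's contents;
    `post` is any callable that POSTs a payload to <galaxy>/api/workflows
    (e.g. a thin wrapper around urllib or requests)."""
    for ds in datasets:
        post(build_run_payload(workflow_id, history_id, input_step_id, ds["id"]))
```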
P.S. For my particular problem - call a bunch of tools, once for each
row in a file of tabular data - it would be WAY easier to just write
everything in a Python script, but I'm trying to see what is do-able.
On 20/05/2012 16:55, Jeremy Goecks wrote:
>> Is there any way we can speed up the implementation of this issue?
> Community contributions are always encouraged and welcomed. Partial solutions
> are fine, and self-contained contributions are likely to be included more
> quickly because they are easier to review.