Thanks, Jeremy

I've given up on that approach to nested workflows - too inefficient,
too fragile. If I have lots of time (and a grad student to help out)
I've got some ideas about taking a deep look at the tool-runner side of
Galaxy with an eye to efficiently implementing the mapping of input data
to (sub-)workflows. Right now the process of spawning a new Galaxy task
for, e.g., each line in a tabular file is just too heavyweight, which is
why the sub-workflow got implemented in Python.
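To make the fan-out concrete, here is a minimal sketch in plain Python - with hypothetical names, not Galaxy's actual runner or API - of the per-row mapping described above: each line of a tabular dataset becomes its own partition, and each partition needs its own workflow invocation, which is exactly the per-task overhead that makes doing this inside Galaxy heavyweight.

```python
# Hypothetical sketch (not Galaxy's real API): split a tab-separated
# dataset into one partition per row and build the per-partition
# workflow-invocation payloads an API-driving tool would have to post.
import csv
import io

def partition_rows(tabular_text):
    """Yield one single-row partition per line of a tab-separated file."""
    reader = csv.reader(io.StringIO(tabular_text), delimiter="\t")
    for row in reader:
        yield "\t".join(row)

def build_payloads(tabular_text, workflow_id):
    """One invocation payload per partition: N rows means N separate
    workflow runs, each carrying full task-spawning overhead."""
    return [
        {"workflow_id": workflow_id, "input": part}
        for part in partition_rows(tabular_text)
    ]

payloads = build_payloads("a\t1\nb\t2\nc\t3\n", "wf123")
```

A three-row file already yields three separate invocations; at thousands of rows the per-task cost dominates, hence the plain-Python implementation.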

Peter

On 24/05/2012 04:30, Jeremy Goecks wrote:
> Peter,
>
> I'm not ignoring you. However, there are others on the Galaxy team that are 
> more familiar with the API and can provide better answers. I expect they'll
> chime in soon to address your questions.
>
> Best,
> J.
>
> On May 20, 2012, at 5:39 PM, Peter van Heusden wrote:
>
>> Hi Jeremy
>>
>> I need this for something I'm implementing at the moment, and the way
>> I'm thinking about it is to make a tool that uses the API to call a
>> workflow. There are a few problems though, correct me if I'm wrong:
>>
>> 1) In order to make an input history item available to the called
>> workflow, the tool needs to somehow know about history items, but the
>> tool xml passes in parameters as data files. This could probably be
>> remedied by providing a type="history_item" parameter to <param> that
>> would provide the id associated with the history item. In the interim,
>> just to test things, I'm passing in parameters as a history:history_item
>> string (yeah I know, ugly!).
>>
>> 2) My particular tool needs to take a history item, split it into
>> partitions, and call a workflow with each of those partitions. For this
>> to work, each partition needs to be uploaded as a new history item, but
>> that is currently not possible. The other possibility is to create a
>> tool that does the split, put it in a single-tool workflow (because
>> workflows can be called from the API in such a way that their output
>> goes to a new history, whereas I don't see that in the tool interface),
>> and then iterate through the history that contains the split data,
>> calling the analysis workflow on each item.
>>
>> Peter
>> P.S. for my particular problem - call a bunch of tools, once for each
>> row in a file of tabular data - it would be WAY easier to just write
>> everything in a Python script, but I'm trying to see what is do-able
>> within Galaxy.
>>
>> On 20/05/2012 16:55, Jeremy Goecks wrote:
>>>> Is there any way we can speed up the implementation of this issue?
>>> Community contributions are always encouraged and welcomed. Partial
>>> solutions are fine, and self-contained contributions are likely to be 
>>> included more quickly because they are easier to review.
>>>
>>> Thanks,
>>> J.
>>>
>> ___________________________________________________________
>> Please keep all replies on the list by using "reply all"
>> in your mail client.  To manage your subscriptions to this
>> and other Galaxy lists, please use the interface at:
>>
>>  http://lists.bx.psu.edu/
