[galaxy-dev] casOT wrapper

2014-01-21 Thread Shaun Webb


Hi,

I was just wondering if anyone has put any work into developing
wrappers for casOT:

http://eendb.zfgenetics.org/casot/index.php

I couldn't see anything in the Tool Shed. If not, I'll have a go myself.

Shaun Webb

--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.


___
Please keep all replies on the list by using reply all
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
 http://lists.bx.psu.edu/

To search Galaxy mailing lists use the unified search at:
 http://galaxyproject.org/search/mailinglists/


Re: [galaxy-dev] toolshed image file location problem

2014-01-21 Thread Eric Kuyt
OK, thanks. I will vote it up and, for now, try to manually edit
lib/tool_shed/util/shed_util_common.py.

Thanks

On 20 January 2014 15:51, Bjoern Gruening bjoern.gruen...@gmail.com wrote:
 Hi Eric,

 please see the following ticket for it and vote it up :)

 https://trello.com/c/dWvkfBKC

 Sorry that there is no fix for it yet,
 Bjoern






[galaxy-dev] numpy and scipy

2014-01-21 Thread David Hoover
Are numpy and scipy included with the standard python modules/eggs 
supplied by galaxy?


David Hoover
Helix Systems Staff

Re: [galaxy-dev] numpy and scipy

2014-01-21 Thread Bjoern Gruening
Hi David,

numpy and scipy are not included as modules with the galaxy
distribution, but you can use the repositories from the toolshed and
depend on them.

 http://toolshed.g2.bx.psu.edu/view/iuc/package_numpy_1_7 
 http://toolshed.g2.bx.psu.edu/view/iuc/package_scipy_0_12 

An example usage is here:

https://github.com/bgruening/galaxytools/blob/master/statistics/tool_dependencies.xml

Hope that helps!
Bjoern
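
For reference, a minimal sketch of a tool_dependencies.xml that pulls in both packages; the version strings and the CHANGESET placeholders are assumptions, and the real values come from the Tool Shed repositories linked above:

<?xml version="1.0"?>
<tool_dependency>
    <!-- Pull numpy and scipy from the iuc package repositories on the main Tool Shed. -->
    <package name="numpy" version="1.7.1">
        <repository toolshed="http://toolshed.g2.bx.psu.edu" name="package_numpy_1_7" owner="iuc" changeset_revision="CHANGESET" />
    </package>
    <package name="scipy" version="0.12.0">
        <repository toolshed="http://toolshed.g2.bx.psu.edu" name="package_scipy_0_12" owner="iuc" changeset_revision="CHANGESET" />
    </package>
</tool_dependency>

A tool that uses them then declares matching <requirement type="package" version="1.7.1">numpy</requirement> entries in its tool XML.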

 Are numpy and scipy included with the standard python modules/eggs
 supplied by galaxy?
 
 David Hoover
 Helix Systems Staff




[galaxy-dev] ToolShed slow? esp uploading

2014-01-21 Thread Peter Cock
Hi guys,

I'm currently trying to update the blast_datatypes on the main
tool shed (thanks Nicola for the reminder),
http://toolshed.g2.bx.psu.edu/view/devteam/blast_datatypes

Having clicked upload it is stuck at:

[Warning] Upload a single file or tarball. Uploading may take a while,
depending upon the size of the file. Wait until a message is displayed
in your browser after clicking the Upload button below.

At what point do I give up and retry? It's been 8 minutes since
I thought to start timing, call it 10 minutes so far...

In other tabs, the Tool Shed is running but the response feels
more sluggish than usual (just browsing tools etc).

Thanks,

Peter


Re: [galaxy-dev] ToolShed slow? esp uploading

2014-01-21 Thread Peter Cock
On Tue, Jan 21, 2014 at 6:19 PM, Peter Cock p.j.a.c...@googlemail.com wrote:
 Hi guys,

 I'm currently trying to update the blast_datatypes on the main
 tool shed (thanks Nicola for the reminder),
 http://toolshed.g2.bx.psu.edu/view/devteam/blast_datatypes

 Having clicked upload it is stuck at:

 [Warning] Upload a single file or tarball. Uploading may take a while,
 depending upon the size of the file. Wait until a message is displayed
 in your browser after clicking the Upload button below.

 At what point do I give up and retry? It's been 8 minutes since
 I thought to start timing, call it 10 minutes so far...

I gave up after ~20 minutes, closed that tab, and tried again.
It worked perfectly twice in a row (blast_datatypes v0.0.17
and v0.0.18).

Peter


Re: [galaxy-dev] Can't view file_name in histories via API unless admin?

2014-01-21 Thread Dannon Baker
Hey Neil,

While poking through a few options for you, I remembered we added
'expose_dataset_path' to universe_wsgi.ini.  If you're comfortable with
users knowing full pathnames, you can enable this 'expose_dataset_path =
True' and everyone will be able to see their full paths.

-Dannon
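
For reference, the relevant switch lives in universe_wsgi.ini (a sketch; the option is assumed to default to False):

# universe_wsgi.ini
# Expose the full on-disk path of datasets to all users, not only admins.
expose_dataset_path = True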


On Sat, Jan 18, 2014 at 9:37 PM, neil.burd...@csiro.au wrote:

  Thanks Dannon,
   Apologies for the length of the email, I'll try to
 be as succinct as possible.

  I am using Galaxy as a tool for medical image processing. We have a
 number of organisations (researchers and clinicians) who would like to use
 the medical imaging tools we have developed, so we are using Galaxy to make
 them available to the whole community. The researchers and clinicians would
 just prefer to select a file, press execute, and then have everything
 done, i.e. uploaded, processed, and the results returned.

  I've managed to achieve this (using a test account which is an admin
 user). In summary, using John Chilton's multi-file branch of Galaxy, the
 user can select multiple files and then press execute. The code (and
 workflows) then uploads all the selected files and splits them into smaller
 datasets, as the tool only needs 2 input files. For example, the user may
 upload 0179_AV45.hdr, 0179_AV45.img, 0279_AV45.hdr, 0279_AV45.img,
 0199_AV45.hdr and 0199_AV45.img; given these 6 input files, 3 datasets will
 be created based on the filenames, i.e. 0199_AV45.hdr and 0199_AV45.img
 will be in one dataset, etc. Another tool is responsible for batching and
 executing the medical imaging tool with each of these 3 datasets, and
 finally all the results are returned and emailed back to the user (so no
 user interaction is required other than selecting files and pressing
 execute).

  This all worked fine as an admin user, but as a non-admin user we are
 unable to get the file_name from /api/histories/contents/ etc. for the
 uploaded dataset_id.dat. We need to get the history id of that
 dataset_id.dat filename so we can execute the workflow.

  As admin, I have a script that is able to get all the files uploaded
 (under /histories/contents/ etc.) and then examine each history id to get
 the file_name and match it with the name that we just uploaded. From this
 we could then get the history_id. But since we can't get hold of the
 file_name unless we're an admin user, do you know how we can get hold of
 the history_id for the filename? We can't just assume it's the last entry
 in /history/contents. So, given only a database_'id'.dat filename, how can
 I get the history id dynamically, with no user interaction and without
 being an admin user?

  Thanks for any help
 Neil

  p.s. do you know where in the code file_name is prevented from being
 displayed (when using the scripts/api/display.py script)?
  --
 *From:* Dannon Baker [dannon.ba...@gmail.com]
 *Sent:* Saturday, January 18, 2014 12:34 AM

 *To:* Burdett, Neil (CCI, Herston - RBWH)
 *Cc:* charles.girar...@embl.de; Galaxy Dev

 *Subject:* Re: [galaxy-dev] Can't view file_name in histories via API
 unless admin?

   Hi Neil,

  Galaxy does not expose filepaths to non-admin users intentionally.  For
 executing a workflow with that particular script, the 'file_id' in question
 in that example should be an hda, which is what api/history/contents will
 display for your users as the 'id' for each history item.

  -Dannon
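
A minimal sketch of that listing, using the scripts/api/common.py helpers already used in this thread (the server URL, API key and history id are placeholders):

# run from the scripts/api directory of the Galaxy distribution
import common

api_key = 'YOUR_USER_API_KEY'        # a regular, non-admin user's key
history_id = 'ebfb8f50c6abde6d'      # placeholder history id

# Each item carries the encoded 'id' (the hda id to use when running
# workflows) and the display 'name'; file_name only appears for admin
# users or when expose_dataset_path is enabled.
contents = common.get(api_key, 'http://YOUR_SERVER/api/histories/%s/contents' % history_id)
for item in contents:
    print item['id'], item['name']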


 On Fri, Jan 17, 2014 at 6:12 AM, neil.burd...@csiro.au wrote:

 Hi Charles,
not a problem. In my previous post I specified the command
 line:

 /home/galaxy/milxcloud/scripts/api/display.py api_key
 http://barium-rbh:9100/extras/api/histories/ebfb8f50c6abde6d/contents/4a56addbcc836c23

  The api_key in your question refers to the api_key of an admin user;
 however, the history_id (ebfb8f50c6abde6d) refers to a history not owned
 by that admin user but by another user, hence the error message.

 I hope that answers your question?

 Neil

 
 From: Charles Girardot [charles.girar...@embl.de]
 Sent: Friday, January 17, 2014 6:31 PM
 To: Burdett, Neil (CCI, Herston - RBWH)
 Cc: galaxy-dev@lists.bx.psu.edu
 Subject: Re: [galaxy-dev] Can't view file_name in histories via API
 unless admin?

 Hi Neil,

 Sorry, this is not an answer to your post; I hope you won't mind me
 stepping into your thread this way.
 Your message caught my attention because of your note: I am surprised by
 the error message you report when trying to use an admin API key.

 How does galaxy know the user who is making the call?

 Sorry if I am missing the obvious

 bw

 Charles

 On 17 Jan 2014, at 07:35, neil.burd...@csiro.au wrote:

  Hi,
  it seems that the entry file_name: does not appear when running
 the command
 
  /home/galaxy/milxcloud/scripts/api/display.py api_key
 http://barium-rbh:9100/extras/api/histories/ebfb8f50c6abde6d/contents/4a56addbcc836c23
 
  unless you are stated as an admin 

[galaxy-dev] Error when searching for tools in local instance.

2014-01-21 Thread Perez, Ricardo
Dear All,

When I go to my Galaxy distribution, I go to the following place:
Admin -> Manage installed tool shed repositories

When I search for a tool in here with the provided search bar, I get the 
following (I believe this is a bug):

URL:
https://galaxy.tamu.edu/admin_toolshed/browse_repositories?async=false&sort=name&page=1&show_item_checkboxes=false&f-deleted=False&f-free-text-search=search
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/middleware/error.py', line 149 in __call__
  app_iter = self.application(environ, sr_checker)
File '/usr/local/galaxy/galaxy-dist/eggs/Paste-1.7.5.1-py2.7.egg/paste/recursive.py', line 84 in __call__
  return self.application(environ, start_response)
File '/usr/local/galaxy/galaxy-dist/eggs/Paste-1.7.5.1-py2.7.egg/paste/httpexceptions.py', line 633 in __call__
  return self.application(environ, start_response)
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/base.py', line 132 in __call__
  return self.handle_request( environ, start_response )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/base.py', line 190 in handle_request
  body = method( trans, **kwargs )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/__init__.py', line 229 in decorator
  return func( self, trans, *args, **kwargs )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py', line 153 in browse_repositories
  return self.installed_repository_grid( trans, **kwd )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 138 in __call__
  query = column.filter( trans, trans.user, query, column_filter )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 616 in filter
  clause_list.append( column.get_filter( trans, user, column_filter ) )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 400 in get_filter
  return self.get_single_filter( user, column_filter )
File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 413 in get_single_filter
  if self.key.find( '.' ) > -1:
AttributeError: 'NoneType' object has no attribute 'find'

How would I go about fixing this?

Thank you,
--Ricardo Perez


[galaxy-dev] setting up the drmaa for sge

2014-01-21 Thread Cantarel, Brandi L.
Hi all,
I have been trying to set up my Galaxy instance to use SGE. I created a
job_conf.xml using the sample as a model and updated the galaxy user's bashrc
with the DRMAA_LIBRARY_PATH:
export DRMAA_LIBRARY_PATH=/home/sge/lib/lx26-amd64/libdrmaa.so


But when I try to add data to a library I get the error:
Unable to run job due to a misconfiguration of the Galaxy job running system. 
Please contact a site administrator.

Not too comforting since I am the administrator.  The error in the log says:
galaxy.jobs.handler DEBUG 2014-01-21 14:57:39,478 (2) Dispatching to drmaa 
runner
galaxy.jobs.handler ERROR 2014-01-21 14:57:39,479 put(): (2) Invalid job 
runner: drmaa

What does that mean, "Invalid job runner"? Does this mean I have an error in my
job_conf.xml?



Here is my job_conf.xml
<?xml version="1.0"?>
<job_conf>
    <plugins workers="4">
        <!-- "workers" is the number of threads for the runner's work queue.
             The default from <plugins> is used if not defined for a plugin.
          -->
        <plugin id="sge" type="runner" load="galaxy.jobs.runners.drmaa:DRMAAJobRunner"/>
    </plugins>
    <handlers default="handlers">
        <!-- Additional job handlers - the id should match the name of a
             [server:id] in universe_wsgi.ini.
          -->
        <handler id="main" tags="handlers"/>
    </handlers>
    <destinations default="sge_default">
        <!-- Destinations define details about remote resources and how jobs
             should be executed on those remote resources.
          -->
        <destination id="sge_default" runner="drmaa"/>
        <!-- Define parameters that are native to the job runner plugin. -->
    </destinations>
</job_conf>


Thanks for your help!

~~~
Brandi Cantarel, PhD
Bioinformatics Research Scientist
Baylor Institute for Immunology Research
Baylor Health Care System
214-820-9064 (office)
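
For comparison with the job_conf.xml above: in the job_conf.xml.sample shipped with Galaxy, a destination's runner attribute names the id of a loaded plugin, along these lines (a sketch only; the ids and worker count are illustrative, not the full sample file):

<plugins workers="4">
    <plugin id="drmaa" type="runner" load="galaxy.jobs.runners.drmaa:DRMAAJobRunner"/>
</plugins>
<destinations default="sge_default">
    <!-- runner="drmaa" refers to the plugin id declared above -->
    <destination id="sge_default" runner="drmaa"/>
</destinations>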


Re: [galaxy-dev] Error when searching for tools in local instance.

2014-01-21 Thread sam guerler
Hi Ricardo,

Are you using the newest version from the stable branch or from default?

Thanks,
Sam


On Tue, Jan 21, 2014 at 3:53 PM, Perez, Ricardo ricky_...@neo.tamu.edu wrote:

 Dear All,

 When I go to my Galaxy distribution, I go to the following place:
 Admin -> Manage installed tool shed repositories

 When I search for a tool in here with the provided search bar, I get the
 following (I believe this is a bug):

 URL:
 https://galaxy.tamu.edu/admin_toolshed/browse_repositories?async=false&sort=name&page=1&show_item_checkboxes=false&f-deleted=False&f-free-text-search=search
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/middleware/error.py', line 149 in __call__
   app_iter = self.application(environ, sr_checker)
 File '/usr/local/galaxy/galaxy-dist/eggs/Paste-1.7.5.1-py2.7.egg/paste/recursive.py', line 84 in __call__
   return self.application(environ, start_response)
 File '/usr/local/galaxy/galaxy-dist/eggs/Paste-1.7.5.1-py2.7.egg/paste/httpexceptions.py', line 633 in __call__
   return self.application(environ, start_response)
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/base.py', line 132 in __call__
   return self.handle_request( environ, start_response )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/base.py', line 190 in handle_request
   body = method( trans, **kwargs )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/__init__.py', line 229 in decorator
   return func( self, trans, *args, **kwargs )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py', line 153 in browse_repositories
   return self.installed_repository_grid( trans, **kwd )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 138 in __call__
   query = column.filter( trans, trans.user, query, column_filter )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 616 in filter
   clause_list.append( column.get_filter( trans, user, column_filter ) )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 400 in get_filter
   return self.get_single_filter( user, column_filter )
 File '/usr/local/galaxy/galaxy-dist/lib/galaxy/web/framework/helpers/grids.py', line 413 in get_single_filter
   if self.key.find( '.' ) > -1:
 AttributeError: 'NoneType' object has no attribute 'find'

 How would I go about fixing this?

 Thank you,
 --Ricardo Perez


[galaxy-dev] dbkey in cuffdiff

2014-01-21 Thread Margarita Schlackow
Hello,

I am using cuffdiff with the bias correction. I have uploaded the S. pombe
genome file, changed the database/build option to the S. pombe genome and
saved it. Afterwards I used cuffmerge and then I wanted to use cuffdiff.
However, it seems that the dbkey for pombe is not recognised, but as it is a
drop-down menu, there is no way to change it. I get the following error in the
logfile: cuffdiff: unrecognized option `--dbkey=Schizosaccharomyces_pombe_1.1'

I would greatly appreciate your help!

All the best wishes,

Rita


Re: [galaxy-dev] Import workflows via API

2014-01-21 Thread Neil.Burdett
Hi Nicola,
 I've merged your changes into my version of the Galaxy code
(slightly older than galaxy-central), I believe; however, I get the following
error when I try to execute your script:

Traceback (most recent call last):
  File "get_wfs.py", line 13, in <module>
    common.post(api_key, 'http://barium-rbh:9100/extras/api/workflows/import', data)
  File "/home/galaxy/milxcloud/scripts/api/common.py", line 48, in post
    return simplejson.loads( urllib2.urlopen( req ).read() )
  File "/usr/lib/python2.7/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 406, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 519, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 444, in error
    return self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 378, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 527, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 500: Internal Server Error

so I printed out some lines in common.py:

def post( api_key, url, data ):
    # Do the actual POST.
    url = make_url( api_key, url )
    print "URL is %s" % url

 URL is http://barium-rbh:9100/extras/api/workflows/import?key=34ee80757f0d03de36e33a1676d245e4

the script is:
import sys
#sys.path.insert(1, 'milxcloud/scripts/api')
import common
api_key = '34ee80757f0d03de36e33a1676d245e4'
workflows = common.get(api_key, 'http://barium-rbh:9100/api/workflows?show_published=True')
print workflows
published_workflow_ids = [str(workflow[u'id']) for workflow in workflows if bool(workflow[u'published'])]
print published_workflow_ids

for pw_id in published_workflow_ids:
    data = {}
    data['workflow_id'] = pw_id
    common.post(api_key, 'http://barium-rbh:9100/extras/api/workflows/import', data)


and I run it from ~/scripts/api

The output from the script is:
python get_wfs.py 
[{'name': 'SUVR', 'tags': [], 'url': '/extras/api/workflows/f2db41e1fa331b3e', 
'published': True, 'model_class': 'StoredWorkflow', 'id': 'f2db41e1fa331b3e'}, 
{'name': 'processSUVRData', 'tags': [], 'url': 
'/extras/api/workflows/f597429621d6eb2b', 'published': True, 'model_class': 
'StoredWorkflow', 'id': 'f597429621d6eb2b'}]
['f2db41e1fa331b3e', 'f597429621d6eb2b']
URL is http://barium-rbh:9100/extras/api/workflows/import?key=34ee80757f0d03de36e33a1676d245e4

Any ideas why the import may be failing?

Thanks
Neil
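
One way to see what the server itself reported for that 500 is to read the body of the error response, which the bare traceback hides (a sketch built on the same urllib2 calls common.py uses; the URL, key and workflow id are the ones from the script output above):

import json
import urllib2

url = ('http://barium-rbh:9100/extras/api/workflows/import'
       '?key=34ee80757f0d03de36e33a1676d245e4')
data = json.dumps({'workflow_id': 'f2db41e1fa331b3e'})
req = urllib2.Request(url, data, {'Content-Type': 'application/json'})

try:
    print urllib2.urlopen(req).read()
except urllib2.HTTPError, e:
    # The response body of a 500 usually carries Galaxy's own error message.
    print e.code
    print e.read()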




From: Nicola Soranzo [sora...@crs4.it]
Sent: Monday, January 20, 2014 9:03 PM
To: Burdett, Neil (CCI, Herston - RBWH)
Cc: galaxy-dev@lists.bx.psu.edu
Subject: Re: Import workflows via API

Hi Neil,
to import all published workflows you can use a script like this:

import sys
sys.path.insert(1, 'galaxy-central/scripts/api')
import common
api_key = 'YOUR_USER_API_KEY'
workflows = common.get(api_key, 'http://YOUR_SERVER/api/workflows?show_published=True')
published_workflow_ids = [str(workflow[u'id']) for workflow in workflows if bool(workflow[u'published'])]
for pw_id in published_workflow_ids:
    data = {}
    data['workflow_id'] = pw_id
    common.post(api_key, 'http://YOUR_SERVER/api/workflows/import', data)

Nicola

On 2014-01-19 03:12 neil.burd...@csiro.au wrote:
 Hi Nicola,
 that is exactly what I'm looking for; however, how do
 I execute the script/tool? I would like to import all published
 workflows. What is the name of the script to run, and the arguments?
 Can you give an example, please?

 Thanks again
 Neil


 Date: Fri, 17 Jan 2014 11:36:03 +0100
 From: Nicola Soranzo sora...@crs4.it
 To: galaxy-dev@lists.bx.psu.edu
 Subject: Re: [galaxy-dev] Import workflows via API
 Message-ID: d834d247437269704a26e8bbf8f67...@crs4.it
 Content-Type: text/plain; charset=UTF-8; format=flowed

 On 2014-01-17 06:45 neil.burd...@csiro.au wrote:
 Hi,

  I execute workflows via the API. However, if I want another user to
 use my workflows, I can publish my workflows, but the new user then
 has to go on to the web browser and import this workflow.

 Is there a method/script which I can call via the API which can import
 all available (published) workflows, so the user doesn't have to click
 the import workflow button in the web browser?

 Thanks for any help

 Hi Neil,
 you are very lucky: just yesterday my pull request implementing exactly
 this has been merged into galaxy-central:


 https://bitbucket.org/galaxy/galaxy-central/pull-request/298/api-display-and-import-workflows-shared-by/diff

 and is also available in BioBlend thanks to my colleague Simone Leo:

 https://github.com/afgane/bioblend/pull/51

 Best,
 Nicola

 --
 Nicola Soranzo, Ph.D.
 Bioinformatics Program, CRS4
 Loc. Piscina Manna, 09010 Pula (CA), Italy
 http://www.bioinformatica.crs4.it/

