Thanks a lot, Thomas! It really helps. I added the tools section following your
suggestion...
Here is my job_conf.xml (I am using Torque; I have 3 servers: one for the
Galaxy server, two for cluster computing):
<?xml version="1.0"?>
<job_conf>
    <plugins>
        <plugin id="pbs" type="runner"
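For reference, a complete minimal job_conf.xml for a Torque/PBS setup might look
like the sketch below. This is assembled from the general Galaxy cluster
configuration pattern, not from the truncated message; the destination ids and
the tool routing are hypothetical:

```xml
<?xml version="1.0"?>
<job_conf>
    <plugins>
        <!-- The PBS runner submits jobs to Torque via pbs_python. -->
        <plugin id="pbs" type="runner"
                load="galaxy.jobs.runners.pbs:PBSJobRunner" />
        <plugin id="local" type="runner"
                load="galaxy.jobs.runners.local:LocalJobRunner" />
    </plugins>
    <destinations default="pbs_default">
        <destination id="local" runner="local" />
        <destination id="pbs_default" runner="pbs" />
    </destinations>
    <tools>
        <!-- Route one tool explicitly; everything else uses the default. -->
        <tool id="megablast_wrapper" destination="pbs_default" />
    </tools>
</job_conf>
```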
Hello,
I am not a megablast expert but the
tools/metag_tools/megablast_wrapper.xml may help you to understand how
the program is run. You may want to tune the parameters.
Regards,
Thomas
On 16/07/2014 09:43, 王渭巍 wrote:
Thanks a lot, Thomas! It really helps. I added the tools section following your
suggestion, and there are still no cluster options in the megablast item. How
can I see cluster options on the page? For example, the page would let me
choose to use the local server or a cluster.
Users can't control destinations for tool execution through any interface
at present AFAIK - tool destinations are automated
Hi Ben,
that is not possible at the moment. The idea is to keep the
user interface as simple as possible for the user. You, as admin, can
decide which resource a specific tool with a specific input will use.
You will never see any options like that in a tool, but you can write a
tool by
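The admin-side routing described above can be done with Galaxy's dynamic job
destinations. A minimal sketch, assuming a rule function placed under
lib/galaxy/jobs/rules/ and referenced from a destination with
runner="dynamic"; the tool ids and destination names here are hypothetical,
and real rule functions can accept more arguments (app, job, etc.):

```python
# Hypothetical dynamic-destination rule: route heavyweight tools to the
# cluster and everything else to the local runner.

# Tool ids that should run on the cluster (hypothetical list).
CLUSTER_TOOLS = {"megablast_wrapper", "ncbi_blastn_wrapper"}

def choose_destination(tool_id):
    """Return the id of a destination defined in job_conf.xml."""
    if tool_id in CLUSTER_TOOLS:
        # Matches a <destination id="pbs_cluster" runner="pbs"> entry.
        return "pbs_cluster"
    # Fall back to the default local destination.
    return "local"
```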
Hi,
Galaxy provides a download_url to download zipped-up files. However, the
user must first log into Galaxy to commence the download. Is there a way to
turn this feature off, so that users can click on the link and download the
data without logging in first?
Thanks
Neil
The traditional upload tool cannot take in multiple files - but there
is a new upload widget that can be used to upload a large number of
files simultaneously. The upload widget can be launched by clicking on
the upload icon in the tools menu header (to the right of the word
tools).
I hope this
I am not really sure what link you are referring to - can you tell me
how you found this link? History - Export to file or is it via the
API?
Regardless, if you trust your users and want to disable security
mechanisms I believe (and I could be wrong) you should be able to by
finding the relevant
Hi Eric,
please have a look at:
https://github.com/bgruening/galaxytools/blob/master/datatypes/msa_datatypes/datatypes_conf.xml
You need something like:
<datatype extension="genbank" type="galaxy.datatypes.data:Text"
          subclass="True" />
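For context, a minimal datatypes_conf.xml wrapping that entry might look like
the sketch below, following the structure of the linked msa_datatypes example;
the extension name is the one from the message, the rest is boilerplate:

```xml
<?xml version="1.0"?>
<datatypes>
    <registration>
        <!-- Subclassing Text yields a plain-text datatype with a new
             extension, without writing any Python. -->
        <datatype extension="genbank" type="galaxy.datatypes.data:Text"
                  subclass="True" display_in_upload="true" />
    </registration>
</datatypes>
```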
Let's try to split the EMBOSS datatypes a little bit into small
Indeed - ideally (once working) we can upload under the IUC ToolShed as a
community maintained resource rather than under a personal account which
becomes a single point of failure (the bus factor).
We (the IUC) have previously discussed doing this so that the EMBOSS
datatypes could become more
Forgive me, I'm not 100% clear on the custom plugin system used by Galaxy, but
if I subclass from the text datatype, will sniffers I implement override
text's and function? The lack of being able to add an entry to the sniffer
section (unlike with the tabular example) led me to believe my
Hi Eric,
Forgive me, I'm not 100% clear on the custom plugin system used by galaxy, but if I
subclass from the text data type, will sniffers I implement override text's
and function? The lack of being able to add an entry to the sniffer section (unlike with
the tabular example) led me to
Hi Björn,
On 07/16/2014 09:20 AM, Björn Grüning wrote:
Hi Eric,
Forgive me, I'm not 100% clear on the custom plugin system used by
galaxy, but if I subclass from the text data type, will sniffers I
implement override text's and function? The
Hello to everybody
I'm developing my own tool that needs to switch the number of output files
according to a parameter selected by the user from a list in the inputs
tag.
How can I do such a thing?
Here is the XML code:
<inputs>
    <param name="input_dataset" label="Input dataset" type="data"
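One common way to vary outputs based on a parameter is Galaxy's `filter` tag
on `<data>` outputs. A hedged sketch, not from the thread; the parameter and
output names are hypothetical:

```xml
<!-- Emit the second output only when the user picks "two" from a select
     list. The filter body is a Python expression evaluated against the
     tool's parameter values. -->
<inputs>
    <param name="n_outputs" type="select" label="Number of output files">
        <option value="one" selected="true">One</option>
        <option value="two">Two</option>
    </param>
</inputs>
<outputs>
    <data name="out1" format="tabular" />
    <data name="out2" format="tabular">
        <filter>n_outputs == "two"</filter>
    </data>
</outputs>
```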
On Wed, Jul 16, 2014 at 5:28 PM, Calogero Zarbo za...@fbk.eu wrote:
Hello to everybody
I'm developing my own tool that needs to switch the number of output files
according to a parameter selected by the user from a list in the inputs
tag.
How can I do such a thing?
Here is the XML code:
On 2014-06-30 18:40, Anton Nekrutenko wrote:
Lance:
Here is the GitHub URL:
https://github.com/nekrut/freebayes
On Wed, Jun 25, 2014 at 2:40 PM, Lance Parsons
lpars...@princeton.edu wrote:
Thanks for the major update of the Freebayes wrapper, excellent!
I've run into two issues,
Thanks Peter, I guess I should then rely on API based tests.
On 15 July 2014 14:18, Peter Cock p.j.a.c...@googlemail.com wrote:
Hi Saket,
From memory the Twill tests are fragile with the output file order in the
XML.
John was discussing switching the default from the Twill to API backend,
On Wed, Jul 16, 2014 at 7:44 PM, Saket Choudhary sake...@gmail.com wrote:
Thanks Peter, I guess I should then rely on API based tests.
If it is just the order, make sure the order of the output files in the test
is consistent with that in the outputs and it may be OK with Twill...
I wonder if
I just want to clarify - do you have a tool with a multiple input data
parameter (e.g. <param type="data" multiple="true" ... />) as some
intermediate step in a workflow? Or are you using a repeat or
something?
And are you saying the workflow editor shows multiple inputs going
into the tool but at
Is this going to work? I get that this would be a better design if
done from the beginning, but what happens if you install an emboss
repository upgrade (on an existing install) that brings in conflicting
types from other repositories that already exist and have been
previously installed? Does the
Assuming this comment:
Finally, we will talk to the devteam to
rewrite EMBOSS to depend on our separate data type repositories.
refers to the emboss_5 repository owned by devteam, then what is being proposed
should work (although I may not be fully understanding what is being proposed).
If
Hi,
I'm trying to use the cleanup_datasets.py file to remove all files on my
system older than 20 days. My crontab looks like this:
# m h dom mon dow command
34 10 * * * cd /export/barium-data3/galaxy-suvr && python
scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -1
Hi John,
What I've implemented on our local system is: the user uploads their
data, then a workflow is automatically run on that data (so the user doesn't
need to select a tool to execute). I email the user once the job is complete
with a link that the user can use to download the
Hi, Bjoern
Would you share your procedure for making some tools run on a cluster?
I have tried
https://wiki.galaxyproject.org/Admin/Config/Performance/Cluster using Torque,
but got errors.
I think maybe it's job_conf.xml. Would you share yours? Thanks a lot.
Ben
You need to add the API key to the URL. E.g.:
http://barium-rbh/capaibl/api/histories/1cd8e2f6b131e891/contents/f68b550c4724eeae/display?to_ext=html&key=APIKEYHERE
Otherwise, Galaxy doesn't know who's trying to access the data and if they have
authorization.
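In script form, the same URL can be assembled before fetching. A hedged Python
sketch; the host, ids, and key below are placeholders taken from the example
above, and the endpoint shape simply follows that example:

```python
from urllib.parse import urlencode

def dataset_display_url(base, history_id, dataset_id, api_key, to_ext="html"):
    """Build a Galaxy dataset 'display' URL carrying the API key, so the
    request is authorized without an interactive login."""
    query = urlencode({"to_ext": to_ext, "key": api_key})
    return (f"{base}/api/histories/{history_id}/contents/"
            f"{dataset_id}/display?{query}")

# Placeholder values, not real credentials:
url = dataset_display_url("http://barium-rbh/capaibl",
                          "1cd8e2f6b131e891", "f68b550c4724eeae",
                          "APIKEYHERE")
# The link can then be fetched with urllib.request.urlopen(url) or curl.
```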
Regards,
Iyad Kandalaft
Hi all,
I am new to galaxy. We have a local install that we intend to turn into a
production server at some stage. We initially installed galaxy using the April
2014 version. We set up a Virtual Python Environment following the
instructions set out on this page:
Eleanor,
Can you please clarify whether starting Galaxy worked when you did:
. ./galaxy_env/bin/activate
cd galaxy_dist
sh run.sh
If it didn't, activate the python environment and run:
pip-2.6 install pycrypto
- or -
pip install pycrypto
If I recall correctly, the init script shipped with Galaxy