Thanks for the pointer Ross. It was really useful. I also found more
information about multiple output datasets on the wiki:
http://wiki.g2.bx.psu.edu/Admin/Tools/Multiple%20Output%20Files?highlight=%28dataset%29
I plan on writing a script that renames the output of the Grinder tool I
am wrapping.
When you delete something in Galaxy, it isn't purged from disk immediately
On Sep 16, 2011, at 6:32 PM, "Candace Seeve" wrote:
> I've deleted all of my histories and datasets and the status bar in the top
> right corner indicates that I am using 67% of my quota. Is there any way to
> remove
I've deleted all of my histories and datasets and the status bar in the top
right corner indicates that I am using 67% of my quota. Is there any way to
remove all data and jobs to return to 0%?
Thanks.
I have the files ready for an update to the Putative SNP phenotypes
library. All the new files use hg19 coordinates. They should either go
into a new library or into subdirectories under the first one, where you chose
the build. What is the best way to add the metadata to these files? The
files a
I added a DeFuse Galaxy tool to the toolshed: http://toolshed.g2.bx.psu.edu/
DeFuse is software developed by Andrew McPherson for gene fusion discovery
using RNA-Seq data.
http://sourceforge.net/projects/defuse/
Please keep all
The database 'version' in question is our internal migration number, not the
version of your PostgreSQL install. What has happened is that since you last
updated Galaxy, the database schema has changed, and you need to kick off the
migration scripts by running 'sh manage_db.sh upgrade' in your
On Fri, Sep 16, 2011 at 4:44 PM, David Matthews
wrote:
> Hi,
> We are getting ready to run a local setup of Galaxy here at Bristol
> University on our HPC cluster, BlueCrystal. We have got a mini test setup
> running fine and now we want to install it properly on the main cluster and
> have the ga
Thank you, Dan, it works!
On 16/09/11 14:42, Daniel Blankenberg wrote:
Hi Fengyuan,
Try something like this:
with an InterMine instance
GENOME=${input1.dbkey} NAME=${input1.name}
INFO=${input1.info}
FlyMine
modMine
MetabolicMine
value="http://www.flymine.org/query/genomicRegionSe
Hi,
We are getting ready to run a local setup of Galaxy here at Bristol University
on our HPC cluster, BlueCrystal. We have got a mini test setup running fine and
now we want to install it properly on the main cluster and have the Galaxy
instance run in the queue along with all the other HPC jobs
Did you try what Dan suggested? Which issues did you run into when doing so?
P.S. Someone should enable "Reply-To:" munging on this mailing list so that it
keeps replies on the list automatically; there's a Mailman setting for this
(reply_goes_to_list):
http://www.gnu.org/s/mailman/mailman-adm
Hi,
I don't mean to be pushy, but my question may have been buried and
overlooked. Could someone have a look at it, please?
Thanks
Fengyuan
On 12/09/11 10:20, Fengyuan Hu wrote:
Dear Galaxy developers,
I'm trying to create a tool to export Galaxy interval data to
InterMine
Hi Leandro,
Is there an entry in your history for the upload? What file format does it
show? Is there any chance your original file was zipped? If Galaxy detected
it as a zip file on upload, it may have unzipped it and taken the first file in
it as the dataset.
That's at least the version o
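The unzip-and-take-the-first-file behaviour described above can be illustrated with a small, self-contained sketch. This is not Galaxy's actual upload code, just a model of the described behaviour:

```python
# Hedged illustration of the behaviour described above, not Galaxy's real
# upload code: if a payload sniffs as a zip archive, only its first member
# would become the dataset.
import io
import zipfile

def first_zip_member(data):
    """Return (name, content) of the first file in a zip payload, or None."""
    buf = io.BytesIO(data)
    if not zipfile.is_zipfile(buf):
        return None  # not a zip: the payload would be used as-is
    with zipfile.ZipFile(buf) as zf:
        name = zf.namelist()[0]
        return name, zf.read(name)
```

Running the original file through a check like this locally is a quick way to confirm whether it was really a zip before upload.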
Hi Ilya,
--genotype_likelihoods_model / -glm is available under the advanced options for
the tool; it can be set to BOTH, SNP, or INDEL. This parameter might be pulled
out from under the advanced options heading and placed among the base options
in the future to make it easier to access. Thanks for the sugg
Hi Fengyuan,
Try something like this:
with an InterMine instance
GENOME=${input1.dbkey} NAME=${input1.name}
INFO=${input1.info}
FlyMine
modMine
MetabolicMine
http://www.flymine.org/query/genomicRegionSearch.do" />
http://intermine.modencode.org/query/genomicRegionSearch.do" />
h
Hi all,
We tried to find something in the docs and mailing list, with no luck. We
created a new datatype that is a straight subclass of Binary, and then
when we upload such a file in the Galaxy UI and check the checksums
between the original file and the file located in the Galaxy
database/files/... dire
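The checksum comparison being described can be reproduced with a self-contained snippet. In a real instance the two paths would be the original file and Galaxy's copy under database/files/; here both are stand-ins in a temporary directory:

```python
# Minimal, self-contained version of the check described above: the stored
# copy of a binary upload should hash identically to the original. The paths
# here are placeholders for the original file and Galaxy's database/files/ copy.
import hashlib
import os
import shutil
import tempfile

def md5_of(path):
    """Stream a file through md5 and return the hex digest."""
    h = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "original.bin")
with open(original, "wb") as fh:
    fh.write(b"\x00\x01binary payload\xff")
stored = os.path.join(workdir, "dataset_1.dat")  # stand-in for Galaxy's copy
shutil.copy(original, stored)
# If these differ on a real upload, Galaxy modified the file in transit
# (e.g. newline conversion because the datatype was not treated as binary).
match = md5_of(original) == md5_of(stored)
```

If the checksums differ on the real files, that points at the upload path (sniffing or line-ending conversion) rather than at the storage layer.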
Hi,
I have built up a large collection of Galaxy data on my local account
and I am now finding the following operations very slow:
Showing saved histories
Refreshing a history containing many data items
Opening a library with ~200 files
Is there any way to speed up performance? I am using a m
Hi;
On Thu, Sep 15, 2011, at 20:40, Peter Cock wrote:
2011/9/15 Mikel Egaña Aranguren:
Hi;
I'm sure this has been mentioned before but perhaps it has been fixed.
Not yet, but you can follow this issue:
https://bitbucket.org/galaxy/galaxy-central/issue/325/
OK thanks
I'm writing a wr