Hi
Have you looked into using "Data Libraries"? See:
http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Libraries
http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Uploading%20Library%20Files
That way you get access to the data without duplication.
Regards, Hans
On 01/17/2012 12:35 AM, Joshua Gross wrote:
Hi
I am in the process of upgrading all our Galaxy servers to the current
changeset ("b258de1e6cea", Nov. 18) and I have noticed an inconsistency:
If I upgrade one of our old servers (which was on "720455407d1c", June
23) with 'hg pull -u -r b258de1e6cea' I get the following:
haruhotz@sili
Hello Josh,
You'll want to upload them into a Galaxy Data Library using the "File system
paths / Do not copy data into Galaxy" approach. For details about all of the
ways to upload to Galaxy data libraries, see our wiki here:
http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Uploading%20Library%2
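A note for local admins: the filesystem-paths option only shows up in the library upload form if it is enabled in the config. A minimal sketch, assuming the usual universe_wsgi.ini option name:

```ini
# universe_wsgi.ini -- let admins add library datasets by server-side
# path without copying them into Galaxy (assumed option name)
allow_library_path_paste = True
```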
Hi Jeremy,
I have shared my history with you. Today, the error that appears when I
try to run Cufflinks has changed to: "Error executing tool: 'hg19'".
Thanks,
Milos
On Mon, Jan 16, 2012 at 7:21 PM, Jeremy Goecks wrote:
> Milos,
>
> Can you share your history with me (Options -> Share or Publish
Hi Ming,
From Galaxy, you can currently view BAM and VCF (VCF is available in -central,
but will be in the next -dist release) files within IGV. To do this on a local
set up, you will need to have your Galaxy instance behind a proxy (e.g. nginx
or apache); this enables byte-range requests. In
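For anyone wondering what byte-range support buys you: the client (IGV) sends a Range header and the proxy returns only that slice of the BAM/VCF, so the whole file never has to be downloaded. A toy sketch of the slicing semantics (hypothetical helper, not Galaxy or nginx code):

```python
def byte_range(data: bytes, start: int, end: int) -> bytes:
    """Return the slice for an HTTP 'Range: bytes=start-end' request;
    RFC 7233 ranges are inclusive on both ends."""
    return data[start:end + 1]

# IGV asking for bytes 2-5 of a 10-byte body gets 4 bytes back.
chunk = byte_range(b"0123456789", 2, 5)  # -> b"2345"
```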
Dan,
any idea when the next -dist release is scheduled? We seriously need to
upgrade our WUR instance and then keep pace, but it would be a pity to
upgrade and extensively test it now if the new release comes out only a few
days later.
The latest is from November 18th, right?
Thanks
Al
Hi Alex,
Yes, that's the current -dist.
I hope I don't regret saying this, but we are pushing for a new release coming
real soon, so you may want to hold off on the update if it will require
extensive testing on your end.
Thanks for using Galaxy,
Dan
On Jan 17, 2012, at 9:13 AM, Bossers,
Hi,
For information, here is my install script for rpy, because I had to
reinstall all of it on CentOS 5.4:
#!/bin/bash

echo "** RPY for R and Python **"
cd rpy
export LD_LIBRARY_PATH=/opt/gridengine/lib/lx26-amd64:/lib64:
Hi Dan, thanks for the response. Does Galaxy index VCF files, for
example with tabix or the "tribble" library? IGV currently requires
indexes for VCF files, mainly because the ones we generate here are
huge. This could be relaxed for smaller VCF files, and I have that on
my list.
best,
Hello,
We want to move Galaxy's jobs from our small TORQUE local install to a
big cluster running PBS Pro.
In the universe_wsgi.ini, I changed the cluster address as follows:
default_cluster_job_runner = pbs:///
to:
default_cluster_job_runner = pbs://sub-master/clng_new/
where sub-master is th
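For comparison, here is the full set of runner lines I'd expect in universe_wsgi.ini for that setup; the URL parses as pbs://<server>/<queue>/ (a sketch based on the values above, not a tested config):

```ini
# universe_wsgi.ini -- send jobs to the PBS Pro submission host
start_job_runners = pbs
default_cluster_job_runner = pbs://sub-master/clng_new/
```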
Sorry for the late response.
Thanks for your help.
On Wed, Jan 4, 2012 at 2:32 PM, Carlos Borroto wrote:
> On Wed, Jan 4, 2012 at 2:25 PM, Ryan wrote:
> > On Wed, Jan 4, 2012 at 12:04 PM, Langhorst, Brad
> wrote:
> >>
> >> Usha:
> >>
> >> Galaxy is essentially a wrapper around other command li
Hi Jim,
Yes, Galaxy ships the VCF files bgzip-compressed (filename.vcf.gz) and makes a
tabix index available alongside (filename.vcf.gz.tbi). In the case of Galaxy,
the pysam package is used to create the tabix index.
Thanks,
Dan
On Jan 17, 2012, at 9:44 AM, Jim Robinson wrote:
> Hi Dan,
So I just tried restarting Galaxy and it downloaded a bunch of new eggs,
then errored out. My run.sh script didn't change, so how do I start Galaxy
now?
[galaxy@bic galaxy-dist]$ ./run.sh --start-daemon
Some eggs are out of date, attempting to fetch...
Fetched http://eggs.g2.bx.psu.edu/Mako/Mak
That should be ./run.sh --daemon, not --start-daemon. The error is just that
--start-daemon is an unknown option.
-Dannon
On Jan 17, 2012, at 9:25 AM, Ryan Golhar wrote:
> So I just tried restarting Galaxy and it downloaded a bunch of new eggs then
> errored out. My run.sh script didn't cha
ah yes. Thanks.
On Tue, Jan 17, 2012 at 12:30 PM, Dannon Baker wrote:
> That should be ./run.sh --daemon, not --start-daemon. The error is just
> that --start-daemon is an unknown option.
>
> -Dannon
>
>
> On Jan 17, 2012, at 9:25 AM, Ryan Golhar wrote:
>
> > So I just tried restarting Galaxy
Hi,
I tried a few more things. I see that when I try to update a dataset in a
library, the "The resource could not be found" error goes away, although
still nothing gets updated:
./update.py
http://localhost:8080/api/libraries/f2db41e1fa331b3e/contents/8c49be448cfe29bc
name=foobar
Response
None
Th
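In case it helps debugging, the endpoint being hit has the shape /api/libraries/&lt;library_id&gt;/contents/&lt;dataset_id&gt;; a hypothetical helper for assembling it (not the actual update.py internals):

```python
def library_contents_url(base, library_id, dataset_id):
    """Build the Galaxy API URL for a single library dataset."""
    return "%s/api/libraries/%s/contents/%s" % (base, library_id, dataset_id)

url = library_contents_url("http://localhost:8080",
                           "f2db41e1fa331b3e", "8c49be448cfe29bc")
# matches the URL passed to update.py above
```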
Milos,
I successfully ran Cufflinks on your datasets; I've shared the history with you
that has the results. Perhaps you ran into a transient error.
Best,
J.
On Jan 17, 2012, at 8:13 AM, Milos Busarcevic wrote:
> Hi Jeremy,
>
> I have shared my history with you. Today, error that appears when
Hello all,
I recently have been having problems viewing/displaying datasets (with the
eye icon) as well as downloading datasets in Galaxy which I have uploaded,
although I can actually point to those datasets as input to other tools and
they show up on the drop down menus and it runs perfectly. Ev
Also, I just uploaded a 1.3GB FASTQ file and the small preview box in the
history pane shows the first few lines, and when I click on the eye it
actually displays in the window with the message "This dataset is large and
only the first megabyte is shown below. Show all | Save" and it shows the
firs
Hi Leon,
I can let you know that there is a hard limit of 50G per file allowed
for any dataset FTP'd and moved into a history. Datasets will exist for
3 days in the user-specific "holding area" before they are automatically
purged, if not moved into a history first. While in that holding area,
Hello all -
I'm working on a tool for Galaxy that generates an output file. I am able to
view the file without issue in Galaxy itself, but when I click on the
"Download" icon, the History sidebar produces the following error message:
"Unable to remove temporary library download archive and