Re: [galaxy-dev] Downloading large files from local galaxy

2011-10-08 Thread Roy Weckiewicz
Hi Dan,

The solution you provided took care of the downloading issue. Thanks!


- Roy

On Wed, Oct 5, 2011 at 2:41 PM, Daniel Blankenberg wrote:

> Hi Roy,
>
> Have you tried setting debug=False and use_interactive=False in the
> universe_wsgi.ini file?
>
>
> Thanks for using Galaxy,
>
> Dan
>
>
> On Oct 5, 2011, at 2:32 PM, Glen Beane wrote:
>
> >
> > On Oct 5, 2011, at 2:25 PM, Roy Weckiewicz wrote:
> >
> >> Hi,
> >>
> >>
> >> We have a local Galaxy instance running on Mac OS X connected to a MySQL
> >> database. We are having an issue downloading files that are > 2 GB. I've
> >> read posts on this mailing list mentioning that there is a browser limit
> >> on downloading files of this size, which is fine.
> >
> > I believe the 2 GB limit is for uploading through a browser, not
> > downloading.  I have downloaded files larger than 2 GB through a web browser
> > from our local Galaxy instance a number of times, although I do remember that
> > a long time ago (around a year ago) we had similar trouble with our local
> > instance -- unfortunately I do not remember how we resolved it.
> >
> >
> > --
> > Glen L. Beane
> > Senior Software Engineer
> > The Jackson Laboratory
> > (207) 288-6153
> >
> >
> > ___
> > Please keep all replies on the list by using "reply all"
> > in your mail client.  To manage your subscriptions to this
> > and other Galaxy lists, please use the interface at:
> >
> >  http://lists.bx.psu.edu/
>
>


-- 
Roy Weckiewicz
Texas A&M University
___
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/

Re: [galaxy-dev] Downloading large files from local galaxy

2011-10-05 Thread Daniel Blankenberg
Hi Roy,

Have you tried setting debug=False and use_interactive=False in the 
universe_wsgi.ini file?
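
(For reference, a minimal sketch of how those settings might look -- this assumes
the stock [app:main] section of universe_wsgi.ini; adjust to your own config:)

    [app:main]
    # Turn off debug mode; with debug enabled, responses can be buffered
    # in memory, which is reported to break downloads of very large files.
    debug = False
    # The interactive debugger requires debug mode, so disable it as well.
    use_interactive = False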


Thanks for using Galaxy,

Dan


On Oct 5, 2011, at 2:32 PM, Glen Beane wrote:

> 
> On Oct 5, 2011, at 2:25 PM, Roy Weckiewicz wrote:
> 
>> Hi,
>> 
>> 
>> We have a local Galaxy instance running on Mac OS X connected to a MySQL
>> database. We are having an issue downloading files that are > 2 GB. I've
>> read posts on this mailing list mentioning that there is a browser limit
>> on downloading files of this size, which is fine.
> 
> I believe the 2 GB limit is for uploading through a browser, not downloading.
> I have downloaded files larger than 2 GB through a web browser from our local
> Galaxy instance a number of times, although I do remember that a long time ago
> (around a year ago) we had similar trouble with our local instance --
> unfortunately I do not remember how we resolved it.
> 
> 
> --
> Glen L. Beane
> Senior Software Engineer
> The Jackson Laboratory
> (207) 288-6153
> 
> 
> ___
> Please keep all replies on the list by using "reply all"
> in your mail client.  To manage your subscriptions to this
> and other Galaxy lists, please use the interface at:
> 
>  http://lists.bx.psu.edu/


___
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/


Re: [galaxy-dev] Downloading large files from local galaxy

2011-10-05 Thread Glen Beane

On Oct 5, 2011, at 2:25 PM, Roy Weckiewicz wrote:

> Hi,
> 
> 
> We have a local Galaxy instance running on Mac OS X connected to a MySQL
> database. We are having an issue downloading files that are > 2 GB. I've read
> posts on this mailing list mentioning that there is a browser limit on
> downloading files of this size, which is fine.

I believe the 2 GB limit is for uploading through a browser, not downloading. I
have downloaded files larger than 2 GB through a web browser from our local
Galaxy instance a number of times, although I do remember that a long time ago
(around a year ago) we had similar trouble with our local instance --
unfortunately I do not remember how we resolved it.


--
Glen L. Beane
Senior Software Engineer
The Jackson Laboratory
(207) 288-6153


___
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/


Re: [galaxy-dev] downloading large files

2011-08-26 Thread Edward Kirton
Okay, thanks. I'll create a tool to export files as a tarball in the user's
FTP folder and couple it with a cron job to make sure the files are deleted
after a week. I'll contribute it to the Tool Shed when done.
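
(For the cleanup half, a rough sketch of a crontab entry; the FTP path below is
hypothetical -- substitute whatever your instance's ftp_upload_dir points at:)

    # Every night at 02:00, delete exported tarballs in the Galaxy FTP
    # area that are more than 7 days old.
    0 2 * * *  find /galaxy/ftp/upload -type f -name '*.tar.gz' -mtime +7 -delete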

On Fri, Aug 26, 2011 at 11:59 AM, Nate Coraor wrote:
> Edward Kirton wrote:
>> I thought I recalled reading about downloading files from a history
>> via FTP, but I could be mistaken -- I couldn't find anything on the
>> wiki or in the mailing list archives.  Does this feature exist?
>>
>> What's the best way for users to download many or large files other
>> than via the browser?
>
> You can use wget/curl to avoid the browser, but it's still an HTTP
> transfer.  Some people have written an "export" tool that writes a
> dataset to some specified location.  We've talked before about adding
> this sort of functionality directly into the interface, but it hasn't
> been done yet.
>
> --nate
>
>> ___
>> Please keep all replies on the list by using "reply all"
>> in your mail client.  To manage your subscriptions to this
>> and other Galaxy lists, please use the interface at:
>>
>>   http://lists.bx.psu.edu/
>

___
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/


Re: [galaxy-dev] downloading large files

2011-08-26 Thread Nate Coraor
Edward Kirton wrote:
> I thought I recalled reading about downloading files from a history
> via FTP, but I could be mistaken -- I couldn't find anything on the
> wiki or in the mailing list archives.  Does this feature exist?
> 
> What's the best way for users to download many or large files other
> than via the browser?

You can use wget/curl to avoid the browser, but it's still an HTTP
transfer.  Some people have written an "export" tool that writes a
dataset to some specified location.  We've talked before about adding
this sort of functionality directly into the interface, but it hasn't
been done yet.
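
(As an illustration of the wget/curl route -- the URL below is only a
placeholder; copy the dataset's actual download link from the save icon in the
history panel, and note that a private instance may also require your session
cookie:)

    # Resume-friendly download of a single dataset without a browser.
    wget -c -O mydata.dat 'http://localhost:8080/datasets/DATASET_ID/display?to_ext=data'
    # The same with curl (-C - resumes a partial transfer):
    curl -C - -o mydata.dat 'http://localhost:8080/datasets/DATASET_ID/display?to_ext=data'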

--nate

> ___
> Please keep all replies on the list by using "reply all"
> in your mail client.  To manage your subscriptions to this
> and other Galaxy lists, please use the interface at:
> 
>   http://lists.bx.psu.edu/
___
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/