Hi,

We have a local Galaxy instance running on Mac OS X, connected to a MySQL
database. We are having trouble downloading files larger than 2 GB. I've
read posts on this mailing list mentioning that there is a browser limit on
downloads of this size, which is fine -- a command-line method would be
okay. I've also read suggestions to use either wget or curl with the link
from the 'disk icon' in the history to download the data. We've tried both,
and they work fine for files under 2 GB but fail for the larger ones: wget
(wget 'address-from-galaxy-history') tends to time out and then retries
over and over, and curl (curl -O 'address-from-galaxy-history') hangs
forever.
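In case it helps, here is a rough sketch of exactly what we run and what we
would try next (the quoted address is just the link copied from the history's
disk icon; -c and -C - are the standard resume flags for wget and curl, and
this assumes the Galaxy server honors byte-range requests, which we have not
confirmed):

    # plain downloads, as described above
    wget 'address-from-galaxy-history'
    curl -O 'address-from-galaxy-history'

    # resume a stalled download, limiting retries and the read timeout
    wget -c --tries=3 --timeout=60 'address-from-galaxy-history'

    # curl equivalent: -C - resumes from where the previous transfer stopped
    curl -C - -O 'address-from-galaxy-history'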

We have tried downloading both from the host computer to itself and from the
host to a remote computer. Are we perhaps missing some critical
configuration setting?
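One thing we could check (just an idea on our end, not something we've
confirmed matters here) is what the server actually sends back for that
download link, e.g. whether it reports a Content-Length and an Accept-Ranges
header:

    # inspect only the response headers for the download link
    curl -sI 'address-from-galaxy-history'

    # or watch the full request/response exchange without keeping the file
    curl -v -o /dev/null 'address-from-galaxy-history'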

We would like to stay away from an FTP solution if at all possible.

Is anyone using a simple method to retrieve these large files?


-Roy W.

-- 
Roy Weckiewicz
Texas A&M University