okay, thanks.  i'll create a tool that exports files as a tarball in the
user's ftp folder, paired with a cron job that deletes the files after a
week.  i'll contribute it to the toolshed when done.
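a minimal sketch of what i have in mind -- the paths, filenames, and the
cron schedule below are hypothetical stand-ins, not real Galaxy locations:

```shell
#!/bin/sh
# Sketch of the planned export tool: bundle files into a timestamped
# tarball in the user's ftp folder.  All paths here are hypothetical.
set -e

FTP_DIR="${FTP_DIR:-/tmp/ftp-demo}"   # stand-in for the real per-user ftp dir
mkdir -p "$FTP_DIR"

# demo input file standing in for a history dataset
echo "example dataset" > /tmp/dataset1.txt

# write the export tarball with a date-stamped name
OUT="$FTP_DIR/history_export_$(date +%Y%m%d).tar.gz"
tar -czf "$OUT" -C /tmp dataset1.txt
echo "wrote $OUT"

# companion cron entry (daily at 03:00) to purge exports older than a week:
# 0 3 * * * find /var/ftp -name 'history_export_*.tar.gz' -mtime +7 -delete
```

the cleanup relies on `find -mtime +7`, which matches files whose
modification time is more than seven 24-hour periods ago.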

On Fri, Aug 26, 2011 at 11:59 AM, Nate Coraor <n...@bx.psu.edu> wrote:
> Edward Kirton wrote:
>> i thought i recalled reading about downloading files from a history
>> via ftp, but i could be mistaken -- i couldn't find anything on the
>> wiki or mailing list archives.  does this feature exist?
>>
>> what's the best way for users to download many or large files other
>> than via the browser?
>
> You can use wget/curl to avoid the browser, but it's still an http
> transfer.  Some people have written an "export" tool that writes a
> dataset to some specified location.  We've talked before about adding
> this sort of functionality directly into the interface but it hasn't
> been done yet.
>
> --nate
>
>
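the wget/curl route nate mentions would look roughly like the commands
below.  the Galaxy dataset URL shown in the comment is a hypothetical
form, not a documented endpoint; a local file:// URL stands in here so
the command is runnable anywhere:

```shell
#!/bin/sh
# Sketch of fetching a dataset without the browser.  Against a real
# server the URL would be the dataset's display link, something like
#   curl -o dataset.txt 'http://<server>/path/to/dataset/display'
# (hypothetical form).  Here we use a file:// URL as a stand-in.
set -e

echo "example dataset" > /tmp/remote_dataset.txt

# -s: silent, -o: write the response body to a file
curl -s -o /tmp/downloaded.txt 'file:///tmp/remote_dataset.txt'

# the equivalent wget invocation would be:
#   wget -O downloaded.txt 'http://<server>/path/to/dataset/display'
```

either way it's still an http transfer, as nate says -- this just avoids
holding a browser session open for large files.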

___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/
