On Thu, Jun 21, 2012 at 8:54 AM, Alan V. Cowles <alan.cow...@duke.edu> wrote:
> My first question: Is there a way to make the galaxy server retrieve data
> from a local storage set, instead of uploading newly through the web
> interface? Http seems like an awfully inefficient way to transmit sequence
> data. If I mounted an nfs directory full of sequence to the server could I
> load and analyze it, and what options would I do to config this?

I would highly recommend taking a look at the admin documentation on
Galaxy's Wiki[1]. You have several options for uploading data from a
local file system and sharing it with your users via Data
Libraries[2]. You can also configure Galaxy to allow users to upload
data over FTP[3]; this can easily be adapted to any other method by
which your users can place data in the directory specified in
Galaxy's config (I personally prefer scp or rsync over ssh, since all
my users have local unix accounts on the server).
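As a rough sketch, the relevant settings live in universe_wsgi.ini; the paths and hostname below are placeholders you would replace with your own storage locations:

```ini
# universe_wsgi.ini -- example import/upload settings (paths are placeholders)

# Directory admins can browse when importing files into Data Libraries
library_import_dir = /nfs/sequence_data/library_import

# Per-user subdirectories (named by the user's email) that users can
# import into Data Libraries themselves
user_library_import_dir = /nfs/sequence_data/user_import

# Enable the FTP upload option; users drop files into
# <ftp_upload_dir>/<email>/ via FTP -- or scp/rsync, since Galaxy only
# looks at the directory contents, not how the files got there
ftp_upload_dir = /nfs/galaxy/ftp_upload
ftp_upload_site = ftp.example.org
```

With ftp_upload_dir set, anything a user places in their subdirectory shows up in the upload tool, regardless of the transfer method used.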


> My second question: When setting up a cluster do most people make the galaxy
> server a head and farm out to compute nodes, or do you send jobs off to a
> remote cluster entirely? It seems like the options in the universe_wsgi.ini
> file calls a local daemon for cluster management, but I am a bit confused.

My Galaxy server happens to run on my SGE head node, but as long as
the node running Galaxy can run qsub and, ideally, shares a file
system with the cluster, you should be fine. There is a way around
the shared file system requirement, but I haven't used it so far. You
can find documentation on both options in Galaxy's Wiki[4].
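For a concrete starting point, here is a minimal sketch of the cluster-related settings in universe_wsgi.ini, assuming an SGE cluster reachable via DRMAA (the tool name and queue in the comment are hypothetical examples):

```ini
# universe_wsgi.ini -- send jobs to the cluster instead of running locally
start_job_runners = drmaa
default_cluster_job_runner = drmaa:///

# Individual tools can be mapped to specific runner URLs, e.g. to
# request a particular queue or parallel environment:
# [galaxy:tool_runners]
# bowtie_wrapper = drmaa://-q long -pe threads 4/
```

The local daemon you saw referenced in the config is just Galaxy's job manager; with a drmaa:// runner it submits to the cluster on your behalf rather than executing jobs itself.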


Hope it helps,