It sounds like the deferred job plugin will work as a mechanism to manage
the jobs, once there is a tool that communicates with two different Galaxy
instances via the API and orders them to start transferring/loading files. I
think the tools '__EXPORT_HISTORY__' and '__IMPORT_HISTORY__' (under
lib/gala
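The export/import flow described above can be driven over HTTP. The sketch below only builds the requests (it makes no network calls); the endpoint shapes follow Galaxy's history export/import API, but the base URLs and API keys are placeholders, so verify the paths against your Galaxy version before relying on them:

```python
# Sketch: request builders for a cross-instance history transfer via the
# Galaxy API. Base URLs and API keys are placeholders; endpoint paths are
# assumptions based on Galaxy's history export/import API.

def export_request(base_url, history_id, api_key):
    """Build the (method, url, params) triple asking the source instance
    to package a history as a downloadable archive."""
    return ("PUT",
            f"{base_url}/api/histories/{history_id}/exports",
            {"key": api_key})

def import_request(base_url, archive_url, api_key):
    """Build the request telling the destination instance to create a new
    history from the archive URL the export call eventually returns."""
    return ("POST",
            f"{base_url}/api/histories",
            {"key": api_key, "archive_source": archive_url})
```

In practice the export call is polled until it reports a download URL, which is then handed to the destination via `import_request`.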
On Jan 7, 2013, at 1:52 PM, Kyle Ellrott wrote:
> I'm trying to figure out if I can do this all through the API (so I can skip
> setting up FTP servers and sharing database servers).
> I can scan one system, and initiate downloads on the destination system
> (using upload1). So as far as moving
I'm trying to figure out if I can do this all through the API (so I can
skip setting up FTP servers and sharing database servers).
I can scan one system, and initiate downloads on the destination system
(using upload1). So as far as moving files from one machine to another, it
should be fine. I cou
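Initiating a download on the destination with upload1, as described above, amounts to posting a tool-execution payload to the destination's `/api/tools`. The builder below is a hypothetical sketch: the `files_0|url_paste` input name mirrors the legacy upload form, but confirm the exact input names against your Galaxy version:

```python
# Hypothetical payload builder for running Galaxy's upload1 tool on the
# destination instance so it fetches a file by URL. Input names
# ("files_0|url_paste" etc.) are assumptions from the legacy upload form.
import json

def upload1_payload(history_id, file_url, file_type="auto"):
    inputs = {
        "files_0|url_paste": file_url,      # URL the destination will fetch
        "files_0|type": "upload_dataset",
        "file_type": file_type,
        "dbkey": "?",
    }
    return {
        "tool_id": "upload1",
        "history_id": history_id,
        "inputs": json.dumps(inputs),       # Galaxy expects a JSON string here
    }
```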
Hi,
A good practice is to set up an FTP server
http://wiki.galaxyproject.org/Admin/Config/Upload%20via%20FTP and use a
tool to send / retrieve information from this FTP server:
http://toolshed.g2.bx.psu.edu/
-> Data Source / data_nfs
Then export your FTP directory over NFS to your Galaxy installation
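The FTP setup above hinges on two options in Galaxy's config file (option names are from the Upload-via-FTP wiki page linked above; the values here are placeholders for your own paths and hostname):

```ini
; galaxy config (e.g. universe_wsgi.ini), [app:main] section
; Directory Galaxy scans for per-user FTP uploads; export it over NFS
; so the Galaxy server can read what the FTP server receives.
ftp_upload_dir = /data/galaxy/ftp
; Hostname shown to users as the FTP upload target.
ftp_upload_site = ftp.example.org
```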
Is there any documentation on the transfer manager?
Is this a mechanism that I could use to synchronize data libraries between
two different Galaxy installations?
Kyle
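One API-driven way to approach the synchronization question above is to list each instance's data library contents over the API and compute what the destination is missing. The comparison step is pure and easy to sketch (function and argument names here are illustrative, not part of any Galaxy API):

```python
def missing_from_destination(src_datasets, dst_datasets):
    """Given dataset names (or paths) listed from each instance's data
    library via the API, return the ones the destination lacks,
    preserving the source's listing order."""
    dst = set(dst_datasets)
    return [d for d in src_datasets if d not in dst]
```

Each name in the returned list would then be queued for transfer, e.g. via an upload initiated on the destination.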
___
Please keep all replies on the list by using "reply all"
in your mail client.