Hi,
I have two local instances of Galaxy running (both entirely independent of each
other and running on separate clusters). I'm trying to export a history from
Instance A to Instance B, but so far have not been able to.
I’ve attempted this in two ways.

1. Download the Galaxy_history.tar.gz file from Instance A (Options > Export to
File) and place it in a path on the filesystem that Instance B can see. Then go to
Options > Import from File and, in the 'Archived History URL:' field, enter
'/full/path/to/Galaxy_history.tar.gz' (I've also tried
'file:///full/path/to/Galaxy_history.tar.gz').
The job runs (it is submitted to the LSF cluster) with no errors in paster.log,
but the stdout file in the job_working_directory gives the error:
Exception getting file from URL: unknown url type:/full/path/to/Galaxy_history.tar.gz <open file '<stderr>', mode 'w' at 0x2ba2d4e9b1e0>
Error unpacking tar/gz archive: nothing to open <open file '<stderr>', mode 'w' at 0x2ba2d4e9b1e0>
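
Looking at that first error, it seems the fetch step insists on a real URL
scheme. As a sanity check (this is just my guess at what
unpack_tar_gz_archive.py does internally, assuming it fetches with Python 2's
urllib2), a bare filesystem path reproduces the same kind of message:

    # Hypothetical reproduction of the 'unknown url type' error,
    # assuming the import job fetches the archive via urllib2.
    import urllib2
    try:
        urllib2.urlopen('/full/path/to/Galaxy_history.tar.gz')
    except ValueError, e:
        print e  # prints the same 'unknown url type' message

So a plain path will presumably never work in that field, though I would have
expected the file:// form to be accepted.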

2. Copy the URL for the history in Instance A (Options > Share or Publish),
then in Instance B go to Options > Import from File and paste the URL provided
by Instance A into the 'Archived History URL:' field.
Again the job runs OK, but the stdout file has the error:
Exception getting file from URL: <urlopen error [Errno 111] Connection refused> <open file '<stderr>', mode 'w' at 0x2b359790a1e0>
Error unpacking tar/gz archive: nothing to open <open file '<stderr>', mode 'w' at 0x2b359790a1e0>
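
To test whether the compute nodes can reach Instance A at all, I think a
generic reachability check run on one of the cluster nodes would show the same
failure (nothing Galaxy-specific here; the address below is a placeholder for
Instance A's actual URL):

    # Hypothetical connectivity check from an LSF compute node;
    # 'instance-a.example.org' stands in for Instance A's real host.
    import urllib2
    try:
        urllib2.urlopen('http://instance-a.example.org/', timeout=10)
        print 'reachable'
    except urllib2.URLError, e:
        print 'not reachable:', e.reason  # e.g. [Errno 111] Connection refused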

I'm pretty sure method 2 is failing because the Import History job is being run
on the cluster, which cannot see the outside world (and hence cannot reach
Instance A). I think I should be able to overcome this by specifying in
job_conf.xml that the import tool be run locally. The problem is that I've not
been able to identify the actual tool or process that runs the Import
Histories method. I know that it produces a bash script which runs
lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py, but as it is not a standard
tool (i.e. one found in the ./tools/ directory with an XML wrapper), I can't
find an ID for it.
Naively, I tried the following in job_conf.xml (I have 'local' defined in
'destinations' and 'plugins'):

<tools default="local">
    <!-- make the import histories tool run locally -->
    <tool id="lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py" destination="local"/>
</tools>

But this made no difference.
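
I also wondered whether the internal tool might instead be registered under a
special id, by analogy with the double-underscore ids I've seen for other
internal Galaxy tools, something like this (the id __IMPORT_HISTORY__ is
unverified guesswork on my part):

<tools default="local">
    <!-- guessed id for the internal history-import tool; unconfirmed -->
    <tool id="__IMPORT_HISTORY__" destination="local"/>
</tools>

but I haven't been able to confirm the correct id from the code.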

Does anyone know how I can get the Import History tool to run locally, or can
anyone suggest another way to import my history?
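
In the meantime, one workaround I may fall back on is to serve the downloaded
archive over HTTP from a machine the compute nodes can reach (the head node,
say) and point the import at that, using Python 2's built-in server:

    # Run in the directory containing Galaxy_history.tar.gz, on a
    # host visible to the cluster nodes (port 8000 is arbitrary):
    python -m SimpleHTTPServer 8000

and then give http://<that-host>:8000/Galaxy_history.tar.gz as the archived
history URL. That feels like a hack, though, so a cleaner solution would be
very welcome.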

Many thanks,
Graham


Dr. Graham Etherington
Bioinformatics Support Officer,
The Sainsbury Laboratory,
Norwich Research Park,
Norwich NR4 7UH.
UK
Tel: +44 (0)1603 450601