Dear all,

I noticed some strange behaviour in our local Galaxy installation. For context: 
my universe_wsgi.ini contains "retry_metadata_internally = False" and 
"cleanup_job = always". The tool in question simply writes its output into the 
job_working_directory, and we move it to the expected location via 
"&& mv static_filename.txt $output" in the <command> tag. This works fine.
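
For reference, the relevant pieces look schematically like this (the actual 
tool invocation is elided; only the two config options and the mv pattern are 
verbatim from our setup):

    # universe_wsgi.ini
    retry_metadata_internally = False
    cleanup_job = always

    <!-- <command> tag of the tool, schematically; note that in the real
         XML file the ampersands have to be escaped as &amp;&amp; -->
    <command>... &amp;&amp; mv static_filename.txt $output</command>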

After a fresh restart of the Galaxy server, the first execution of the tool 
works: everything is OK and there are no errors.
When executing the same tool a second time, Galaxy reports a "tool error" 
stating that it was unable to finish the job. Nevertheless, the output files 
are all correct (but the datasets are marked red, i.e. failed).

The error report states:
Traceback (most recent call last):
  File "/home/galaxy/galaxy-dist/lib/galaxy/jobs/runners/local.py", line 129, in queue_job
    job_wrapper.finish( stdout, stderr, exit_code )
  File "/home/galaxy/galaxy-dist/lib/galaxy/jobs/__init__.py", line 997, in finish
    if ( not self.external_output_metadata.external_metadata_set_successfully( dataset, self.sa_session ) and self.app.config.retry_metadata_internally ):
  File "/home/galaxy/galaxy-dist/lib/galaxy/datatypes/metadata.py", line 731, in external_metadata_set_successfully
    rval, rstring = json.load( open( metadata_files.filename_results_code ) )
IOError: [Errno 2] No such file or directory: u'/home/galaxy/galaxy-dist/database/job_working_directory/000/59/metadata_results_HistoryDatasetAssociation_281_oHFjx0'

And the logfile contains multiple entries like this:
galaxy.datatypes.metadata DEBUG 2014-06-25 14:29:35,466 Failed to cleanup external metadata file (filename_results_code) for HistoryDatasetAssociation_281: [Errno 2] No such file or directory: '/home/galaxy/galaxy-dist/database/job_working_directory/000/59/metadata_results_HistoryDatasetAssociation_281_oHFjx0'

The if-statement in /home/galaxy/galaxy-dist/lib/galaxy/jobs/__init__.py, 
line 997 should evaluate to False, since 
self.app.config.retry_metadata_internally is set to False in 
universe_wsgi.ini, yet it seems it doesn't in this case?
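
What strikes me when staring at that condition: Python evaluates the left 
operand of "and" first, so external_metadata_set_successfully() is called, 
and raises the IOError, before retry_metadata_internally is even consulted; 
the flag therefore cannot prevent the failing call. A minimal sketch (not the 
actual Galaxy code) of that evaluation order:

    # Stand-in for external_metadata_set_successfully(), which raises
    # IOError when the metadata results file is missing.
    def left_side():
        raise IOError("No such file or directory")

    retry_metadata_internally = False  # as in universe_wsgi.ini

    try:
        if not left_side() and retry_metadata_internally:
            pass
    except IOError as e:
        # The exception fires before the flag is ever looked at.
        print("IOError raised first:", e)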
Has anyone experienced such behaviour? Any suggestions on how to proceed and 
solve the issue?
Many thanks!

Jens
