Hi all,

Some of our split BLAST jobs using the parallelism feature have
been failing (apparently for a while, so this is not a recent problem
from changes in the BLAST wrappers) as follows:

The Galaxy framework encountered the following error while attempting
to run the tool:

Traceback (most recent call last):
  File "/mnt/galaxy/galaxy-dist/lib/galaxy/jobs/runners/tasks.py", line 141, in queue_job
    job_wrapper.finish( stdout, stderr, job_exit_code )
  File "/mnt/galaxy/galaxy-dist/lib/galaxy/jobs/__init__.py", line 962, in finish
    if ( not self.external_output_metadata.external_metadata_set_successfully( dataset, self.sa_session ) and self.app.config.retry_metadata_internally ):
  File "/mnt/galaxy/galaxy-dist/lib/galaxy/datatypes/metadata.py", line 638, in external_metadata_set_successfully
    rval, rstring = simplejson.load( open( metadata_files.filename_results_code ) )
  line 328, in load
    use_decimal=use_decimal, **kw)
  line 384, in loads
    return _default_decoder.decode(s)
  line 402, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  line 420, in raw_decode
    raise JSONDecodeError("No JSON object could be decoded", s, idx)
JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)

Tool execution generated the following error message:

Unable to finish job

Reading the code, this is breaking during the external set-metadata
step, after all the child jobs have finished and the data has been
merged.
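For what it's worth, the error itself is easy to reproduce outside
Galaxy: if the external set_metadata process dies before writing its
JSON results file, the file Galaxy reads back is empty, and loading it
raises exactly this error. A minimal sketch (using the stdlib json
module, which behaves like simplejson here; the temp file just stands
in for metadata_files.filename_results_code):

```python
import json  # Galaxy uses simplejson; stdlib json raises the same ValueError here
import tempfile

# An empty file stands in for the results-code file that the external
# set_metadata process should have written but did not.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    results_code = f.name

try:
    with open(results_code) as handle:
        rval, rstring = json.load(handle)
except ValueError as exc:  # JSONDecodeError is a ValueError subclass
    print("reproduced:", exc)
```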

This specific example was BLAST XML output, and the merge had
produced a 4.2 GB file that ended mid-record (evidently the
output merge was not actually successful).

However, as a partial XML file it would be invalid - could this be
causing problems somewhere in the metadata setting code, and so
explain the reported error?
