I am having a lot of difficulty uploading some large gzipped fastqs (~10 GB) to the public server. I have tried both FTP and "pulling" via an HTTP URL. The upload itself succeeds, but I get an error when the server tries to gunzip the file. I have tried more than ten times now and succeeded once. The files are correct and complete, and decompress cleanly with gunzip locally. The error shown is usually this:

format: txt, database: ?
Problem decompressing gzipped data
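
To rule out corruption on my side, I re-verify each archive locally before uploading. For illustration, on a small hypothetical stand-in file (standard gzip; `md5sum` assumes GNU coreutils):

```shell
# Hypothetical small stand-in for one of the real fastq files.
printf '@read1\nACGT\n+\nFFFF\n' > sample.fastq
gzip -c sample.fastq > sample.fastq.gz

# Integrity check: gzip -t exits non-zero if the archive is corrupt.
gzip -t sample.fastq.gz && echo "archive OK"

# Checksum to compare against the transferred copy afterwards.
md5sum sample.fastq.gz
```

The real 10 GB files pass both the `gzip -t` check and a checksum comparison before every attempt.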

On two occasions, however (both FTP uploads), I got the traceback below. Am I missing some obvious trick? I searched the archives and found references to problems with large gzipped files, but no solutions.



Traceback (most recent call last):
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 384, in <module>
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 373, in __main__
    add_file( dataset, registry, json_file, output_path )
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 270, in add_file
    line_count, converted_path = sniff.convert_newlines( dataset.path, in_place=in_place )
  File "/galaxy/home/g2main/galaxy_main/lib/galaxy/datatypes/sniff.py", line 106, in convert_newlines
    shutil.move( temp_name, fname )
  File "/usr/lib/python2.7/shutil.py", line 299, in move
    copy2(src, real_dst)
  File "/usr/lib/python2.7/shutil.py", line 128, in copy2
    copyfile(src, dst)
  File "/usr/lib/python2.7/shutil.py", line 84, in copyfile
    copyfileobj(fsrc, fdst)
  File "/usr/lib/python2.7/shutil.py", line 49, in copyfileobj
    buf = fsrc.read(length)
IOError: [Errno 5] Input/output error
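
For context, the traceback points at Galaxy's newline-conversion step, which rewrites the uploaded file into a temp file and then moves the temp copy over the original. A minimal sketch of that pattern (my own reconstruction, not Galaxy's actual code):

```python
import shutil
import tempfile

def convert_newlines(fname):
    # Sketch of the convert-then-move pattern from the traceback
    # (not Galaxy's actual implementation): rewrite fname with Unix
    # newlines into a temp file, then move it over the original.
    count = 0
    with tempfile.NamedTemporaryFile(mode="w", delete=False) as tmp:
        temp_name = tmp.name
        with open(fname) as src:
            for count, line in enumerate(src, start=1):
                tmp.write(line.rstrip("\r\n") + "\n")
    # When the temp directory and the data directory sit on different
    # filesystems, shutil.move falls back to a byte-for-byte copy; a
    # failed read during that copy is exactly what surfaces as
    # IOError: [Errno 5] Input/output error in the traceback.
    shutil.move(temp_name, fname)
    return count
```

If that is right, the Errno 5 looks like a server-side disk I/O problem during the copy rather than anything about my files, but I would welcome confirmation.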
