On Dec 15, 2011, at 4:05 PM, weihong yan wrote:
> I recently installed the Galaxy program on our Linux server, and it is going to be
> used for analyzing high-throughput sequencing data. I tried to upload a
> dataset (a BED format file) from a local computer to the Galaxy server, but with no
> success. The message "Dataset is uploading" stays forever. The uploaded
> dataset was shown in the /galaxy-dist/database/tmp directory, but it didn't
> get transferred to the job-working directory or files directory before it was
> deleted by Galaxy.
> The output of run_functional_tests shows that the file upload went
> through successfully, and the paster.log file doesn't show any error message
> about the upload. Did I miss any configuration? Your feedback is highly appreciated.
Sorry for the delayed response. Can you verify that the upload was not
interrupted? Was the file in the tmp/ directory the same as the file you
uploaded?
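One quick way to check, assuming you still have both copies on disk, is to compare checksums; the file names below are placeholders for your local BED file and the copy Galaxy wrote under galaxy-dist/database/tmp/:

```shell
# Compare the local original against the copy Galaxy staged in its tmp
# directory. If the MD5 sums differ, the browser upload was truncated
# or corrupted before Galaxy ever processed it.
check_upload() {
  orig_sum=$(md5sum "$1" | cut -d' ' -f1)
  copy_sum=$(md5sum "$2" | cut -d' ' -f1)
  if [ "$orig_sum" = "$copy_sum" ]; then
    echo "MATCH"
  else
    echo "MISMATCH"
  fi
}

# Example (hypothetical file names):
# check_upload mydata.bed /galaxy-dist/database/tmp/upload_file_XXXX
```

If the sums don't match, the problem is the transfer itself rather than Galaxy's job handling.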
Uploading huge files via a browser is not the most efficient way to get data
into Galaxy. You may want to explore one of the other options described in the wiki.
Note that the "Upload via FTP" method was designed for FTP but does not
actually require that you use FTP.
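As a sketch of that approach: the directory path and per-user-email layout below are assumptions based on the ftp_upload_dir option in universe_wsgi.ini, so adjust them to your install. Any tool that can place the file in the right directory works, not just an FTP client:

```shell
# Stage a dataset in Galaxy's FTP upload area without using FTP.
# Galaxy just scans this directory, so cp/scp/rsync work as well as
# an FTP client. Paths here are illustrative assumptions.
stage_for_galaxy() {
  ftp_dir="$1"     # value of ftp_upload_dir from universe_wsgi.ini (assumed)
  user_email="$2"  # email of the Galaxy account that will see the file
  file="$3"        # dataset to stage, e.g. a BED file
  mkdir -p "$ftp_dir/$user_email"
  cp "$file" "$ftp_dir/$user_email/"
}

# Example (hypothetical paths):
# stage_for_galaxy /galaxy-dist/database/ftp user@example.org mydata.bed
```

Once the file is in place, it should appear under the FTP files section of the Upload tool for that user.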
> Thank you!
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at: