Have you tried using data libraries for this? They have an import mechanism
that lets you link directly to files on disk, with no copy or upload step.
I believe the "example_watch_folder.py" sample script (in the distribution)
does just this via the API, if you want an example.
On Mon, Sep
I want to frequently import many tens of thousands of datasets. The files are
on the same server as Galaxy, but the upload-based mechanism is very
slow. It takes hours to load this many files, even though the data itself is not
moving at all! What is the best strategy for a faster bulk import?