Hi Jeremy,
That's really wonderful - thanks so much for taking the time and effort to
do this!
When you say large history, is there a size limit that I should be aware
of, or will it handle anything that my quota can accept?
Thanks,
Dave
On Sat, Oct 20, 2012 at 2:44 PM, Jeremy Goecks wrote:
> When you say large history, is there a size limit that I should be aware
> of, or will it handle anything that my quota can accept?
It will handle anything your quota can accept.
Best,
J.
I've reworked the code to handle large history export files in -central
changeset afc8e9345268, and this should solve your issue. This change should
make it out to our public server this coming week.
Best,
J.
On Oct 18, 2012, at 12:36 PM, Dave Corney wrote:
Hi Jeremy,
Thanks for your offer of help. By the time I got your email, I had already
added many new jobs to the history, which are either running now or waiting
to run. Since I read somewhere that there are problems exporting a history
while it is running, I have shared a clone of the history with you instead.
Hi list,
Is there currently a known problem with the export-to-file function?
I'm trying to migrate some data from the public Galaxy to a private one;
the export function worked well with a small (~100 MB) dataset, but it has
not been working with larger datasets (2 GB) and I get the error:
Hi Dave,
Yes, if your Galaxy instance is on the internet, then for an entire history
transfer you can skip the curl download and just enter the URL from the
public Main Galaxy server into your Galaxy directly.
To load large local data over 2 GB (datasets, not history
archives), you can use
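For reference, the "curl download" route being skipped above could look
roughly like this. This is a minimal sketch: the export link shown is a
placeholder (not a documented Galaxy endpoint), and build_fetch_cmd is a
hypothetical helper that just assembles the download command.

```shell
# Hypothetical sketch of fetching a history export archive with curl
# before importing it into a local Galaxy instance.
build_fetch_cmd() {
  # $1: export link copied from the public server's
  #     "History Options -> Export to File" dialog (placeholder here)
  # $2: local filename for the downloaded archive
  printf 'curl -L -o %s %s' "$2" "$1"
}

# Example: build the command, then run it and import the resulting
# archive via "History Options -> Import from File" in the local Galaxy.
build_fetch_cmd "https://usegalaxy.org/EXPORT_LINK" "history_export.tar.gz"
```

If the local instance can reach the internet, pasting the export URL
straight into it (as suggested above) avoids this intermediate download.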
Dave,
There's likely something problematic about your history that is causing the
error. Can you share the history that's generating it with me? To do so,
open the history options menu, choose Share/Publish, then Share with a User,
and enter my email address.
Thanks,
J.
On Oct 17, 2012, at 6:58 PM,