Re: [galaxy-user] Export to file

2012-10-22 Thread Dave Corney
Hi Jeremy,

That's really wonderful - thanks so much for taking the time and effort to
do this!

When you say large history, is there a size limit that I should be aware
of, or will it handle anything that my quota can accept?

Thanks,
Dave

On Sat, Oct 20, 2012 at 2:44 PM, Jeremy Goecks jeremy.goe...@emory.edu wrote:

 I've reworked the code to handle large history export files in -central
 changeset afc8e9345268, and this should solve your issue. This change
 should make it out to our public server this coming week.

 Best,
 J.


Re: [galaxy-user] Export to file

2012-10-22 Thread Jeremy Goecks
 
 When you say large history, is there a size limit that I should be aware of, 
 or will it handle anything that my quota can accept?

It will handle anything your quota can accept.

Best,
J.
___
The Galaxy User list should be used for the discussion of
Galaxy analysis and other features on the public server
at usegalaxy.org.  Please keep all replies on the list by
using reply all in your mail client.  For discussion of
local Galaxy instances and the Galaxy source code, please
use the Galaxy Development list:

  http://lists.bx.psu.edu/listinfo/galaxy-dev

To manage your subscriptions to this and other Galaxy lists,
please use the interface at:

  http://lists.bx.psu.edu/


Re: [galaxy-user] Export to file

2012-10-20 Thread Jeremy Goecks
I've reworked the code to handle large history export files in -central 
changeset afc8e9345268, and this should solve your issue. This change should 
make it out to our public server this coming week.

Best,
J.


Re: [galaxy-user] Export to file

2012-10-18 Thread Dave Corney
Hi Jeremy,

Thanks for your offer of help. By the time I got your email I had already
added many new jobs to the history that are either running now or waiting
to run. Since I read somewhere that if the history is running then there
are problems exporting I shared a clone of the history with you. The clone
should be identical to the history that I was having problems with
yesterday. I can share with you the original history once the jobs have
finished running (but it might take a while).

Thanks,
Dave


On Wed, Oct 17, 2012 at 10:35 PM, Jeremy Goecks jeremy.goe...@emory.edu wrote:

 Dave,

 There's likely something problematic about your history. Can you share with
 me the history that's generating the error? To do so, go from the history
 options menu -> Share/Publish -> Share with a User, and enter my email
 address.

 Thanks,
 J.



[galaxy-user] Export to file

2012-10-17 Thread Dave Corney
Hi list,

Is there currently a known problem with the export to file function?
I'm trying to migrate some data from the public Galaxy to a private one;
the export function worked well with a small (~100 MB) dataset, but it has
not been working with larger datasets (2 GB) and I get the error: Server
Error. An error occurred. See the error logs for more information. (Turn
debug on to display exception reports here). Is there a limit on the file
size of the export? If so, what is it?

Thanks in advance,
Dave
___
The Galaxy User list should be used for the discussion of
Galaxy analysis and other features on the public server
at usegalaxy.org.  Please keep all replies on the list by
using reply all in your mail client.  For discussion of
local Galaxy instances and the Galaxy source code, please
use the Galaxy Development list:

  http://lists.bx.psu.edu/listinfo/galaxy-dev

To manage your subscriptions to this and other Galaxy lists,
please use the interface at:

  http://lists.bx.psu.edu/

Re: [galaxy-user] Export to file

2012-10-17 Thread Jennifer Jackson

Hi Dave,

Yes, if your Galaxy instance is on the internet, for entire history 
transfer, you can skip the curl download and just enter the URL from the 
public Main Galaxy server into your Galaxy directly.


To load large data over 2 GB that is local (datasets, not history 
archives), you can use the data library option. The idea is to load into 
a library, then move datasets from libraries into histories as needed. 
Help is in our wiki here:

http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Libraries
http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Uploading%20Library%20Files

Take care,

Jen
Galaxy team

On 10/17/12 3:21 PM, Dave Corney wrote:

Hi Jen,

Thanks for your response and suggestion. Just so that it is clear, for
your second method, where I export to file and then use curl, I will
download to my computer as an intermediate stage? Is there a simple way
to take the history and datasets from PSU galaxy to our Princeton galaxy
directly (without downloading to my computer first)? Unfortunately, we
don't have FTP on our own galaxy, which is why I was looking for
alternatives (each file is 2GB, so uploading through the browser won't
work either). It seems that to import from file, the file needs to have
a URL, and I'm not sure how to go about that if the file is stored locally
on my computer.

Thanks,
Dave
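
On the question of giving a locally stored file a URL: one generic workaround (this is plain Python, not a Galaxy feature, and the host, port, and file names below are placeholders) is to serve the file over HTTP from the machine that holds it, so a remote Galaxy's URL-based upload, or curl, can fetch it. A minimal sketch:

```python
# Serve a directory over HTTP so a local file gains a URL such as
# http://<this-machine>:8000/mydata.fastq, which a remote Galaxy's
# URL upload (or curl) can then fetch. Standard library only.
import threading
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve_directory(directory, port=8000):
    """Start a background HTTP server rooted at `directory` and return it."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    server = HTTPServer(("0.0.0.0", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# Example: serve_directory("/data") makes /data/mydata.fastq reachable at
# http://<this-machine>:8000/mydata.fastq. The machine must be reachable
# from the Galaxy server (firewalls permitting); call server.shutdown()
# when the transfer is done.
```

Whether this works in practice depends on the local network: the machine holding the file has to be reachable from the Galaxy server for the duration of the transfer.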



On Wed, Oct 17, 2012 at 6:12 PM, Jennifer Jackson j...@bx.psu.edu wrote:

Hi Dave,

To export larger files, you can use a different method. Open a
terminal window on your computer and type at the prompt ($):

$ curl -o name_the_output 'file_link'

where file_link can be obtained by right-clicking on the disc icon
for the dataset and selecting "Copy link location".

If you are going to import into a local Galaxy, exporting entire
histories, or a history comprised of datasets that you have
copied/grouped together, may be a quick alternative. From the
history panel, use Options (gear icon) -> Export to File to
generate a link, then use curl again to perform the download. The
Import from File function (in the same menu) can be used in your
local Galaxy to incorporate the history and the datasets it contains.

Hopefully this helps, but please let us know if you have more questions,

Jen
Galaxy team


On 10/17/12 2:37 PM, Dave Corney wrote:

Hi list,

Is there currently a known problem with the export to file function?
I'm trying to migrate some data from the public Galaxy to a private one;
the export function worked well with a small (~100 MB) dataset, but it
has not been working with larger datasets (2 GB) and I get the error:
Server Error. An error occurred. See the error logs for more
information. (Turn debug on to display exception reports here).
Is there a limit on the file size of the export? If so, what is it?

Thanks in advance,
Dave


_
The Galaxy User list should be used for the discussion of
Galaxy analysis and other features on the public server
at usegalaxy.org http://usegalaxy.org.  Please keep all
replies on the list by
using reply all in your mail client.  For discussion of
local Galaxy instances and the Galaxy source code, please
use the Galaxy Development list:

http://lists.bx.psu.edu/__listinfo/galaxy-dev
http://lists.bx.psu.edu/listinfo/galaxy-dev

To manage your subscriptions to this and other Galaxy lists,
please use the interface at:

http://lists.bx.psu.edu/






--
Jennifer Jackson
http://galaxyproject.org


Re: [galaxy-user] Export to file

2012-10-17 Thread Jeremy Goecks
Dave,

There's likely something problematic about your history. Can you share with me 
the history that's generating the error? To do so, go from the history options 
menu -> Share/Publish -> Share with a User, and enter my email address.

Thanks,
J.



