Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-29 Thread shamsher jagat
Ok. Perhaps I am not understanding the process - I am not making any
headway in transferring the data from one Galaxy instance to the other. I have
uploaded some files in the Ratsch lab Galaxy instance and have the URL
http://galaxy.tuebingen.mpg.de/workflow/for_direct_import?id=9b305a114b324ccf

Nothing is happening. Is it possible that someone from the Galaxy team can
list the steps for how to transfer files (data) from one Galaxy instance to
another, please, considering me a beginner? Thanks. Sorry for pushing this
question.

Vasu
On Fri, May 25, 2012 at 11:33 AM, Dannon Baker dannonba...@me.com wrote:

  Hi,

 Just wanted to add a few clarifications here.  It definitely *is*
 currently possible to transfer a workflow from one instance to another
 instance that does not have (some or all) of the tools for a particular
 workflow.

 The error you're running into, "No JSON object", means that you likely have
 the wrong link to your workflow.  The one you want is accessible via the
 workflow context menu -> Download or Export -> URL for importing into
 another Galaxy.  Or, you could just download the raw file if you want and
 upload that, as you figured out.  The format of the correct URL should look
 like this; note the "for_direct_import" in the string:

 https://main.g2.bx.psu.edu/workflow/for_direct_import?id=53b7bf0869d3e7ee

 As a correction to what was previously said, I would not recommend
 stripping out tools from an existing workflow prior to export.  When you
 upload the workflow to a new instance, if tools aren't available you will
 see something like the following when you edit the workflow, which
 specifies that the tool is not found:

 [screenshot: workflow editor marking a tool as not found]

 And at this point the unrecognized tools can be installed if it's your
 galaxy server, or if you wish, removed from the workflow via the editor.
  This must be done before the workflow will be usable.

 Lastly, workflows don't contain any data, just the organization and
 parameters of steps for a process.  What it sounds like you're looking for
 (to get your data there as well) is a history export, which is available
 through the menu at the top of your history as Export to File.

 -Dannon


 On May 24, 2012, at 4:06 PM, shamsher jagat wrote:

 Thanks Jen for the update. I tried the following:
 Go to the Ratsch Galaxy instance -> Workflow -> make the workflow accessible via link
 Go to the Galaxy Penn State server
 Workflow -> Import workflow -> paste the Galaxy URL

 The error is:
  The data content does not appear to be a Galaxy workflow.
 Exception: No JSON object could be decoded: line 1 column 0 (char 0)

 I also downloaded the file from the Ratsch server, saved it on my computer, and
 used the Choose file option under Import workflow; it imported the file after a
 while, but when I opened the workflow there was no data, only the steps of the
 workflow were there.

 Do you have any suggestions where I am doing something wrong?

 Thanks




Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-29 Thread Dannon Baker
The data is not a part of the workflow -- what you want is a history export.

From the history you want to export, in the history menu, select "Export to
File".  This step may take a while depending on the size of your history.
Navigate to the link to make sure it's done packaging the history.  If it
isn't, you'll see a message like "Still exporting history; please check back
soon".  If it is, a download will automatically start.  You can save this, or
just cancel the download (if it's large).

Once the data is ready (the link works), on the destination Galaxy instance, 
select Import from File in the history menu.  Put in your link.

This should migrate the entire history.
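
If you prefer to grab the archive at the command line first, a minimal sketch (assuming curl is installed; the URL below is only a placeholder for the export link Galaxy gives you, not a real address):

% curl -L -o my_history_export.tar.gz 'http://galaxy.example.org/LINK_FROM_EXPORT_TO_FILE'
% tar -tzf my_history_export.tar.gz | head   # optional: list the archive contents as a sanity check

On the destination instance you would still paste the original export link into "Import from File" rather than uploading the tarball by hand.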




Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-25 Thread Dannon Baker
Hi,

Just wanted to add a few clarifications here. It definitely *is* currently possible to transfer a workflow from one instance to another instance that does not have (some or all) of the tools for a particular workflow.

The error you're running into, "No JSON object", means that you likely have the wrong link to your workflow. The one you want is accessible via the workflow context menu -> Download or Export -> URL for importing into another Galaxy. Or, you could just download the raw file if you want and upload that, as you figured out. The format of the correct URL should look like this; note the "for_direct_import" in the string:

https://main.g2.bx.psu.edu/workflow/for_direct_import?id=53b7bf0869d3e7ee

As a correction to what was previously said, I would not recommend stripping out tools from an existing workflow prior to export. When you upload the workflow to a new instance, if tools aren't available you will see something like the following when you edit the workflow, which specifies that the tool is not found:

[screenshot: workflow editor marking a tool as not found]

And at this point the unrecognized tools can be installed if it's your Galaxy server, or, if you wish, removed from the workflow via the editor. This must be done before the workflow will be usable.

Lastly, workflows don't contain any data, just the organization and parameters of steps for a process. What it sounds like you're looking for (to get your data there as well) is a history export, which is available through the menu at the top of your history as "Export to File".

-Dannon

On May 24, 2012, at 4:06 PM, shamsher jagat wrote:

Thanks Jen for the update. I tried the following:
Go to the Ratsch Galaxy instance -> Workflow -> make the workflow accessible via link
Go to the Galaxy Penn State server
Workflow -> Import workflow -> paste the Galaxy URL

The error is:
The data content does not appear to be a Galaxy workflow.
Exception: No JSON object could be decoded: line 1 column 0 (char 0)

I also downloaded the file from the Ratsch server, saved it on my computer, and used the Choose file option under Import workflow; it imported the file after a while, but when I opened the workflow there was no data, only the steps of the workflow were there.

Do you have any suggestions where I am doing something wrong?

Thanks
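
For anyone hitting the same "No JSON object" error, a quick way to sanity-check the link before importing is sketched below (this is not a Galaxy feature; it assumes curl and Python are installed, and the URL is a placeholder):

% curl -sL 'http://galaxy.example.org/workflow/for_direct_import?id=YOUR_WORKFLOW_ID' | python -m json.tool | head

If the link is right you should see the workflow's JSON: a name, steps, and tool ids. If you get HTML or an error instead, you have copied the wrong link, and the import will fail with the "No JSON object" message above.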

Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-24 Thread Jennifer Jackson

Hi again,

I have an important update - I was incorrect about workflow transfer 
between Galaxy instances and where to edit. In summary, it is important 
to do the workflow editing up-front before the export, instead of after 
import. I'll explain why.


While it is now possible to import workflows from other Galaxy instances 
(any), once actually in that new instance, if that workflow contains 
tools that are not in the new target instance, there is nothing that can 
be done with it (editing or running will result in a server error). This 
is a known open issue with the workflow developers.


So, when you plan to import into the public main Galaxy instance, you 
will need to modify any workflow in the source instance before exporting 
it to make certain that it only contains tools that are present in the 
public main Galaxy instance. If a server error results from an imported 
workflow, an unavailable tool present in the workflow is the most likely 
root cause of the problem.


If you plan to import into a local or cloud instance, then you have the 
choice of either modifying the workflow and/or adding all of the 
workflow tools to your local/cloud instance, and then importing the 
workflow. Do not import the workflow first, as this will result in an 
error; if that happens, delete the workflow and import it again after the 
required tools are added.


Good question and I apologize for getting this detail incorrect in the 
original reply.


Best,

Jen
Galaxy team

On 5/23/12 10:06 PM, Jennifer Jackson wrote:

Hello,

On 5/23/12 7:30 PM, shamsher jagat wrote:
I have uploaded files in the Cistrome / Gunnar Ratsch lab Galaxy instances, 
which allow users to use their tools. I want to either share a workflow 
from these instances or at least transfer FASTQ files to the Penn State 
open source Galaxy server. Is it possible or not?

Incorrect start --
It would not be possible to share a workflow between two Galaxy 
instances using the built-in Share or Publish methods; however, you 
should be able to download a workflow from one Galaxy instance and 
then Upload or import workflow into the public main Galaxy instance 
(this button is on the Workflow home page). Please note that any 
tools included in your workflow that were available in the other 
instance, but that are not available on the public main Galaxy 
instance, would not be functional, so the workflow will likely need to 
be edited before use.

 -- Incorrect end, see above


Moving FASTQ datasets from other instances should also be 
straightforward - download, then upload these to the public main 
Galaxy instance using FTP. If you are having trouble downloading 
workflows or datasets, then the host Galaxy instance should be contacted.


I have another question in this regard: when I tried to upload the 
FASTQ files via web link or FTP, the job is never completed. I have 
tried it a couple of times. This problem has been there for the last 
couple of months. Are there some changes which have been implemented 
recently which are not allowing me to upload the files? Indeed, I have 
seen over the last month or so too many messages suggesting either a 
Tophat job is stuck, a job is not completed, or someone is unable to 
upload a file. I am not sure if all these problems are related 
(storage) or not. Can someone from the Galaxy team advise.


At this time, there are no known issues with upload functions.  If you 
are having problems with URL data transfer from another Galaxy 
instance (or really, any 3rd party source), you can try to download 
the data locally to your desktop in a terminal prompt (as a test) with 
the command:


% curl -O 
'copied_link_location_from_dataset_history_or_workflow_or_other_URL'


If this is successful, then a load into Galaxy should be successful. 
If this is unsuccessful, first contact the source to check if the data 
is publicly accessible, as they may have advice about requirements 
concerning user/passwords in the URL. Once that is worked out and a 
URL load still fails, please submit a bug report from the problematic 
load (leaving the error dataset undeleted in your history) and we will 
examine.


The main public instance occasionally has brief server/cluster issues 
that impact jobs, but these are a very small portion of the overall 
volume of jobs processed, and a re-run is the solution. We do our best 
to respond quickly as soon as a problem is detected.


Reported delays due to high usage volume are a different matter. The 
public main Galaxy instance has substantial dedicated resources, but 
there are times when we still get very busy. If your analysis needs 
are urgent or your jobs are numerous, then a local or cloud instance 
is the recommended alternative: http://getgalaxy.org.


To be clear, you had FTP upload processes that were started but never 
completed? If so, the connection may have been interrupted. When this 
occurs, an FTP client will allow you to restart an upload so that it 
will begin again where it left off. To do this, re-establish the 
connection first, then resume the transfer.

Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-24 Thread shamsher jagat
Thanks Jen for the update. I tried the following:
Go to the Ratsch Galaxy instance -> Workflow -> make the workflow accessible via link
Go to the Galaxy Penn State server
Workflow -> Import workflow -> paste the Galaxy URL

The error is:
 The data content does not appear to be a Galaxy workflow.
Exception: No JSON object could be decoded: line 1 column 0 (char 0)

I also downloaded the file from the Ratsch server, saved it on my computer, and
used the Choose file option under Import workflow; it imported the file after a
while, but when I opened the workflow there was no data, only the steps of the
workflow were there.

Do you have any suggestions where I am doing something wrong?

Thanks




[galaxy-user] How to transfer files between two galaxy instances

2012-05-23 Thread shamsher jagat
I have uploaded files in the Cistrome / Gunnar Ratsch lab Galaxy instances, which
allow users to use their tools. I want to either share a workflow from these
instances or at least transfer FASTQ files to the Penn State open source
Galaxy server. Is it possible or not?

I have another question in this regard: when I tried to upload the FASTQ
files via web link or FTP, the job is never completed. I have tried it a
couple of times. This problem has been there for the last couple of months. Are
there some changes which have been implemented recently which are not
allowing me to upload the files? Indeed, I have seen over the last month or so
too many messages suggesting either a Tophat job is stuck, a job is not
completed, or someone is unable to upload a file. I am not sure if all these
problems are related (storage) or not. Can someone from the Galaxy team advise.

Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-23 Thread Jennifer Jackson

Hello,

On 5/23/12 7:30 PM, shamsher jagat wrote:
I have uploaded files in the Cistrome / Gunnar Ratsch lab Galaxy instances, 
which allow users to use their tools. I want to either share a workflow 
from these instances or at least transfer FASTQ files to the Penn State 
open source Galaxy server. Is it possible or not?
It would not be possible to share a workflow between two Galaxy instances 
using the built-in Share or Publish methods; however, you should be 
able to download a workflow from one Galaxy instance and then Upload or 
import workflow into the public main Galaxy instance (this button is on 
the Workflow home page). Please note that any tools included in your 
workflow that were available in the other instance, but that are not 
available on the public main Galaxy instance, would not be functional, 
so the workflow will likely need to be edited before use.


Moving FASTQ datasets from other instances should also be 
straightforward - download, then upload these to the public main Galaxy 
instance using FTP. If you are having trouble downloading workflows or 
datasets, then the host Galaxy instance should be contacted.


I have another question in this regard: when I tried to upload the 
FASTQ files via web link or FTP, the job is never completed. I have 
tried it a couple of times. This problem has been there for the last 
couple of months. Are there some changes which have been implemented 
recently which are not allowing me to upload the files? Indeed, I have 
seen over the last month or so too many messages suggesting either a 
Tophat job is stuck, a job is not completed, or someone is unable to 
upload a file. I am not sure if all these problems are related 
(storage) or not. Can someone from the Galaxy team advise.


At this time, there are no known issues with upload functions.  If you 
are having problems with URL data transfer from another Galaxy instance 
(or really, any 3rd party source), you can try to download the data 
locally to your desktop in a terminal prompt (as a test) with the command:


% curl -O 
'copied_link_location_from_dataset_history_or_workflow_or_other_URL'


If this is successful, then a load into Galaxy should be successful. If 
this is unsuccessful, first contact the source to check if the data is 
publicly accessible, as they may have advice about requirements 
concerning user/passwords in the URL. Once that is worked out and a URL 
load still fails, please submit a bug report from the problematic load 
(leaving the error dataset undeleted in your history) and we will examine.
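
A few quick checks on whatever curl fetched can also save time (a sketch only; the file names are placeholders):

% ls -lh downloaded_reads.fastq.gz
% gzip -t downloaded_reads.fastq.gz && echo archive OK
% head downloaded_reads.fastq   # uncompressed FASTQ records should start with '@'

A tiny file, or one that begins with HTML tags, usually means the link returned a login or error page rather than the data itself.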


The main public instance occasionally has brief server/cluster issues 
that impact jobs, but these are a very small portion of the overall 
volume of jobs processed, and a re-run is the solution. We do our best 
to respond quickly as soon as a problem is detected.


Reported delays due to high usage volume are a different matter. The 
public main Galaxy instance has substantial dedicated resources, but 
there are times when we still get very busy. If your analysis needs are 
urgent or your jobs are numerous, then a local or cloud instance is the 
recommended alternative: http://getgalaxy.org.


To be clear, you had FTP upload processes that were started but never 
completed? If so, the connection may have been interrupted. When this 
occurs, an FTP client will allow you to restart an upload so that it 
will begin again where it left off. To do this, re-establish the 
connection first, then resume the transfer.  
http://wiki.g2.bx.psu.edu/FTPUpload
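
For example, curl can resume a partial FTP upload (a sketch only; substitute your own registered e-mail address and file name, and check the wiki page above for the current FTP host, which is assumed here to be main.g2.bx.psu.edu):

% curl -C - -T my_reads.fastq.gz --user 'you@example.org' ftp://main.g2.bx.psu.edu/

The "-C -" option tells curl to ask the server how much of the file has already arrived and to continue from that offset instead of starting over.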


Hopefully this addresses your concerns,

Jen
Galaxy team





--
Jennifer Jackson
http://galaxyproject.org

___
The Galaxy User list should be used for the discussion of
Galaxy analysis and other features on the public server
at usegalaxy.org.  Please keep all replies on the list by
using reply all in your mail client.  For discussion of
local Galaxy instances and the Galaxy source code, please
use the Galaxy Development list:

  http://lists.bx.psu.edu/listinfo/galaxy-dev

To manage your subscriptions to this and other Galaxy lists,
please use the interface at:

  http://lists.bx.psu.edu/