[galaxy-user] How to control registration

2012-05-24 Thread Zeeshan Ali Shah
Hi, is there any way to moderate registration on a Galaxy portal? We are
setting up a cluster for internal users, but it seems that by default
registration is open to all.

Can we disable registration entirely so that a moderator creates users
separately, or otherwise moderate the registration process?


Zeeshan
___
The Galaxy User list should be used for the discussion of
Galaxy analysis and other features on the public server
at usegalaxy.org.  Please keep all replies on the list by
using "reply all" in your mail client.  For discussion of
local Galaxy instances and the Galaxy source code, please
use the Galaxy Development list:

  http://lists.bx.psu.edu/listinfo/galaxy-dev

To manage your subscriptions to this and other Galaxy lists,
please use the interface at:

  http://lists.bx.psu.edu/

Re: [galaxy-user] How to control registration

2012-05-24 Thread Roman Valls Guimera
Zeeshan, as far as I know, the only settings you can change in this regard
are in universe_wsgi.ini, through the directives:


# Allow unregistered users to create new accounts (otherwise, they will have to
# be created by an admin).
allow_user_creation = True

# Email administrators when a new user account is created
# You also need to have smtp_server set for this to work.
new_user_email_admin = False
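
Depending on your Galaxy release, universe_wsgi.ini may also support a
require_login directive that disables anonymous access; together with
allow_user_creation = False this should limit the server to admin-created
accounts. I have not checked every version, so verify against the sample
config that ships with your instance:

```ini
# Disable self-registration; accounts must then be created by an admin.
allow_user_creation = False

# Force everyone to log in (no anonymous sessions).
require_login = True
```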

I have not seen a moderation system in place, but perhaps you could use PDC's
Plone forms for that moderation?

Hope that helps!
Roman

On 24 May 2012 at 10:18, Zeeshan Ali Shah wrote:





Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-24 Thread Jennifer Jackson

Hi again,

I have an important update: I was incorrect about workflow transfer
between Galaxy instances and where to edit. In summary, it is important
to do the workflow editing up front, before export, instead of after
import. I'll explain why.


While it is now possible to import workflows from any other Galaxy
instance, once the workflow is actually in the new instance, if it contains
tools that are not installed there, nothing can be done with it (editing
or running it will result in a server error). This is a known open issue
with the workflow developers.


So, when you plan to import into the public main Galaxy instance, you
will need to modify the workflow in the source instance before exporting
it, to make certain that it contains only tools that are present in the
public main Galaxy instance. If a server error results from an imported
workflow, an unavailable tool in the workflow is the most likely root
cause.


If you plan to import into a local or cloud instance, then you can either
modify the workflow, add all of the workflow's tools to your local/cloud
instance, or both, and then import the workflow. Do not import the
workflow first, as this will result in an error; if that happens, delete
the workflow and import it again after the required tools are added.
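
Since an exported workflow is just a JSON document (a .ga file), one
practical up-front check is to list the tool IDs it references and compare
them against the target instance before importing. This is a minimal
sketch, assuming the standard export layout of the era (a top-level
"steps" mapping whose entries carry a "tool_id" field); the helper name
is mine:

```python
import json

def workflow_tool_ids(path):
    """Return the sorted tool IDs referenced by an exported Galaxy workflow."""
    with open(path) as fh:
        wf = json.load(fh)
    tool_ids = set()
    for step in wf.get("steps", {}).values():
        tool_id = step.get("tool_id")
        if tool_id is not None:  # data-input steps carry no tool_id
            tool_ids.add(tool_id)
    return sorted(tool_ids)
```

Any ID in the result that is missing from the target instance is a tool
you would need to edit out (or install locally) before importing.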


Good question and I apologize for getting this detail incorrect in the 
original reply.


Best,

Jen
Galaxy team

On 5/23/12 10:06 PM, Jennifer Jackson wrote:

Hello,

On 5/23/12 7:30 PM, shamsher jagat wrote:
I have uploaded files to the Cistrome and Gunnar Rätsch lab Galaxy
instances, which allow users to use their tools. I want to either share
workflows from these instances or at least transfer FASTQ files to the
Penn State open-source Galaxy server. Is that possible or not?

Incorrect start -->>
Incorrect start -->>
It would not be possible to share a workflow between two Galaxy
instances using the built-in "Share or Publish" methods; however, you
should be able to download a workflow from one Galaxy instance and
then "Upload or import workflow" into the public main Galaxy instance
(this button is on the "Workflow" home page). Please note that any
tools included in your workflow that were available in the other
instance, but that are not available on the public main Galaxy
instance, would not be functional, so the workflow will likely need to
be edited before use.

<< -- Incorrect end, see above


Moving FASTQ datasets from other instances should also be
straightforward: download them, then upload them to the public main
Galaxy instance using FTP. If you are having trouble downloading
workflows or datasets, contact the host Galaxy instance.


I have another question in this regard: when I tried to upload the
FASTQ files via web link or FTP, the job never completed. I have tried
it a couple of times, and the problem has existed for the last couple of
months. Are there some changes implemented recently that are preventing
me from uploading the files? Indeed, over the last month or so I have
seen too many messages suggesting either a Tophat job is stuck, a job is
not completed, or a file cannot be uploaded. I am not sure if all these
problems are related (storage) or not. Can someone from the Galaxy team
advise?


At this time, there are no known issues with the upload functions. If
you are having problems with URL data transfer from another Galaxy
instance (or really, any third-party source), you can try to download
the data locally to your desktop at a terminal prompt (as a test) with
the command:


% curl -O 
'copied_link_location_from_dataset_history_or_workflow_or_other_URL'


If this is successful, then a load into Galaxy should be successful.
If it is unsuccessful, first contact the source to check whether the
data is publicly accessible, as they may have advice about requirements
concerning usernames/passwords in the URL. If that is worked out and a
URL load still fails, please submit a bug report from the problematic
load (leaving the error dataset undeleted in your history) and we will
examine it.


The main public instance occasionally has brief server/cluster issues
that impact jobs, but these affect a very small portion of the overall
volume of jobs processed, and a re-run is the solution. We do our best
to respond quickly as soon as a problem is detected.


Reported delays due to high usage volume are a different matter. The
public main Galaxy instance has substantial dedicated resources, but
there are times when we still get very busy. If your analysis needs
are urgent or your jobs are numerous, then a local or cloud instance
is the recommended alternative: http://getgalaxy.org.


To be clear, you had FTP upload processes that were started but never
completed? If so, the connection may have been interrupted. When this
occurs, an FTP client will allow you to restart an upload so that it
will begin again where it left off. To do this, re-establish the
connection

Re: [galaxy-user] cloud instance of galaxy

2012-05-24 Thread Dave Clements
Hi Tom,

The cloud image has the reference data for hg19 and mm9.  However, it does
not come with fully configured loc files for every tool that takes
advantage of all that preloaded data.  Try updating the loc file for the
Extract Genomic DNA tool.
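
For context, Galaxy .loc files are plain tab-separated tables that map a
database build key to data on disk. The exact file and columns used by
Extract Genomic DNA depend on your Galaxy version (alignseq.loc in
releases of this era, if I recall correctly), so treat the following as an
illustrative sketch only, with hypothetical paths; fields must be
separated by real tab characters:

```
#type	build	path_to_sequence_data
seq	hg19	/mnt/galaxyIndices/genomes/hg19/seq
seq	mm9	/mnt/galaxyIndices/genomes/mm9/seq
```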

Dave C.

On Wed, May 23, 2012 at 11:07 AM, Randall, Thomas (NIH/NIEHS) [C] <
thomas.rand...@nih.gov> wrote:

> I have put up an instance of galaxy on the cloud and as a simple test am
> trying to retrieve hg19 sequences corresponding to a bed file using the
> Extract Genomic DNA tool. hg19 is supposed to be a “loaded, system
> installed build” yet I get an error message stating that “No sequences are
> available for hg19”. A similar thing occurred with an mm9 request. What is
> being done wrong?
>
> Tom
>
>
> Thomas Randall, PhD
>
> Bioinformatics Scientist, Contractor
>
> Integrative Bioinformatics
>
> National Institute of Environmental Health Sciences
>
> P.O. Box 12233, Research Triangle Park, NC 27709
>
> randall...@niehs.nih.gov
>
> 919-541-2271
>
>
>



-- 
http://galaxyproject.org/GCC2012 
http://galaxyproject.org/
http://getgalaxy.org/
http://usegalaxy.org/
http://galaxyproject.org/wiki/

Re: [galaxy-user] cloud instance of galaxy

2012-05-24 Thread Dave Clements
Hi Tom,

An update from the CloudMan developers:  This will be fixed in the next
release, which is imminent.

Thanks,

Dave C.

On Thu, May 24, 2012 at 11:04 AM, Dave Clements wrote:



Re: [galaxy-user] How to transfer files between two galaxy instances

2012-05-24 Thread shamsher jagat
Thanks, Jen, for the update. I tried the following:
Go to the Rätsch Galaxy instance > Workflow > make workflow accessible via link.
Go to the Galaxy Penn State server:
Workflow > import workflow from URL > Galaxy URL

The error is:
 The data content does not appear to be a Galaxy workflow.
Exception: No JSON object could be decoded: line 1 column 0 (char 0)

I also downloaded the file from the Rätsch server, saved it on my computer,
and used the "Choose file" option under import workflow. It imported the
file after a while, but when I opened the workflow there was no data; only
the steps of the workflow were there.

Do you have any suggestions about where I am going wrong?

Thanks



On Thu, May 24, 2012 at 8:13 AM, Jennifer Jackson  wrote:


Re: [galaxy-user] Runtime on public Galaxy and current version of Galaxy on the Cloud?

2012-05-24 Thread Diana Cox-Foster
Hi, I need some guidance on how to upload data files to an instance of Galaxy
on the Cloud. I have successfully established the instance, but am unable to
FTP data sets into it. I have tried using my logon password for the instance
to gain access, without success.

Is there a direct way to move data files from the public Galaxy over to the
instance without downloading the files onto my computer?

Thanks for any help. Diana

**
Diana Cox-Foster, Professor
office: 536 ASI Bldg

MAIL:
501 ASI Bldg
Department of Entomology
Penn State University
University Park, PA, USA 16802

email: dx...@psu.edu
office phone: 814-865-1022
dept. phone: 814-865-1895



On May 18, 2012, at 5:54 PM, Enis Afgan wrote:

> Hi Diana, 
> What type of instance did you use? You should use at least the Large instance 
> type. And did you make sure to format the user data as described in the wiki 
> (http://wiki.g2.bx.psu.edu/CloudMan, under Step 2)?
> 
> An alternative approach (which, granted, has not been publicized much yet) is 
> to use http://bioCloudCentral.org and start your instance via that portal. I 
> just tested it, and an instance with Galaxy configured and running was up in 
> about 5 minutes. When filling out the form, you may want to choose the Galaxy 
> AMI (vs. the default CloudBioLinux one, which will also work) by clicking on 
> Advanced Options and choosing Galaxy Cloud AMI from the drop-down menu.
> 
> Let us know if you still have trouble,
> Enis
> 
> On Sat, May 19, 2012 at 6:34 AM, Diana Cox-Foster  wrote:
> Hi, I am finding that the run time on the public Galaxy is exceptionally 
> long, with a job submitted yesterday still waiting to start. 
> 
> Given that I need analyses performed more quickly, I have tried to use 
> Galaxy on the Cloud. Is the following still the current version of CloudMan 
> to use?
> 
> Current AMI:
> 
> AMI: ami-da58aab3
> Name: 861460482541/galaxy-cloudman-2011-03-22
> I have set up instances for this version and have waited for over an hour for 
> them to load, with multiple attempts to get the websites to open. All status 
> indicators say that the instance is running.
> 
> Any insight welcomed--- thanks--- Diana
> 
