[ https://issues.apache.org/jira/browse/FALCON-1728?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15088724#comment-15088724 ]

pavan kumar kolamuri commented on FALCON-1728:
----------------------------------------------

Yes [~bvellanki], you are right. Falcon should not allow the scenario you 
described in the last comment. When a cluster is defined as a target in a feed, 
the feed is meant for replication, so a process should not run on the target 
cluster with that feed as an output feed; using it as an input feed is fine. 
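
As a minimal sketch of the distinction (cluster and path names here are illustrative, not from the issue), a feed entity marks each cluster as source or target; the target cluster only receives replicated instances:

```xml
<feed name="sampleFeed" xmlns="uri:falcon:feed:0.1">
  <frequency>hours(1)</frequency>
  <clusters>
    <!-- source cluster: feed instances are produced here,
         so a process may write this feed on this cluster -->
    <cluster name="primaryCluster" type="source">
      <validity start="2016-01-01T00:00Z" end="2017-01-01T00:00Z"/>
      <retention limit="days(7)" action="delete"/>
    </cluster>
    <!-- target cluster: instances are only replicated here,
         so a process should not use this feed as an output on this cluster -->
    <cluster name="backupCluster" type="target">
      <validity start="2016-01-01T00:00Z" end="2017-01-01T00:00Z"/>
      <retention limit="days(7)" action="delete"/>
    </cluster>
  </clusters>
  <!-- locations, ACL, and schema elements omitted -->
</feed>
```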

> Process entity definition allows multiple clusters when it has output Feed 
> defined. 
> ------------------------------------------------------------------------------------
>
>                 Key: FALCON-1728
>                 URL: https://issues.apache.org/jira/browse/FALCON-1728
>             Project: Falcon
>          Issue Type: Bug
>          Components: process
>    Affects Versions: 0.9
>            Reporter: Balu Vellanki
>            Assignee: Balu Vellanki
>            Priority: Critical
>
> The process XSD allows the user to specify multiple clusters per process 
> entity. I am guessing this would allow a user to run duplicate instances of 
> the process on multiple clusters at the same time (I do not really see a need 
> for this). When the process has an output feed defined, you can have 
> duplicate process instances writing to the same feed instance, causing data 
> corruption/failures. The solution is to either
> 1. Not allow multiple clusters per process. Let the user define a duplicate 
> process if the user wants to run duplicate instances.
> OR
> 2. Allow multiple clusters, but only when there is no output feed defined.
> [~sriksun] please let me know if there is any other reason for allowing 
> multiple clusters in a process. 
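
The quoted scenario can be sketched as a process entity the current XSD accepts (entity names and paths are illustrative): two clusters are listed, and both would schedule instances that write the same output feed instance:

```xml
<process name="sampleProcess" xmlns="uri:falcon:process:0.1">
  <clusters>
    <!-- the XSD permits more than one cluster here; an instance
         of the process is scheduled on each listed cluster -->
    <cluster name="primaryCluster">
      <validity start="2016-01-01T00:00Z" end="2017-01-01T00:00Z"/>
    </cluster>
    <cluster name="backupCluster">
      <validity start="2016-01-01T00:00Z" end="2017-01-01T00:00Z"/>
    </cluster>
  </clusters>
  <parallel>1</parallel>
  <order>FIFO</order>
  <frequency>hours(1)</frequency>
  <outputs>
    <!-- both clusters' instances resolve to the same feed instance,
         so they race to write it: the corruption described above -->
    <output name="output" feed="sampleFeed" instance="today(0,0)"/>
  </outputs>
  <workflow engine="oozie" path="/apps/sample/workflow.xml"/>
</process>
```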



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
