[
https://issues.apache.org/jira/browse/FALCON-1497?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14938794#comment-14938794
]
Rishav Rohit commented on FALCON-1497:
--------------------------------------
[~peeyushb] Please follow the steps below to replicate this issue:
1. Submit and schedule a Falcon process like
https://github.com/rishav-rohit/falcon_samples/blob/master/sample2-hive/multi-table-process.xml
(the hive script is already uploaded to HDFS).
2. After some time, delete the old Hive script from HDFS, then modify and upload a new
Hive script to HDFS (I used this script
https://github.com/rishav-rohit/falcon_samples/blob/master/sample2-hive/cnt_script.hql).
3. Now update the process using the command below (there is no change in the process
definition; only the script file is modified):
falcon entity -type process -name multi-table-process -update -file
multi-table-process.xml
Hope I am updating the process correctly.
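For reference, the three steps above can be sketched as a single shell session. This is a hedged sketch, not output from the reporter: the HDFS path `/apps/falcon/scripts/` is a placeholder (use whichever location the process's workflow actually references), and the commands assume a working Falcon and Hadoop client configured against the cluster.

```shell
# 1. Submit and schedule the process (the Hive script is already on HDFS).
falcon entity -type process -submitAndSchedule -file multi-table-process.xml

# 2. Replace the Hive script on HDFS with a modified version.
#    NOTE: /apps/falcon/scripts/ is a hypothetical path for illustration.
hadoop fs -rm /apps/falcon/scripts/cnt_script.hql
hadoop fs -put cnt_script.hql /apps/falcon/scripts/

# 3. Update the process; the definition is unchanged, only the script differs.
falcon entity -type process -name multi-table-process -update -file multi-table-process.xml
```

If the bug reproduces, the update command in step 3 reports success, but the Oozie workflow generated by Falcon keeps invoking the old script.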
> Hive script / oozie workflow is not updated on updating a process definition
> ----------------------------------------------------------------------------
>
> Key: FALCON-1497
> URL: https://issues.apache.org/jira/browse/FALCON-1497
> Project: Falcon
> Issue Type: Bug
> Components: process
> Affects Versions: 0.6
> Environment: HDP-2.2.0
> Reporter: Rishav Rohit
>
> Suppose a process is defined which invokes an Oozie workflow or a Hive script.
> When I update the Oozie workflow or Hive script and then update the process
> definition in Falcon, the Falcon update command executes successfully, but the
> Oozie workflow / Hive script is not updated.
> I have not checked whether the same is the case with Pig scripts.
> I believe this is a bug because when I update a workflow/script and then
> update the process, Falcon should pick up the new workflow/script.
> Thanks
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)