Repository: falcon
Updated Branches:
  refs/heads/master baab41425 -> 855852ecf
FALCON-1589 Package sample recipe properties file. Contributed by Peeyush Bishnoi.


Project: http://git-wip-us.apache.org/repos/asf/falcon/repo
Commit: http://git-wip-us.apache.org/repos/asf/falcon/commit/855852ec
Tree: http://git-wip-us.apache.org/repos/asf/falcon/tree/855852ec
Diff: http://git-wip-us.apache.org/repos/asf/falcon/diff/855852ec

Branch: refs/heads/master
Commit: 855852ecfc86b63d317c2615a27845eaac1c0ec1
Parents: baab414
Author: Ajay Yadava <[email protected]>
Authored: Thu Nov 26 18:03:15 2015 +0530
Committer: Ajay Yadava <[email protected]>
Committed: Thu Nov 26 18:03:15 2015 +0530

----------------------------------------------------------------------
 CHANGES.txt                                 |  2 ++
 docs/src/site/twiki/HDFSDR.twiki            | 21 ++++++++++++++++----
 docs/src/site/twiki/HiveDR.twiki            | 24 ++++++++++++++++-------
 src/main/assemblies/distributed-package.xml | 24 ++++++++++-------------
 src/main/assemblies/standalone-package.xml  | 25 ++++++++++--------------
 5 files changed, 56 insertions(+), 40 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/falcon/blob/855852ec/CHANGES.txt
----------------------------------------------------------------------
diff --git a/CHANGES.txt b/CHANGES.txt
index a2ca7f9..b4bdfa4 100755
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -53,6 +53,8 @@ Trunk (Unreleased)
   OPTIMIZATIONS

   BUG FIXES
+    FALCON-1589 Package sample recipe properties file(Peeyush Bishnoi via Ajay Yadava)
+
     FALCON-1597 Falcon should not retry in case of an instance being manual kill from user (Sandeep Samudrala via Pallavi Rao)

     FALCON-1606 Process schedule fails in some cases in case of NativeScheduler(Pavan Kumar Kolamuri via Ajay Yadava)


http://git-wip-us.apache.org/repos/asf/falcon/blob/855852ec/docs/src/site/twiki/HDFSDR.twiki
----------------------------------------------------------------------
diff --git a/docs/src/site/twiki/HDFSDR.twiki b/docs/src/site/twiki/HDFSDR.twiki
index 4a42c4c..1c1e3f5 100644
--- a/docs/src/site/twiki/HDFSDR.twiki
+++ b/docs/src/site/twiki/HDFSDR.twiki
@@ -9,13 +9,26 @@ Falcon supports HDFS DR recipe to replicate data from source cluster to destinat
 </verbatim>

 ---+++ Update recipes properties
-   Update recipe properties file in addons/recipes/hdfs-replication with required attributes for replicating
-   data from source cluster to destination cluster.
+   Copy HDFS replication recipe properties, workflow and template file from $FALCON_HOME/data-mirroring/hdfs-replication to the accessible
+   directory path or to the recipe directory path (*falcon.recipe.path=<recipe directory path>*). *"falcon.recipe.path"* must be specified
+   in Falcon conf client.properties. Now update the copied recipe properties file with required attributes to replicate data from source cluster to
+   destination cluster for HDFS DR.

 ---+++ Submit HDFS DR recipe
+
+   After updating the recipe properties file with required attributes in directory path or in falcon.recipe.path,
+   there are two ways of submitting the HDFS DR recipe:
+
+   * 1. Specify Falcon recipe properties file through recipe command line.
    <verbatim>
     $FALCON_HOME/bin/falcon recipe -name hdfs-replication -operation HDFS_REPLICATION
+    -properties /cluster/hdfs-replication.properties
    </verbatim>

-Recipe templates for HDFS DR is available in addons/recipes/hdfs-replication and copy it to
-recipe path (*falcon.recipe.path=<recipe directory path>*) by specifying in client.properties.
+   * 2. Use Falcon recipe path specified in Falcon conf client.properties .
+   <verbatim>
+    $FALCON_HOME/bin/falcon recipe -name hdfs-replication -operation HDFS_REPLICATION
+   </verbatim>
+
+
+*Note:* Recipe properties file, workflow file and template file name must match to the recipe name, it must be unique and in the same directory.


http://git-wip-us.apache.org/repos/asf/falcon/blob/855852ec/docs/src/site/twiki/HiveDR.twiki
----------------------------------------------------------------------
diff --git a/docs/src/site/twiki/HiveDR.twiki b/docs/src/site/twiki/HiveDR.twiki
index ca039ce..a8f6aee 100644
--- a/docs/src/site/twiki/HiveDR.twiki
+++ b/docs/src/site/twiki/HiveDR.twiki
@@ -48,17 +48,27 @@ Following is the prerequisites to use Hive DR
 </verbatim>

 ---+++ Update recipes properties
-   Update recipe properties file in addons/recipes/hive-disaster-recovery with required attributes for replicating
-   Hive data and metadata from source cluster to destination cluster.
+   Copy Hive DR recipe properties, workflow and template file from $FALCON_HOME/data-mirroring/hive-disaster-recovery to the accessible
+   directory path or to the recipe directory path (*falcon.recipe.path=<recipe directory path>*). *"falcon.recipe.path"* must be specified
+   in Falcon conf client.properties. Now update the copied recipe properties file with required attributes to replicate metadata and data from source cluster to
+   destination cluster for Hive DR.

 ---+++ Submit Hive DR recipe
+   After updating the recipe properties file with required attributes in directory path or in falcon.recipe.path,
+   there are two ways of submitting the Hive DR recipe:
+
+   * 1. Specify Falcon recipe properties file through recipe command line.
    <verbatim>
-    $FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
+    $FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
+    -properties /cluster/hive-disaster-recovery.properties
    </verbatim>
+   * 2. Use Falcon recipe path specified in Falcon conf client.properties .
+   <verbatim>
+    $FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
+   </verbatim>

-Recipe templates for Hive DR is available in addons/recipes/hive-disaster-recovery and copy it to
-recipe path (*falcon.recipe.path=<recipe directory path>*) by specifying in client.properties.
-*Note:* If kerberos security is enabled on cluster, use the secure templates for Hive DR from
-   addons/recipes/hive-disaster-recovery
+*Note:*
+   * Recipe properties file, workflow file and template file name must match to the recipe name, it must be unique and in the same directory.
+   * If kerberos security is enabled on cluster, use the secure templates for Hive DR from $FALCON_HOME/data-mirroring/hive-disaster-recovery .
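
For context, the doc changes above describe one end-to-end sequence: copy the packaged recipe files, point the client at the recipe directory, edit the properties, then submit. The sketch below illustrates that sequence for HDFS DR; the recipe directory /cluster/recipes and the client.properties location under $FALCON_HOME/conf are illustrative assumptions, and only the falcon CLI invocations are taken verbatim from the twiki text.

<verbatim>
# Copy the packaged recipe files (properties, workflow and template) to a recipe directory
# (/cluster/recipes is a placeholder path, not from the docs)
cp $FALCON_HOME/data-mirroring/hdfs-replication/* /cluster/recipes/

# Point the Falcon client at that directory; the conf file location is assumed here
echo "falcon.recipe.path=/cluster/recipes" >> $FALCON_HOME/conf/client.properties

# Edit /cluster/recipes/hdfs-replication.properties with source/destination cluster attributes,
# then submit either by passing the properties file explicitly ...
$FALCON_HOME/bin/falcon recipe -name hdfs-replication -operation HDFS_REPLICATION \
    -properties /cluster/recipes/hdfs-replication.properties

# ... or by relying on falcon.recipe.path from client.properties
$FALCON_HOME/bin/falcon recipe -name hdfs-replication -operation HDFS_REPLICATION
</verbatim>

As the twiki note states, the properties, workflow and template file names must match the recipe name and live in the same directory for the second form to resolve them.
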
http://git-wip-us.apache.org/repos/asf/falcon/blob/855852ec/src/main/assemblies/distributed-package.xml
----------------------------------------------------------------------
diff --git a/src/main/assemblies/distributed-package.xml b/src/main/assemblies/distributed-package.xml
index ebd1745..dc5e2f8 100644
--- a/src/main/assemblies/distributed-package.xml
+++ b/src/main/assemblies/distributed-package.xml
@@ -96,6 +96,16 @@
         <directory>hadoop-dependencies/target/dependency</directory>
         <outputDirectory>hadooplibs</outputDirectory>
     </fileSet>
+
+    <fileSet>
+        <directory>addons/recipes/hdfs-replication/src/main/resources</directory>
+        <outputDirectory>data-mirroring/hdfs-replication</outputDirectory>
+    </fileSet>
+
+    <fileSet>
+        <directory>addons/recipes/hive-disaster-recovery/src/main/resources</directory>
+        <outputDirectory>data-mirroring/hive-disaster-recovery</outputDirectory>
+    </fileSet>
    </fileSets>

    <files>
@@ -137,20 +147,6 @@
         <source>oozie-el-extensions/src/main/conf/oozie-site.xml</source>
         <outputDirectory>oozie/conf</outputDirectory>
     </file>
-
-    <file>
-        <source>addons/recipes/hdfs-replication/src/main/resources/hdfs-replication-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
-
-    <file>
-        <source>addons/recipes/hive-disaster-recovery/src/main/resources/hive-disaster-recovery-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
-    <file>
-        <source>addons/recipes/hive-disaster-recovery/src/main/resources/hive-disaster-recovery-secure-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
    </files>
</assembly>


http://git-wip-us.apache.org/repos/asf/falcon/blob/855852ec/src/main/assemblies/standalone-package.xml
----------------------------------------------------------------------
diff --git a/src/main/assemblies/standalone-package.xml b/src/main/assemblies/standalone-package.xml
index b88aec3..2909631 100644
--- a/src/main/assemblies/standalone-package.xml
+++ b/src/main/assemblies/standalone-package.xml
@@ -101,6 +101,16 @@
         <directory>src/main/examples</directory>
         <outputDirectory>examples</outputDirectory>
     </fileSet>
+
+    <fileSet>
+        <directory>addons/recipes/hdfs-replication/src/main/resources</directory>
+        <outputDirectory>data-mirroring/hdfs-replication</outputDirectory>
+    </fileSet>
+
+    <fileSet>
+        <directory>addons/recipes/hive-disaster-recovery/src/main/resources</directory>
+        <outputDirectory>data-mirroring/hive-disaster-recovery</outputDirectory>
+    </fileSet>
    </fileSets>

    <files>
@@ -126,21 +136,6 @@
     </file>

     <file>
-        <source>addons/recipes/hdfs-replication/src/main/resources/hdfs-replication-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
-
-    <file>
-        <source>addons/recipes/hive-disaster-recovery/src/main/resources/hive-disaster-recovery-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
-
-    <file>
-        <source>addons/recipes/hive-disaster-recovery/src/main/resources/hive-disaster-recovery-secure-workflow.xml</source>
-        <outputDirectory>data-mirroring/workflows</outputDirectory>
-    </file>
-
-    <file>
         <source>webapp/target/falcon-webapp-${project.version}.war</source>
         <outputDirectory>server/webapp</outputDirectory>
         <destName>falcon.war</destName>
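
The net effect of the two assembly changes is that each recipe's entire src/main/resources directory (workflow, sample properties and template files) is packaged under data-mirroring/<recipe-name>, and data-mirroring/workflows is no longer populated. The sketch below is a rough sanity check of the resulting layout; the distribution root name falcon-distro is a placeholder, and the exact file listing beyond the three workflow XMLs named in the removed <file> entries is an assumption.

<verbatim>
ls falcon-distro/data-mirroring/hdfs-replication
#   expected: hdfs-replication-workflow.xml plus the sample properties/template files
ls falcon-distro/data-mirroring/hive-disaster-recovery
#   expected: hive-disaster-recovery-workflow.xml, hive-disaster-recovery-secure-workflow.xml
#   plus the sample properties/template files
</verbatim>
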
