-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/
-----------------------------------------------------------

(Updated Oct. 21, 2014, 11:56 p.m.)


Review request for Ambari, Dmytro Sen, Sumit Mohanty, and Sid Wagle.


Bugs: AMBARI-7892
    https://issues.apache.org/jira/browse/AMBARI-7892


Repository: ambari


Description
-------

This is related to AMBARI-7842. WebHCat relies on the following tarballs/jars:

|| File || Property ||
| pig-*.tar.gz | templeton.pig.archive |
| hive-*.tar.gz | templeton.hive.archive |
| sqoop-*.tar.gz | templeton.sqoop.archive |
| hadoop-streaming-*.jar | templeton.streaming.jar |

Each of these files needs to be copied to HDFS, and its fully qualified HDFS path needs to be injected into the corresponding property.
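The substitution at the heart of this is resolving the `{{ component_version }}` and `{{ hdp_stack_version }}` placeholders in the configured paths. A minimal sketch of that interpolation (the function name is hypothetical; the real logic lives in dynamic_variable_interpretation.py and also performs the copy to HDFS):

```python
import re

def interpolate_versions(template, component_version, hdp_stack_version):
    """Resolve {{ component_version }} and {{ hdp_stack_version }}
    placeholders in a tarball path template. Simplified stand-in for
    the patch's dynamic_variable_interpretation logic."""
    substitutions = {
        "component_version": component_version,
        "hdp_stack_version": hdp_stack_version,
    }
    # Match {{ name }} with optional whitespace inside the braces;
    # leave any unknown placeholder untouched.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: substitutions.get(m.group(1), m.group(0)),
        template,
    )

source = "/usr/hdp/current/hive-client/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
print(interpolate_versions(source, "0.14.0", "2.2.0.0-974"))
# /usr/hdp/current/hive-client/hive-0.14.0.2.2.0.0-974.tar.gz
```

The resolved source path is what gets copied into HDFS, and the resolved destination path is what lands in the templeton.* property.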


Diffs
-----

  
  ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py efe7e63 
  ambari-common/src/main/python/resource_management/libraries/functions/version.py PRE-CREATION 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/configuration/webhcat-site.xml 0523dab 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py 7c86070 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py 4aad1a2 
  ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py 71989c3 
  ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml cc52fe3 
  ambari-server/src/main/resources/stacks/HDP/2.2/services/HIVE/configuration/webhcat-site.xml 3435a63 
  ambari-server/src/test/python/TestVersion.py PRE-CREATION 
  ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py 5f92a2d 
  ambari-server/src/test/python/stacks/2.0.6/configs/default.json 5e3bad0 
  ambari-server/src/test/python/stacks/2.0.6/configs/secured.json d65b0ee 
  ambari-server/src/test/python/stacks/2.2/configs/default.json ea474e8 
  ambari-server/src/test/python/stacks/2.2/configs/secured.json 20678fa 

Diff: https://reviews.apache.org/r/26965/diff/


Testing
-------

Ran the ambari-server unit tests:
----------------------------------------------------------------------
Total run:667
Total errors:0
Total failures:0
OK

Also verified on a cluster using the following steps.

1. Set the properties:
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_source "/usr/hdp/current/hive-client/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_source "/usr/hdp/current/pig-client/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_source "/usr/hdp/current/sqoop-client/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_source "/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.jar "/usr/hdp/current/hive-webhcat/share/webhcat/svr/lib/hive-webhcat-*.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.pig.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.hive.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.sqoop.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.streaming.jar "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"

2. Verify the properties were saved:
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=cluster-env
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=webhcat-site
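The two checks above hit Ambari's configurations endpoint. A small helper (hypothetical, for illustration) can build those URLs from the cluster name and config type used in this test run:

```python
def config_query_url(host, cluster, config_type, port=8080):
    """Build the Ambari REST URL for listing configurations of a
    given type, as queried in step 2."""
    return ("http://%s:%d/api/v1/clusters/%s/configurations?type=%s"
            % (host, port, cluster, config_type))

print(config_query_url("c6401.ambari.apache.org", "dev", "cluster-env"))
```

Note that actually fetching these URLs requires Ambari credentials (HTTP basic auth).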

3. Copy the changed files:
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/script/config_dictionary.py /usr/lib/ambari-server/lib/resource_management/libraries/script/config_dictionary.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/version.py /usr/lib/ambari-server/lib/resource_management/libraries/functions/version.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py /usr/lib/ambari-server/lib/resource_management/libraries/functions/dynamic_variable_interpretation.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/TEZ/package/scripts/params.py

4. Check that tarballs are not already in HDFS. If they are, delete them.
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -rm -r /hdp/apps/2.2.0.0-974/

5. Before starting WebHCat, check webhcat-site.xml; the templeton properties should still be unversioned at this point.
less /etc/hive-webhcat/conf/webhcat-site.xml
/ templeton.*archive
/ templeton.*jar
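The manual `less` search in step 5 can also be scripted. A sketch (the function is hypothetical, not part of the patch) that reports any templeton.* property still carrying unresolved `{{ ... }}` placeholders:

```python
import re
import xml.etree.ElementTree as ET

def unresolved_templeton_props(webhcat_site_xml):
    """Return templeton.* properties whose values still contain
    {{ ... }} placeholders, i.e. have not yet been interpolated."""
    props = {}
    root = ET.fromstring(webhcat_site_xml)
    for prop in root.iter("property"):
        name = prop.findtext("name") or ""
        value = prop.findtext("value") or ""
        if name.startswith("templeton.") and re.search(r"\{\{.*?\}\}", value):
            props[name] = value
    return props

sample = """<configuration>
  <property>
    <name>templeton.pig.archive</name>
    <value>hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz</value>
  </property>
  <property>
    <name>templeton.jar</name>
    <value>/usr/hdp/current/hive-webhcat/share/webhcat/svr/lib/hive-webhcat-0.14.0.jar</value>
  </property>
</configuration>"""

print(sorted(unresolved_templeton_props(sample)))
# ['templeton.pig.archive']
```

Before the restart every templeton.* archive/jar property should show up here; after step 6 the same check should return nothing.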

6. Restart WebHCat and verify the files are copied to HDFS:
python /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat_server.py START /var/lib/ambari-agent/data/command-102.json /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS /var/lib/ambari-agent/data/output-102.txt DEBUG /var/lib/ambari-agent/data/tmp
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/

7. Verify that webhcat-site.xml now has properties with actual values.

Check /etc/hive-webhcat/conf/webhcat-site.xml again; this time the templeton properties should contain the fully versioned paths.


Thanks,

Alejandro Fernandez
