-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27061/#review58002
-----------------------------------------------------------
Ship it!

Ship It!

- Sid Wagle


On Oct. 23, 2014, 12:49 a.m., Alejandro Fernandez wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/27061/
> -----------------------------------------------------------
> 
> (Updated Oct. 23, 2014, 12:49 a.m.)
> 
> 
> Review request for Ambari, Jaimin Jetly, Jonathan Hurley, Sumit Mohanty, and Sid Wagle.
> 
> 
> Bugs: AMBARI-7913
>     https://issues.apache.org/jira/browse/AMBARI-7913
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> The Pig Service Check fails because cluster-env.xml does not contain all of the following properties:
>     tez_tar_source, tez_tar_destination_folder
>     hive_tar_source, hive_tar_destination_folder
>     pig_tar_source, pig_tar_destination_folder
>     hadoop-streaming_tar_source, hadoop-streaming_tar_destination_folder
>     sqoop_tar_source, sqoop_tar_destination_folder
> 
> This happens because site_properties.js is not saving these properties to cluster-env.xml.
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml 5c7ea79 
>   ambari-web/app/data/HDP2/site_properties.js 541a6d0 
> 
> Diff: https://reviews.apache.org/r/27061/diff/
> 
> 
> Testing
> -------
> 
> Started ambari-server, symlinked the web folder to my local working copy, and during the installation process verified that the following URL showed all of the properties:
> 
> http://c6403.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=cluster-env&tag=version1
> 
> When creating the cluster, I selected HDFS, YARN, MR, Tez, Hive, Pig, ZooKeeper, and Sqoop.
> 
> After the deployment completed, I verified that tez-site.xml contains tez.lib.uri with an actual path, e.g.,
> hdfs:///apps/hdp/2.2.0.0-991/tez/tez-0.14.0.2.2.0.0-991.tar.gz
> Next, I re-ran the Pig service check, which passed.
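For context on what the patch adds, an entry in cluster-env.xml for one of the properties listed above would be shaped roughly like the following. This is a hedged sketch: the property name comes from the review description, but the value and description text are illustrative assumptions, not the actual contents of the diff.

```xml
<!-- Sketch of one missing cluster-env.xml entry (pig_tar_source is from the
     list above; the value shown is an illustrative assumption). -->
<property>
  <name>pig_tar_source</name>
  <value>/usr/hdp/current/pig-client/pig.tar.gz</value>
  <description>Local source path of the Pig tarball to copy to HDFS (illustrative).</description>
</property>
```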
> 
> Unit tests passed:
> Total run: 669
> Total errors: 0
> Total failures: 0
> OK
> 
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 25:13.750s
> [INFO] Finished at: Wed Oct 22 17:46:23 PDT 2014
> [INFO] Final Memory: 52M/492M
> [INFO] ------------------------------------------------------------------------
> 
> Pig Service Check results:
> 2014-10-23 00:41:56,977 - ExecuteHadoop['dfs -rmr pigsmoke.out passwd; hadoop --config /etc/hadoop/conf dfs -put /etc/passwd passwd '] {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'try_sleep': 5, 'kinit_path_local': '', 'tries': 3, 'user': 'ambari-qa', 'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
> 2014-10-23 00:41:56,978 - Execute['hadoop --config /etc/hadoop/conf dfs -rmr pigsmoke.out passwd; hadoop --config /etc/hadoop/conf dfs -put /etc/passwd passwd '] {'logoutput': False, 'path': ['/usr/hdp/current/hadoop-client/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> 2014-10-23 00:42:04,277 - File['/var/lib/ambari-agent/data/tmp/pigSmoke.sh'] {'content': StaticFile('pigSmoke.sh'), 'mode': 0755}
> 2014-10-23 00:42:04,288 - Execute['pig /var/lib/ambari-agent/data/tmp/pigSmoke.sh'] {'path': ['/usr/hdp/current/pig-client/bin:/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> 2014-10-23 00:42:36,804 - ExecuteHadoop['fs -test -e pigsmoke.out'] {'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'ambari-qa', 'conf_dir': '/etc/hadoop/conf'}
> 2014-10-23 00:42:36,806 - Execute['hadoop --config /etc/hadoop/conf fs -test -e pigsmoke.out'] {'logoutput': False, 'path': ['/usr/hdp/current/hadoop-client/bin'], 'tries': 1, 'user': 'ambari-qa', 'try_sleep': 0}
> 
> 
> Thanks,
> 
> Alejandro Fernandez
> 
> 
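The manual verification step described above (hitting the configurations URL and eyeballing the properties) can be sketched as a small check. This is not code from the patch; the property list is taken from the review description, and the function assumes you have already extracted a flat name-to-value dict of cluster-env properties from the Ambari API response (the response-parsing itself is omitted as an assumption about the API shape).

```python
# Sketch of the verification from the Testing section: given the cluster-env
# properties (e.g., extracted from the response of
# /api/v1/clusters/dev/configurations?type=cluster-env&tag=version1),
# report which tarball properties are still missing.
# The set of names comes from the review description; everything else here
# is illustrative, not part of the actual patch.
EXPECTED = {
    "tez_tar_source", "tez_tar_destination_folder",
    "hive_tar_source", "hive_tar_destination_folder",
    "pig_tar_source", "pig_tar_destination_folder",
    "hadoop-streaming_tar_source", "hadoop-streaming_tar_destination_folder",
    "sqoop_tar_source", "sqoop_tar_destination_folder",
}

def missing_properties(cluster_env_properties):
    """Return the expected tarball property names absent from the given dict."""
    return EXPECTED - set(cluster_env_properties)
```

An empty result from `missing_properties` corresponds to the fixed behavior; before this patch, the Pig service check failed precisely because several of these names were absent.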
