[
https://issues.apache.org/jira/browse/AMBARI-8932?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14260127#comment-14260127
]
Hudson commented on AMBARI-8932:
--------------------------------
FAILURE: Integrated in Ambari-trunk-Commit #1352 (See
[https://builds.apache.org/job/Ambari-trunk-Commit/1352/])
AMBARI-8932. Creating hdfs directories on deploy takes too long, Part 2,
reduces deploy time by ~6min (aonishuk) (aonishuk:
http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=1652163e873a3015e841665c22fcbb45e092708e)
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_namenode.py
> Creating hdfs directories on deploy takes too long, Part 2, reduces deploy
> time by ~6min
> ----------------------------------------------------------------------------------------
>
> Key: AMBARI-8932
> URL: https://issues.apache.org/jira/browse/AMBARI-8932
> Project: Ambari
> Issue Type: Bug
> Reporter: Andrew Onischuk
> Assignee: Andrew Onischuk
> Fix For: 2.0.0
>
>
> Take a look at the WebHCat logs: creating DFS directories by invoking the
> hadoop binary one call at a time takes too long.
>
> 2014-12-10 17:09:29,060 - ExecuteHadoop['fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'logoutput': True,
> 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:09:29,073 - Execute['hadoop --config /etc/hadoop/conf fs
> -ls hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'logoutput': True,
> 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hcat', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:09:46,301 - ls:
> `hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz': No such file or directory
> 2014-12-10 17:09:46,301 -
> HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'security_enabled':
> False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user':
> 'hdfs', 'kinit_path_local': '', 'mode': 0555, 'owner': 'hdfs', 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'action': ['create']}
> 2014-12-10 17:09:46,303 - Execute['hadoop --config /etc/hadoop/conf fs
> -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`
> hdfs:///hdp/apps/2.2.0.0-2041/hive && hadoop --config /etc/hadoop/conf fs
> -chmod 555 hdfs:///hdp/apps/2.2.0.0-2041/hive && hadoop --config
> /etc/hadoop/conf fs -chown hdfs hdfs:///hdp/apps/2.2.0.0-2041/hive']
> {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
> {ENV_PLACEHOLDER} > /dev/null ; hadoop --config /etc/hadoop/conf fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/hive'", 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:10:29,989 -
> CopyFromLocal['/usr/hdp/current/hive-client/hive.tar.gz'] {'hadoop_bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop', 'hdfs_user': 'hdfs',
> 'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir':
> 'hdfs:///hdp/apps/2.2.0.0-2041/hive', 'hadoop_conf_dir': '/etc/hadoop/conf',
> 'mode': 0444}
> 2014-12-10 17:10:30,017 - ExecuteHadoop['fs -copyFromLocal
> /usr/hdp/current/hive-client/hive.tar.gz hdfs:///hdp/apps/2.2.0.0-2041/hive']
> {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
> {ENV_PLACEHOLDER} > /dev/null ; PATH=$PATH:/usr/hdp/current/hadoop-client/bin
> hadoop fs -ls hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'", 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:10:48,275 - Execute['hadoop --config /etc/hadoop/conf fs
> -copyFromLocal /usr/hdp/current/hive-client/hive.tar.gz
> hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'logoutput': False, 'try_sleep': 0,
> 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:11:07,134 - ExecuteHadoop['fs -chown hdfs:hadoop
> hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:11:07,135 - Execute['hadoop --config /etc/hadoop/conf fs
> -chown hdfs:hadoop hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz']
> {'logoutput': False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user':
> 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:11:16,533 - ExecuteHadoop['fs -chmod 444
> hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:11:16,534 - Execute['hadoop --config /etc/hadoop/conf fs
> -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'logoutput':
> False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs',
> 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:11:29,515 - ExecuteHadoop['fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'logoutput': True, 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:11:29,516 - Execute['hadoop --config /etc/hadoop/conf fs
> -ls hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
> {'logoutput': True, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user':
> 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:11:45,791 - ls:
> `hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz': No such file or directory
> 2014-12-10 17:11:45,791 -
> HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'security_enabled':
> False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user':
> 'hdfs', 'kinit_path_local': '', 'mode': 0555, 'owner': 'hdfs', 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'action': ['create']}
> 2014-12-10 17:11:45,794 - Execute['hadoop --config /etc/hadoop/conf fs
> -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`
> hdfs:///hdp/apps/2.2.0.0-2041/pig && hadoop --config /etc/hadoop/conf fs
> -chmod 555 hdfs:///hdp/apps/2.2.0.0-2041/pig && hadoop --config
> /etc/hadoop/conf fs -chown hdfs hdfs:///hdp/apps/2.2.0.0-2041/pig']
> {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
> {ENV_PLACEHOLDER} > /dev/null ; hadoop --config /etc/hadoop/conf fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/pig'", 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:12:31,703 -
> CopyFromLocal['/usr/hdp/current/pig-client/pig.tar.gz'] {'hadoop_bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop', 'hdfs_user': 'hdfs',
> 'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir':
> 'hdfs:///hdp/apps/2.2.0.0-2041/pig', 'hadoop_conf_dir': '/etc/hadoop/conf',
> 'mode': 0444}
> 2014-12-10 17:12:31,703 - ExecuteHadoop['fs -copyFromLocal
> /usr/hdp/current/pig-client/pig.tar.gz hdfs:///hdp/apps/2.2.0.0-2041/pig']
> {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
> {ENV_PLACEHOLDER} > /dev/null ; PATH=$PATH:/usr/hdp/current/hadoop-client/bin
> hadoop fs -ls hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'", 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:12:49,508 - Execute['hadoop --config /etc/hadoop/conf fs
> -copyFromLocal /usr/hdp/current/pig-client/pig.tar.gz
> hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'logoutput': False, 'try_sleep': 0,
> 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:13:09,506 - ExecuteHadoop['fs -chown hdfs:hadoop
> hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:13:09,507 - Execute['hadoop --config /etc/hadoop/conf fs
> -chown hdfs:hadoop hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
> {'logoutput': False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user':
> 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:13:18,968 - ExecuteHadoop['fs -chmod 444
> hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:13:18,969 - Execute['hadoop --config /etc/hadoop/conf fs
> -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'logoutput':
> False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs',
> 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:13:32,936 - ExecuteHadoop['fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'logoutput':
> True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat',
> 'conf_dir': '/etc/hadoop/conf'}
> 2014-12-10 17:13:32,937 - Execute['hadoop --config /etc/hadoop/conf fs
> -ls hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
> {'logoutput': True, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user':
> 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:13:52,891 - ls:
> `hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar': No such file
> or directory
> 2014-12-10 17:13:52,892 -
> HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'] {'security_enabled':
> False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user':
> 'hdfs', 'kinit_path_local': '', 'mode': 0555, 'owner': 'hdfs', 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'action': ['create']}
> 2014-12-10 17:13:52,904 - Execute['hadoop --config /etc/hadoop/conf fs
> -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce && hadoop --config /etc/hadoop/conf
> fs -chmod 555 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce && hadoop --config
> /etc/hadoop/conf fs -chown hdfs hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
> {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
> {ENV_PLACEHOLDER} > /dev/null ; hadoop --config /etc/hadoop/conf fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'", 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:14:03,832 - Skipping Execute['hadoop --config
> /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce && hadoop --config /etc/hadoop/conf
> fs -chmod 555 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce && hadoop --config
> /etc/hadoop/conf fs -chown hdfs hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
> due to not_if
> 2014-12-10 17:14:03,833 -
> CopyFromLocal['/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar']
> {'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop',
> 'hdfs_user': 'hdfs', 'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir':
> 'hdfs:///hdp/apps/2.2.0.0-2041/mapreduce', 'hadoop_conf_dir':
> '/etc/hadoop/conf', 'mode': 0444}
> 2014-12-10 17:14:03,836 - ExecuteHadoop['fs -copyFromLocal
> /usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'] {'not_if': "/usr/bin/sudo su hdfs
> -l -s /bin/bash -c 'export {ENV_PLACEHOLDER} > /dev/null ;
> PATH=$PATH:/usr/hdp/current/hadoop-client/bin hadoop fs -ls
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'", 'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:14:12,682 - Execute['hadoop --config /etc/hadoop/conf fs
> -copyFromLocal /usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'] {'logoutput': False, 'try_sleep':
> 0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path':
> ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:14:22,350 - ExecuteHadoop['fs -chown hdfs:hadoop
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:14:22,352 - Execute['hadoop --config /etc/hadoop/conf fs
> -chown hdfs:hadoop
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'logoutput':
> False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs',
> 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:14:34,163 - ExecuteHadoop['fs -chmod 444
> hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'bin_dir':
> '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir':
> '/etc/hadoop/conf'}
> 2014-12-10 17:14:34,164 - Execute['hadoop --config /etc/hadoop/conf fs
> -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
> {'logoutput': False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user':
> 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2014-12-10 17:14:50,851 - Could not find file:
> /usr/hdp/current/sqoop-client/sqoop.tar.gz
> 2014-12-10 17:14:50,862 - XmlConfig['webhcat-site.xml'] {'owner': 'hcat',
> 'group': 'hadoop', 'conf_dir': '/etc/hive-webhcat/conf',
> 'configuration_attributes': ..., 'configurations': ...}
> 2014-12-10 17:14:50,979 - Generating config:
> /etc/hive-webhcat/conf/webhcat-site.xml
> 2014-12-10 17:14:50,980 - File['/etc/hive-webhcat/conf/webhcat-site.xml']
> {'owner': 'hcat', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode':
> None, 'encoding': 'UTF-8'}
> 2014-12-10 17:14:50,983 - Writing
> File['/etc/hive-webhcat/conf/webhcat-site.xml'] because it doesn't exist
> 2014-12-10 17:14:51,114 - Changing owner for
> /etc/hive-webhcat/conf/webhcat-site.xml from 0 to hcat
> 2014-12-10 17:14:51,169 - Changing group for
> /etc/hive-webhcat/conf/webhcat-site.xml from 0 to hadoop
> 2014-12-10 17:14:51,221 - File['/etc/hive-webhcat/conf/webhcat-env.sh']
> {'content': InlineTemplate(...), 'owner': 'hcat', 'group': 'hadoop'}
> 2014-12-10 17:14:51,222 - Writing
> File['/etc/hive-webhcat/conf/webhcat-env.sh'] because it doesn't exist
> 2014-12-10 17:14:51,312 - Changing owner for
> /etc/hive-webhcat/conf/webhcat-env.sh from 0 to hcat
> 2014-12-10 17:14:51,367 - Changing group for
> /etc/hive-webhcat/conf/webhcat-env.sh from 0 to hadoop
> 2014-12-10 17:14:51,423 - Execute['env
> HADOOP_HOME=/usr/hdp/current/hadoop-client
> /usr/hdp/current/hive-webhcat/sbin/webhcat_server.sh start'] {'not_if': 'ls
> /var/run/webhcat/webhcat.pid >/dev/null 2>&1 && ps -p `cat
> /var/run/webhcat/webhcat.pid` >/dev/null 2>&1', 'user': 'hcat'}
>
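For context (this explanation is not part of the original report): each `hadoop fs` invocation starts a fresh JVM, so N separate calls pay the multi-second startup cost N times. The mkdir/chmod/chown lines in the log above already chain several fs sub-commands with `&&` into a single invocation; a minimal sketch of that batching idea, where the helper name `batch_hadoop_fs` is hypothetical and not an Ambari API:

```python
# Hypothetical sketch: combine several "hadoop fs" sub-commands into one
# shell command line joined with "&&", so the JVM startup cost is paid
# once per batch instead of once per sub-command.
def batch_hadoop_fs(subcommands, conf_dir="/etc/hadoop/conf"):
    # Each entry is a single fs sub-command, e.g. "-chmod 555 <path>".
    parts = ["hadoop --config %s fs %s" % (conf_dir, cmd)
             for cmd in subcommands]
    return " && ".join(parts)

# Mirrors the chained command seen in the log for the hive app directory.
cmd = batch_hadoop_fs([
    "-mkdir -p hdfs:///hdp/apps/2.2.0.0-2041/hive",
    "-chmod 555 hdfs:///hdp/apps/2.2.0.0-2041/hive",
    "-chown hdfs hdfs:///hdp/apps/2.2.0.0-2041/hive",
])
print(cmd)
```

Three sub-commands become one shell line and therefore one JVM launch; extending the same idea to the ls/copyFromLocal/chown/chmod sequences (which the log shows as four separate invocations per tarball, each taking 10–20 s) is what recovers minutes of deploy time.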
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)