[
https://issues.apache.org/jira/browse/AMBARI-14232?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15043962#comment-15043962
]
Hudson commented on AMBARI-14232:
---------------------------------
FAILURE: Integrated in Ambari-trunk-Commit #3982 (See [https://builds.apache.org/job/Ambari-trunk-Commit/3982/])
AMBARI-14232. Kerberization fails if Hive is not installed but Spark is (smohanty: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=c1b3692d7f0c70bb55875ffe3c354367d0ccf481])
* ambari-server/src/main/java/org/apache/ambari/server/state/Cluster.java
* ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClusterImpl.java
* ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java
* ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
> Kerberization fails if Hive is not installed but Spark is installed without Spark TS
> ------------------------------------------------------------------------------------
>
> Key: AMBARI-14232
> URL: https://issues.apache.org/jira/browse/AMBARI-14232
> Project: Ambari
> Issue Type: Bug
> Components: ambari-server
> Reporter: Sebastian Toader
> Assignee: Sebastian Toader
> Priority: Blocker
> Fix For: 2.2.0
>
> Attachments: AMBARI-14232.v1.patch
>
>
> Install Spark without Hive. The Spark client installation fails with the error below.
> {code:title="/var/lib/ambari-agent/data/errors-85.txt"}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 59, in <module>
>     SparkClient().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 217, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 35, in install
>     self.configure(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 41, in configure
>     setup_spark(env, 'client', action = 'config')
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/setup_spark.py", line 89, in setup_spark
>     key_value_delimiter = " ",
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/properties_file.py", line 55, in action_create
>     mode = self.resource.mode
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 108, in action_create
>     self.resource.group, mode=self.resource.mode, cd_access=self.resource.cd_access)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 44, in _ensure_metadata
>     _user_entity = pwd.getpwnam(user)
> KeyError: 'getpwnam(): name not found: hive'{code}
> {code:title=ambari hash}
> root@os-u14-qihles-spark-re1-2:~# ambari-server --hash
> 863d787b3c2f06b9593aa3cca6656f9ee666817d
> ===========
> Version 2.1.3.0
> {code}
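For context, the agent-side error in the traceback is Python's {{pwd.getpwnam}} raising {{KeyError}} when asked to resolve a user ("hive") that does not exist on the host. A minimal sketch of a defensive lookup is below; the helper name and fallback are illustrative only and are not the committed fix, which instead changes the server side (KerberosHelper/Cluster) so Hive's user is not referenced when Hive is not installed.

{code:title=illustrative sketch, not the actual patch}
import pwd

def resolve_owner(user, fallback):
    """Return `user` if it exists on this host, else `fallback`.

    pwd.getpwnam() raises KeyError for an unknown user, which is
    exactly the failure seen in errors-85.txt when the 'hive' user
    is absent from the host.
    """
    try:
        pwd.getpwnam(user)
        return user
    except KeyError:
        return fallback
{code}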
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)