[ https://issues.apache.org/jira/browse/AMBARI-20910?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15992600#comment-15992600 ]
Hadoop QA commented on AMBARI-20910:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/12865864/AMBARI-20910.patch
against trunk revision .
{color:green}+1 @author{color}. The patch does not contain any @author tags.
{color:red}-1 tests included{color}. The patch doesn't appear to include any new or modified tests.
Please justify why no new tests are needed for this patch.
Also please list what manual steps were performed to verify this patch.
{color:green}+1 release audit{color}. The applied patch does not increase the total number of release audit warnings.
{color:green}+1 javac{color}. The applied patch does not increase the total number of javac compiler warnings.
{color:green}+1 core tests{color}. The patch passed unit tests in ambari-server.
Console output:
https://builds.apache.org/job/Ambari-trunk-test-patch/11563//console
This message is automatically generated.
> HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir
> ---------------------------------------------------------------------
>
> Key: AMBARI-20910
> URL: https://issues.apache.org/jira/browse/AMBARI-20910
> Project: Ambari
> Issue Type: Bug
> Components: stacks
> Affects Versions: 3.0.0
> Reporter: Alejandro Fernandez
> Assignee: Alejandro Fernandez
> Fix For: trunk
>
> Attachments: AMBARI-20910.patch
>
>
> STR:
> * Install Ambari 3.0 (last build was 650)
> * Install HDP 3.0 (last build is 197) with ZK, HDFS, YARN. Note: this will fail on RM and Service Checks.
> * Because Hive is not yet compiling, temporarily comment out Hive as a required service for Spark, and HIVE_METASTORE as a required co-hosted component, in
> /var/lib/ambari-server/resources/common-services/SPARK/2.2.0/metainfo.xml
> * Restart Ambari Server
> * Attempt to add Spark as a service.
> Error:
> {noformat}
> Caught an exception while executing custom service command: <type 'exceptions.KeyError'>: 'service_package_folder'; 'service_package_folder'
> {noformat}
> This is coming from CustomServiceOrchestrator.py
> {code}
> except Exception, e:  # We do not want to let agent fail completely
>   exc_type, exc_obj, exc_tb = sys.exc_info()
>   message = "Caught an exception while executing " \
>             "custom service command: {0}: {1}; {2}".format(exc_type, exc_obj, str(e))
>   logger.exception(message)
> {code}
> Looks like Spark 2.2.0 doesn't have the package/scripts directory.
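> A minimal, hypothetical sketch (not the actual agent code) of how the missing
> directory turns into this KeyError, assuming the server leaves
> 'service_package_folder' out of commandParams when the stack service has no
> package/scripts directory:
> {code}
> # Hypothetical commandParams as the agent might see them when the SPARK
> # definition has no package/scripts directory for the server to point at.
> command_params = {'version': '2.2.0'}  # 'service_package_folder' is absent
>
> try:
>   # Direct indexing is what produces: KeyError: 'service_package_folder'
>   service_package_folder = command_params['service_package_folder']
> except KeyError:
>   # A guarded lookup would make the root cause explicit instead of a bare KeyError.
>   raise Exception("No package/scripts folder in the command; "
>                   "check common-services/SPARK/2.2.0/package/scripts")
> {code}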