[
https://issues.apache.org/jira/browse/AMBARI-20910?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15993807#comment-15993807
]
Hudson commented on AMBARI-20910:
---------------------------------
FAILURE: Integrated in Jenkins build Ambari-trunk-Commit #7407 (See [https://builds.apache.org/job/Ambari-trunk-Commit/7407/])
AMBARI-20910. HDP 3.0 TP - Unable to install Spark, cannot find (afernandez: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=4b588a9237a72465f3ca83c207a8d4234d9c4c12])
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/setup_livy.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/spark_service.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/params.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/service_check.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/spark_client.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/status_params.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/job_history_server.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/livy_server.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/setup_spark.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/service_check.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/job_history_server.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/params.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/spark_client.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/setup_livy.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/spark_thrift_server.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/setup_spark.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/spark_thrift_server.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/status_params.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/livy_service.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/spark_service.py
* (delete) ambari-server/src/main/resources/common-services/SPARK/2.2.0/scripts/livy_server.py
* (add) ambari-server/src/main/resources/common-services/SPARK/2.2.0/package/scripts/livy_service.py
> HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir
> ---------------------------------------------------------------------
>
> Key: AMBARI-20910
> URL: https://issues.apache.org/jira/browse/AMBARI-20910
> Project: Ambari
> Issue Type: Bug
> Components: stacks
> Affects Versions: 3.0.0
> Reporter: Alejandro Fernandez
> Assignee: Alejandro Fernandez
> Fix For: trunk
>
> Attachments: AMBARI-20910.patch
>
>
> STR:
> * Install Ambari 3.0 (last build was 650)
> * Install HDP 3.0 (last build is 197) with ZK, HDFS, YARN. Note: this will fail on RM and Service Checks.
> * Because Hive is not yet compiling, temporarily comment out Hive as a required service for Spark, and HIVE_METASTORE as a required co-hosted component, in /var/lib/ambari-server/resources/common-services/SPARK/2.2.0/metainfo.xml
> * Restart Ambari Server
> * Attempt to add Spark as a service.
> Error:
> {noformat}
> Caught an exception while executing custom service command: <type 'exceptions.KeyError'>: 'service_package_folder'; 'service_package_folder'
> {noformat}
> This is coming from CustomServiceOrchestrator.py
> {code}
> except Exception, e: # We do not want to let agent fail completely
>   exc_type, exc_obj, exc_tb = sys.exc_info()
>   message = "Caught an exception while executing "\
>     "custom service command: {0}: {1}; {2}".format(exc_type, exc_obj, str(e))
>   logger.exception(message)
> {code}
> Looks like Spark 2.2.0 doesn't have the package/scripts directory.
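The key appears twice in the logged message because, for a KeyError, both {1} (the exception object) and {2} (str(e)) in the agent's format string render as the missing key. A minimal Python 3 sketch of the failure (the agent code itself is Python 2, so its logged type reads <type 'exceptions.KeyError'> rather than <class 'KeyError'>); resolve_package_folder here is a hypothetical stand-in for the agent's metadata lookup:

{code}
import sys

def resolve_package_folder(command):
  # Hypothetical stand-in for the agent reading the service's scripts
  # location out of the command metadata (the real lookup happens inside
  # CustomServiceOrchestrator.py).
  return command['service_package_folder']

try:
  # Metadata without the key, as when SPARK 2.2.0 had no package/scripts dir.
  resolve_package_folder({})
except Exception as e:  # Python 3 spelling of the agent's broad catch
  exc_type, exc_obj, exc_tb = sys.exc_info()
  message = "Caught an exception while executing "\
    "custom service command: {0}: {1}; {2}".format(exc_type, exc_obj, str(e))
  print(message)
  # -> ... <class 'KeyError'>: 'service_package_folder'; 'service_package_folder'
{code}

Moving the scripts under package/scripts, as this commit does, makes that metadata lookup succeed, so the broad except no longer swallows the real cause.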
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)