-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47847/
-----------------------------------------------------------

(Updated May 27, 2016, 9:11 p.m.)


Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.


Changes
-------

1. Use a variable to specify paths. 2. Make the same change in Spark2.


Bugs: AMBARI-16755
    https://issues.apache.org/jira/browse/AMBARI-16755


Repository: ambari


Description
-------

Add spark.driver.extraLibraryPath in spark-defaults.conf and 
spark-thrift-sparkconf.conf.
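For context, the added property would look like the sketch below. This is illustrative only: in the patch the value is supplied through a variable in params.py, and the path shown here is an assumed typical native-hadoop library location, not taken from this change.

```properties
# Illustrative sketch; the real value is templated from a params.py variable.
# The path below is an assumed native-hadoop library directory.
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native
```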


Diffs (updated)
-----

  ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-defaults.xml b507d5e 
  ambari-server/src/main/resources/common-services/SPARK/1.2.1/package/scripts/params.py c5f3eb6 
  ambari-server/src/main/resources/common-services/SPARK/1.5.2/configuration/spark-thrift-sparkconf.xml b5742ea 
  ambari-server/src/main/resources/common-services/SPARK2/2.0.0/configuration/spark2-defaults.xml 5d6c781 
  ambari-server/src/main/resources/common-services/SPARK2/2.0.0/configuration/spark2-thrift-sparkconf.xml ce1d159 
  ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py ded9959 

Diff: https://reviews.apache.org/r/47847/diff/


Testing
-------

Tests passed. The WARN messages about being unable to load the native-hadoop 
library are gone when users start spark-shell and submit applications on Spark.


Thanks,

Weiqing Yang
