-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47847/#review134864
-----------------------------------------------------------




ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-defaults.xml
 (line 154)
<https://reviews.apache.org/r/47847/#comment199804>

    WebHCat and Hive use "/usr/hdp/${hdp.version}" when specifying paths.
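    For illustration, a property entry in spark-defaults.xml following that convention might look like the sketch below; the native-library subpath in the value is an assumption, not part of the patch under review:

    ```xml
    <property>
      <name>spark.driver.extraLibraryPath</name>
      <!-- Assumed example path; uses the stack-version token as WebHCat/Hive do -->
      <value>/usr/hdp/${hdp.version}/hadoop/lib/native</value>
      <description>Extra native library path for the Spark driver.</description>
    </property>
    ```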


- Srimanth Gunturi


On May 25, 2016, 7:37 p.m., Weiqing Yang wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/47847/
> -----------------------------------------------------------
> 
> (Updated May 25, 2016, 7:37 p.m.)
> 
> 
> Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.
> 
> 
> Bugs: AMBARI-16755
>     https://issues.apache.org/jira/browse/AMBARI-16755
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Add spark.driver.extraLibraryPath to spark-defaults.conf and 
> spark-thrift-sparkconf.conf.
> 
> 
> Diffs
> -----
> 
>   
> ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-defaults.xml
>  b507d5e 
>   
> ambari-server/src/main/resources/common-services/SPARK/1.5.2/configuration/spark-thrift-sparkconf.xml
>  b5742ea 
> 
> Diff: https://reviews.apache.org/r/47847/diff/
> 
> 
> Testing
> -------
> 
> Tests passed. The WARN messages about being unable to load the native-hadoop 
> library are gone when users start spark-shell or submit applications to Spark.
> 
> 
> Thanks,
> 
> Weiqing Yang
> 
>
