-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/42051/#review115089
-----------------------------------------------------------



ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/setup_spark.py
 (line 94)
<https://reviews.apache.org/r/42051/#comment175948>

    effective_version = params.version if params.upgrade_direction in ["UPGRADE", "DOWNGRADE"] else params.hdp_stack_version
    
    # Note that you can also use upgrade_type: if it is None, use hdp_stack_version; otherwise, use params.version.
    
    params.version is only available during RU/EU (rolling/express upgrade). Outside of that scenario, use hdp_stack_version.
    
    compare_version(format_hdp_stack_version(effective_version), "2.4.0.0") >= 0
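    
    A minimal sketch of how this could look in setup_spark.py, assuming compare_version and format_hdp_stack_version are already imported there and that the method receives upgrade_type like other service scripts (anything not already in params.py is illustrative only):
    
        # Prefer params.version during RU/EU; otherwise fall back to the current stack version.
        effective_version = params.version if upgrade_type is not None else params.hdp_stack_version
        
        if compare_version(format_hdp_stack_version(effective_version), "2.4.0.0") >= 0:
            # HDP 2.4 / Spark 1.6 specific setup would go here
            ...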



ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/spark_service.py
 (line 34)
<https://reviews.apache.org/r/42051/#comment175949>

    Similar comment as above: spark_service.py and setup_spark.py should accept 
upgrade_type or look up params.upgrade_direction.
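    
    For illustration only, a hypothetical signature change along these lines (the exact parameter list in spark_service.py may differ):
    
        def spark_service(action, upgrade_type=None):
            # upgrade_type is None outside of RU/EU; in that case fall back to the stack version.
            effective_version = params.version if upgrade_type is not None else params.hdp_stack_version
            ...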


- Alejandro Fernandez


On Jan. 19, 2016, 12:16 a.m., Jeff Zhang wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/42051/
> -----------------------------------------------------------
> 
> (Updated Jan. 19, 2016, 12:16 a.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez, Saisai Shao, and Sumit 
> Mohanty.
> 
> 
> Bugs: AMBARI-14561
>     https://issues.apache.org/jira/browse/AMBARI-14561
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> This is for the HDP stack. In the HDP stack, we'd like to make the following 
> changes for the Spark Thrift Server:
> - Remove advanced properties except queue name
> - Use yarn-client mode
> - Enable Dynamic Resource Allocation
> - Enable Fair scheduling policy
> 
> 
> Diffs
> -----
> 
>   
> ambari-common/src/main/python/resource_management/libraries/functions/copy_tarball.py
>  c3ffc7b 
>   
> ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/params.py
>  68240bd 
>   
> ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/setup_spark.py
>  4b38572 
>   
> ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/spark_service.py
>  d4c6732 
>   
> ambari-server/src/main/resources/stacks/HDP/2.3/upgrades/nonrolling-upgrade-2.4.xml
>  50c8584 
>   ambari-server/src/main/resources/stacks/HDP/2.3/upgrades/upgrade-2.4.xml 
> f145de1 
>   
> ambari-server/src/main/resources/stacks/HDP/2.4/services/SPARK/configuration/spark-defaults.xml
>  PRE-CREATION 
>   
> ambari-server/src/main/resources/stacks/HDP/2.4/services/SPARK/configuration/spark-thrift-fairscheduler.xml
>  PRE-CREATION 
>   
> ambari-server/src/main/resources/stacks/HDP/2.4/services/SPARK/configuration/spark-thrift-sparkconf.xml
>  PRE-CREATION 
>   ambari-server/src/main/resources/stacks/HDP/2.4/services/SPARK/metainfo.xml 
> 6c0e393 
>   ambari-server/src/main/resources/stacks/HDP/2.4/upgrades/config-upgrade.xml 
> d5e4f78 
> 
> Diff: https://reviews.apache.org/r/42051/diff/
> 
> 
> Testing
> -------
> 
> This patch doesn't solve the issue completely. The weird thing is that the 
> configuration in HDP 2.3's spark-thrift-sparkconf.xml gets appended to 
> HDP 2.4's spark-thrift-sparkconf.xml, while what I want is to remove some of 
> the configuration from 2.3. I actually made Spark 1.6 of HDP 2.4 extend Spark 
> 1.2 in common-services, so I'm not sure why it still uses the 
> spark-thrift-sparkconf.xml from HDP 2.3.
> 
> 
> Thanks,
> 
> Jeff Zhang
> 
>
