[ https://issues.apache.org/jira/browse/TOREE-425?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16090869#comment-16090869 ]

ASF GitHub Bot commented on TOREE-425:
--------------------------------------

Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/incubator-toree/pull/128#discussion_r127858852
  
    --- Diff: kernel/src/main/scala/org/apache/toree/boot/layer/ComponentInitialization.scala ---
    @@ -94,6 +98,18 @@ trait StandardComponentInitialization extends ComponentInitialization {
     
       }
     
    +  def initializeSparkContext(config: Config, kernel: Kernel) = {
    +    // TOREE-425: Spark cluster mode requires a context to be initialized before
    +    // it registers the application as RUNNING
    +    if ( SparkUtils.isSparkClusterMode(kernel.sparkConf) ) {
    --- End diff ---
    
    Nit: Are the spaces needed? I think this is non-standard for Scala or Java projects.
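
    For comparison, the same condition with the padding inside the parentheses
    removed, which matches the usual Scala/Java spacing the nit refers to
    (illustrative only, not part of the patch):

        if (SparkUtils.isSparkClusterMode(kernel.sparkConf)) {
          // eager SparkContext initialization goes here
        }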


> sparkContext lazy initialization causes issues when Toree is running in YARN cluster mode
> ------------------------------------------------------------------------------------------
>
>                 Key: TOREE-425
>                 URL: https://issues.apache.org/jira/browse/TOREE-425
>             Project: TOREE
>          Issue Type: Bug
>          Components: Kernel
>    Affects Versions: 0.2.0
>            Reporter: Luciano Resende
>            Assignee: Luciano Resende
>            Priority: Critical
>             Fix For: 0.2.0
>
>
> Kernels running in yarn-cluster mode (when launched via spark-submit) must 
> initialize a SparkContext in order for the Spark Yarn code to register the 
> application as RUNNING: 
> https://github.com/apache/spark/blob/3d4d11a80fe8953d48d8bfac2ce112e37d38dc90/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L405
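
A minimal sketch of the eager initialization described above, assuming the
deploy mode can be read from the SparkConf that spark-submit populates; the
isClusterMode helper below is a hypothetical stand-in for the check added in
the pull request, not Toree's actual SparkUtils API:

    import org.apache.spark.{SparkConf, SparkContext}

    // True when the kernel was launched by spark-submit with --deploy-mode
    // cluster (spark-submit records this in spark.submit.deployMode).
    def isClusterMode(conf: SparkConf): Boolean =
      conf.get("spark.submit.deployMode", "client") == "cluster"

    // Create the SparkContext up front in cluster mode so the YARN
    // ApplicationMaster can report the application as RUNNING instead of
    // waiting for the first notebook cell to touch the lazily created context.
    def eagerlyInitialize(conf: SparkConf): Option[SparkContext] =
      if (isClusterMode(conf)) Some(SparkContext.getOrCreate(conf)) else None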



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
