Github user rvesse commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22904#discussion_r238483901
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala ---
    @@ -67,8 +66,16 @@ private[spark] object SparkKubernetesClientFactory {
         val dispatcher = new Dispatcher(
           ThreadUtils.newDaemonCachedThreadPool("kubernetes-dispatcher"))
     
    -    // TODO [SPARK-25887] Create builder in a way that respects configurable context
    -    val config = new ConfigBuilder()
    +    // Allow for specifying a context used to auto-configure from the user's K8S config file
    +    val kubeContext = sparkConf.get(KUBERNETES_CONTEXT).filter(c => StringUtils.isNotBlank(c))
    +    logInfo(s"Auto-configuring K8S client using " +
    +      s"${if (kubeContext.isDefined) s"context ${kubeContext.get}" else "current context"}" +
    +      s" from the user's K8S config file")
    +
    +    // Start from an auto-configured config with the desired context
    +    // Fabric 8 uses null to indicate that the user's current context should be used, so if there
    +    // is no explicit setting pass null
    +    val config = new ConfigBuilder(autoConfigure(kubeContext.getOrElse(null)))
    --- End diff --
    
If the context does not exist, then Fabric 8 falls back to other ways of auto-configuring itself (e.g. a service account).
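
For illustration, here is a rough sketch of how an optional context maps onto Fabric 8's auto-configuration. The `buildClient` helper is hypothetical and only mirrors the pattern used in this PR:

```scala
import io.fabric8.kubernetes.client.{Config, ConfigBuilder, DefaultKubernetesClient}

// Hypothetical helper, not part of this PR: passing None (i.e. null) asks
// Fabric 8 to use the current context from the user's K8S config file, while
// passing a context that doesn't exist leaves Fabric 8 to fall back to its
// other auto-configuration mechanisms (e.g. the service account).
def buildClient(kubeContext: Option[String]): DefaultKubernetesClient = {
  val baseConfig: Config = Config.autoConfigure(kubeContext.orNull)
  new DefaultKubernetesClient(new ConfigBuilder(baseConfig).build())
}
```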
    
Fabric 8 skips any file-based auto-configuration if there is no K8S config file present (https://github.com/fabric8io/kubernetes-client/blob/master/kubernetes-client/src/main/java/io/fabric8/kubernetes/client/Config.java#L436-L459).
    
Since we don't propagate the submission client's config file into the driver pods, no auto-configuration from a config file will be attempted in the driver because there won't be a config file present.
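
As a rough sketch of what that means in the driver (standard in-cluster conventions assumed; this is not code from the PR), auto-configuration effectively reduces to the in-cluster defaults:

```scala
import io.fabric8.kubernetes.client.Config

// Inside a driver pod there is no K8S config file, so Fabric 8's
// auto-configuration falls back to the in-cluster defaults, roughly:
//   master = https://$KUBERNETES_SERVICE_HOST:$KUBERNETES_SERVICE_PORT
//   token  = /var/run/secrets/kubernetes.io/serviceaccount/token
//   caCert = /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
val driverConfig: Config = Config.autoConfigure(null)
```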

