Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23174#discussion_r239892984
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStep.scala ---
    @@ -87,44 +88,61 @@ private[spark] class BasicExecutorFeatureStep(kubernetesConf: KubernetesExecutor
         val executorCpuQuantity = new QuantityBuilder(false)
           .withAmount(executorCoresRequest)
           .build()
    -    val executorExtraClasspathEnv = executorExtraClasspath.map { cp =>
    -      new EnvVarBuilder()
    -        .withName(ENV_CLASSPATH)
    -        .withValue(cp)
    -        .build()
    -    }
    -    val executorExtraJavaOptionsEnv = kubernetesConf
    -      .get(EXECUTOR_JAVA_OPTIONS)
    -      .map { opts =>
    -        val subsOpts = Utils.substituteAppNExecIds(opts, kubernetesConf.appId,
    -          kubernetesConf.executorId)
    -        val delimitedOpts = Utils.splitCommandString(subsOpts)
    -        delimitedOpts.zipWithIndex.map {
    -          case (opt, index) =>
    -            new EnvVarBuilder().withName(s"$ENV_JAVA_OPT_PREFIX$index").withValue(opt).build()
    +
    +    val executorEnv: Seq[EnvVar] = {
    +        (Seq(
    +          (ENV_DRIVER_URL, driverUrl),
    +          (ENV_EXECUTOR_CORES, executorCores.toString),
    +          (ENV_EXECUTOR_MEMORY, executorMemoryString),
    +          (ENV_APPLICATION_ID, kubernetesConf.appId),
    +          // This is to set the SPARK_CONF_DIR to be /opt/spark/conf
    +          (ENV_SPARK_CONF_DIR, SPARK_CONF_DIR_INTERNAL),
    +          (ENV_EXECUTOR_ID, kubernetesConf.executorId)
    +        ) ++ kubernetesConf.environment).map { case (k, v) =>
    +          new EnvVarBuilder()
    +            .withName(k)
    +            .withValue(v)
    +            .build()
             }
    -      }.getOrElse(Seq.empty[EnvVar])
    -    val executorEnv = (Seq(
    -      (ENV_DRIVER_URL, driverUrl),
    -      (ENV_EXECUTOR_CORES, executorCores.toString),
    -      (ENV_EXECUTOR_MEMORY, executorMemoryString),
    -      (ENV_APPLICATION_ID, kubernetesConf.appId),
    -      // This is to set the SPARK_CONF_DIR to be /opt/spark/conf
    -      (ENV_SPARK_CONF_DIR, SPARK_CONF_DIR_INTERNAL),
    -      (ENV_EXECUTOR_ID, kubernetesConf.executorId)) ++
    -      kubernetesConf.environment)
    -      .map(env => new EnvVarBuilder()
    -        .withName(env._1)
    -        .withValue(env._2)
    -        .build()
    -      ) ++ Seq(
    -      new EnvVarBuilder()
    -        .withName(ENV_EXECUTOR_POD_IP)
    -        .withValueFrom(new EnvVarSourceBuilder()
    -          .withNewFieldRef("v1", "status.podIP")
    +      } ++ {
    +        Seq(new EnvVarBuilder()
    +          .withName(ENV_EXECUTOR_POD_IP)
    +          .withValueFrom(new EnvVarSourceBuilder()
    +            .withNewFieldRef("v1", "status.podIP")
    +            .build())
               .build())
    -        .build()
    -    ) ++ executorExtraJavaOptionsEnv ++ executorExtraClasspathEnv.toSeq
    +      } ++ {
    +        Option(secMgr.getSecretKey()).map { authSecret =>
    +          new EnvVarBuilder()
    +            .withName(SecurityManager.ENV_AUTH_SECRET)
    +            .withValue(authSecret)
    --- End diff ---
    
    > If the secret is put directly in the environment variable field itself, 
then anyone who has permission to get the pod metadata from the Kubernetes API 
server can now read the secret generated by this application.
    
    Yes, and it's extremely annoying that k8s allows anybody with access to the pods to read env variables, instead of just the pod owner. In fact, it doesn't even seem to have a concept of who owns a pod.
    
    Anyway, this isn't different from someone else being able to read secrets 
in the same namespace as the pod.
    
    As I said before, it all depends on how you configure your cluster for security, and in k8s there seem to be a lot of different options.
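    For reference, the alternative the quoted concern points toward is to have the env variable reference a Kubernetes Secret via `secretKeyRef` instead of embedding the literal value in the pod spec, so `kubectl get pod -o yaml` alone no longer reveals it (though, as noted above, anyone who can read Secrets in the same namespace still can). A minimal sketch; the Secret name `spark-auth-secret`, the key `auth-secret-key`, and the pod/image names are hypothetical, and `_SPARK_AUTH_SECRET` assumes the value of `SecurityManager.ENV_AUTH_SECRET`:

    ```yaml
    # Hypothetical executor pod fragment: the env var's value is pulled from a
    # Kubernetes Secret at container start, rather than written inline, so
    # reading the pod metadata alone does not expose the secret.
    apiVersion: v1
    kind: Pod
    metadata:
      name: spark-executor-example        # hypothetical name
    spec:
      containers:
        - name: executor
          image: spark-executor:latest    # hypothetical image
          env:
            - name: _SPARK_AUTH_SECRET    # SecurityManager.ENV_AUTH_SECRET
              valueFrom:
                secretKeyRef:
                  name: spark-auth-secret # hypothetical Secret name
                  key: auth-secret-key    # hypothetical key within the Secret
    ```

    This changes who can read the value through the pod object, but not the namespace-level trust model discussed above.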

