[GitHub] spark pull request #23252: [SPARK-26239] File-based secret key loading for S...
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r240022921

--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala ---
@@ -440,12 +473,27 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
           intercept[IllegalArgumentException] {
             mgr.getSecretKey()
           }
+        case FILE =>
+          val secretFile = createTempSecretFile()
+          conf.set(AUTH_SECRET_FILE, secretFile.getAbsolutePath)
+          mgr.initializeAuth()
+          assert(encodeFileAsBase64(secretFile) === mgr.getSecretKey())
         }
       }
     }
   )
 }

+  private def encodeFileAsBase64(secretFile: File) = {
+    Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
+  }
+
+  private def createTempSecretFile(contents: String = "test-secret"): File = {
+    val secretDir = Utils.createTempDir("temp-secrets")
+    val secretFile = new File(secretDir, "temp-secret.txt")
+    Files.write(secretFile.toPath, contents.getBytes(StandardCharsets.UTF_8))
+    secretFile
--- End diff --

ah it's fine

---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
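For reference, the round trip that the two helpers in the diff above perform can be sketched with JDK classes alone. The object and file names here are illustrative, not from the PR; only the write-then-Base64-encode behavior mirrors `createTempSecretFile` and `encodeFileAsBase64`:

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.Files
import java.util.Base64

// Write a known secret to a temp file, then Base64-encode its raw bytes,
// mirroring what the test helpers do before comparing against getSecretKey().
object SecretFileRoundTrip {
  def main(args: Array[String]): Unit = {
    val secretDir = Files.createTempDirectory("temp-secrets")
    val secretFile = secretDir.resolve("temp-secret.txt")
    Files.write(secretFile, "test-secret".getBytes(StandardCharsets.UTF_8))
    val encoded = Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile))
    println(encoded) // dGVzdC1zZWNyZXQ=
  }
}
```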
Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239990434

--- Diff: resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStepSuite.scala ---
@@ -16,10 +16,13 @@
  */
 package org.apache.spark.deploy.k8s.features

-import scala.collection.JavaConverters._
+import java.io.File
+import java.nio.charset.StandardCharsets
+import java.nio.file.Files

 import io.fabric8.kubernetes.api.model._
 import org.scalatest.BeforeAndAfter
+import scala.collection.JavaConverters._
--- End diff --

Think I fixed this.
Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239944338

--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -380,6 +400,12 @@ private[spark] class SecurityManager(
     }
   }

+  private def readSecretFile(secretFilePath: String): String = {
+    val secretFile = new File(secretFilePath)
+    require(secretFile.isFile, s"No file found containing the secret key at $secretFilePath.")
+    Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
--- End diff --

Will check non-empty only.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239898432

--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -380,6 +400,12 @@ private[spark] class SecurityManager(
     }
   }

+  private def readSecretFile(secretFilePath: String): String = {
+    val secretFile = new File(secretFilePath)
+    require(secretFile.isFile, s"No file found containing the secret key at $secretFilePath.")
+    Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
--- End diff --

It would be good to validate that this is at least not empty; optionally, that it contains at least the expected number of bytes (there's a config for that).
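A sketch of the helper with the non-empty check folded in, per the exchange above (mccheah's reply indicates only the non-empty check was adopted). The error-message wording is illustrative, not the final diff:

```scala
import java.io.File
import java.nio.file.Files
import java.util.Base64

// readSecretFile with the extra validation suggested above:
// require the file to exist and to hold at least one byte before encoding.
def readSecretFile(secretFilePath: String): String = {
  val secretFile = new File(secretFilePath)
  require(secretFile.isFile, s"No file found containing the secret key at $secretFilePath.")
  val secretBytes = Files.readAllBytes(secretFile.toPath)
  require(secretBytes.nonEmpty, s"Secret key file at $secretFilePath is empty.")
  Base64.getEncoder.encodeToString(secretBytes)
}
```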
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239899856

--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStep.scala ---
@@ -112,12 +112,14 @@ private[spark] class BasicExecutorFeatureStep(
           .build())
         .build())
     } ++ {
-      Option(secMgr.getSecretKey()).map { authSecret =>
-        new EnvVarBuilder()
-          .withName(SecurityManager.ENV_AUTH_SECRET)
-          .withValue(authSecret)
-          .build()
-      }
+      Option(secMgr.getSecretKey())
--- End diff --

```
if (!kubernetesConf.contains(...)) {
  // existing code
} else {
  None
}
```

I really don't like trying to mix different concerns in the same call chain...
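Applied to the snippet above, the suggested shape could look like the sketch below. To keep it self-contained, `EnvVar` is a stand-in case class for fabric8's builder output, `secretFileConfigured` stands in for `kubernetesConf.contains(AUTH_SECRET_FILE_EXECUTOR)`, and the env-var name is illustrative:

```scala
// Stand-in for io.fabric8's EnvVar so the sketch compiles on its own.
final case class EnvVar(name: String, value: String)

// Keep the "is file-based auth configured?" decision out of the call chain:
// only fall back to propagating the secret via env var when no secret file is set.
// secretKey stands in for secMgr.getSecretKey(), which may be null when auth is off.
def authSecretEnv(secretFileConfigured: Boolean, secretKey: String): Option[EnvVar] =
  if (!secretFileConfigured) {
    Option(secretKey).map(EnvVar("SPARK_AUTH_SECRET", _))
  } else {
    None
  }
```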
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r23991

--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala ---
@@ -440,12 +473,27 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
           intercept[IllegalArgumentException] {
             mgr.getSecretKey()
           }
+        case FILE =>
--- End diff --

add empty line before
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239900712

--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStep.scala ---
@@ -24,7 +24,7 @@ import org.apache.spark.{SecurityManager, SparkConf, SparkException}
 import org.apache.spark.deploy.k8s._
 import org.apache.spark.deploy.k8s.Config._
 import org.apache.spark.deploy.k8s.Constants._
-import org.apache.spark.internal.config.{EXECUTOR_CLASS_PATH, EXECUTOR_JAVA_OPTIONS, EXECUTOR_MEMORY, EXECUTOR_MEMORY_OVERHEAD, PYSPARK_EXECUTOR_MEMORY}
+import org.apache.spark.internal.config.{AUTH_SECRET_FILE_EXECUTOR, EXECUTOR_CLASS_PATH, EXECUTOR_JAVA_OPTIONS, EXECUTOR_MEMORY, EXECUTOR_MEMORY_OVERHEAD, PYSPARK_EXECUTOR_MEMORY}
--- End diff --

Seems like a good time to just use a wildcard here.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239897958

--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -367,11 +372,26 @@ private[spark] class SecurityManager(

       case _ =>
         require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
-          s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
+          s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config")
         return
     }

-    secretKey = Utils.createSecret(sparkConf)
+    if (sparkConf.get(AUTH_SECRET_FILE_DRIVER).isDefined
+      && sparkConf.get(AUTH_SECRET_FILE_EXECUTOR).isEmpty) {
--- End diff --

Indent the second line one more level. Also, you could do `sparkConf.contains(...DRIVER) == sparkConf.contains(...EXECUTOR)`.
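The `contains`-based consistency check suggested above could be sketched as follows. The config lookups are abbreviated to booleans so the snippet stands alone, and the error message is illustrative:

```scala
// Mirrors the suggestion above: either both driver and executor secret-file
// settings are present, or neither is; a mismatch is a configuration error.
def requireConsistentSecretFiles(driverFileSet: Boolean, executorFileSet: Boolean): Unit = {
  require(driverFileSet == executorFileSet,
    "Secret files must be configured for both the driver and the executors, or for neither.")
}
```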
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239900460

--- Diff: resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStepSuite.scala ---
@@ -158,6 +161,25 @@ class BasicExecutorFeatureStepSuite extends SparkFunSuite with BeforeAndAfter {
     checkEnv(executor, conf, Map(SecurityManager.ENV_AUTH_SECRET -> secMgr.getSecretKey()))
   }

+  test("Auth secret shouldn't propagate if files are loaded.") {
+    val secretDir = Utils.createTempDir("temp-secret")
+    val secretFile = new File(secretDir, "secret-file.txt")
+    Files.write(secretFile.toPath, "some-secret".getBytes(StandardCharsets.UTF_8))
+    val conf = baseConf.clone()
+      .set(NETWORK_AUTH_ENABLED, true)
+      .set(AUTH_SECRET_FILE, secretFile.getAbsolutePath)
+      .set("spark.master", "k8s://127.0.0.1")
+    val secMgr = new SecurityManager(conf)
+    secMgr.initializeAuth()
+
+    val step = new BasicExecutorFeatureStep(KubernetesTestConf.createExecutorConf(sparkConf = conf),
+      secMgr)
+
+    val executor = step.configurePod(SparkPod.initialPod())
+    assert(!executor.container.getEnv.asScala.map(_.getName).contains(
--- End diff --

`!KubernetesFeaturesTestUtils.containerHasEnvVar(...)`
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239900826

--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala ---
@@ -18,8 +18,11 @@ package org.apache.spark

 import java.io.File
+import java.nio.charset.StandardCharsets
--- End diff --

Unnecessary (see next line).
Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239884995

--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala ---
@@ -440,12 +473,27 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
           intercept[IllegalArgumentException] {
             mgr.getSecretKey()
           }
+        case FILE =>
+          val secretFile = createTempSecretFile()
+          conf.set(AUTH_SECRET_FILE, secretFile.getAbsolutePath)
+          mgr.initializeAuth()
+          assert(encodeFileAsBase64(secretFile) === mgr.getSecretKey())
         }
       }
     }
   )
 }

+  private def encodeFileAsBase64(secretFile: File) = {
+    Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
+  }
+
+  private def createTempSecretFile(contents: String = "test-secret"): File = {
+    val secretDir = Utils.createTempDir("temp-secrets")
+    val secretFile = new File(secretDir, "temp-secret.txt")
+    Files.write(secretFile.toPath, contents.getBytes(StandardCharsets.UTF_8))
+    secretFile
--- End diff --

Hm, can you clarify? The temporary directory should be generated with a random string. I believe this is sufficient for this test.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239706529

--- Diff: resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStepSuite.scala ---
@@ -16,10 +16,13 @@
  */
 package org.apache.spark.deploy.k8s.features

-import scala.collection.JavaConverters._
+import java.io.File
+import java.nio.charset.StandardCharsets
+import java.nio.file.Files

 import io.fabric8.kubernetes.api.model._
 import org.scalatest.BeforeAndAfter
+import scala.collection.JavaConverters._
--- End diff --

? Hi, @mccheah . We import `java.*` and `scala.*` before any others.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239705869

--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala ---
@@ -440,12 +473,27 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
           intercept[IllegalArgumentException] {
             mgr.getSecretKey()
           }
+        case FILE =>
+          val secretFile = createTempSecretFile()
+          conf.set(AUTH_SECRET_FILE, secretFile.getAbsolutePath)
+          mgr.initializeAuth()
+          assert(encodeFileAsBase64(secretFile) === mgr.getSecretKey())
         }
       }
     }
   )
 }

+  private def encodeFileAsBase64(secretFile: File) = {
+    Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
+  }
+
+  private def createTempSecretFile(contents: String = "test-secret"): File = {
+    val secretDir = Utils.createTempDir("temp-secrets")
+    val secretFile = new File(secretDir, "temp-secret.txt")
+    Files.write(secretFile.toPath, contents.getBytes(StandardCharsets.UTF_8))
+    secretFile
--- End diff --

Can this secret be recovered on disk, or do we trust that the tempDir ACL is sufficient?
Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23252#discussion_r239701821

--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -367,11 +372,26 @@ private[spark] class SecurityManager(

       case _ =>
         require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
-          s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
+          s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config")
         return
     }

-    secretKey = Utils.createSecret(sparkConf)
+    if (sparkConf.get(AUTH_SECRET_FILE_DRIVER).isDefined
--- End diff --

Also considered adding a validation that file-based auth can only be used for Kubernetes. Omitted it for simplicity, but file-based auth seems dubious for non-K8s contexts.
GitHub user mccheah opened a pull request:

    https://github.com/apache/spark/pull/23252

    [SPARK-26239] File-based secret key loading for SASL.

## What changes were proposed in this pull request?

This proposes an alternative way to load secret keys into a Spark application that is running on Kubernetes. Instead of automatically generating the secret, the secret key can reside in a file that is shared between both the driver and executor containers.

## How was this patch tested?

Unit tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/palantir/spark auth-secret-with-file

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/23252.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #23252

----

commit 957cb15a2d48b4cf2b5c7f1a8c124df3a53bf4d9
Author: mcheah
Date: 2018-12-07T05:04:56Z

    [SPARK-26239] File-based secret key loading for SASL.

    This proposes an alternative way to load secret keys into a Spark
    application that is running on Kubernetes. Instead of automatically
    generating the secret, the secret key can reside in a file that is
    shared between both the driver and executor containers.

---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
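As a usage sketch, the file-based mode described above would be driven by configuration along these lines. The property names assume the keys introduced by this change land as `spark.authenticate.secret.file` (with driver/executor-specific variants); the mount path is illustrative, and the mounted file must be identical in the driver and executor containers:

```
# spark-defaults.conf style; path is an example, e.g. a Kubernetes secret volume mount
spark.authenticate                true
spark.authenticate.secret.file    /mnt/secrets/spark-auth-secret
```

If the driver and executors mount the secret at different paths, the driver- and executor-specific variants (`spark.authenticate.secret.driver.file`, `spark.authenticate.secret.executor.file`) would be set instead, and the review above notes they must be set together or not at all.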