[ https://issues.apache.org/jira/browse/SPARK-26301?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-26301:
----------------------------------
    Affects Version/s: (was: 3.0.0)
                       3.1.0

> Consider switching from putting secret in environment variable directly to using secret reference
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26301
>                 URL: https://issues.apache.org/jira/browse/SPARK-26301
>             Project: Spark
>          Issue Type: New Feature
>          Components: Kubernetes
>    Affects Versions: 3.1.0
>            Reporter: Matt Cheah
>            Priority: Major
>
> In SPARK-26194 we proposed using an environment variable, loaded into the executor pod spec, to share the generated SASL secret key between the driver and the executors. In practice, however, this is very difficult to secure. Most traditional Kubernetes deployments handle permissions by granting wide access to view pod specs while restricting access to view Kubernetes Secrets. With the current approach, any user who can view the pod spec can also view the contents of the SASL secret.
>
> An example use case where this quickly breaks down: a systems administrator is allowed to look at pods that run user code in order to debug failing infrastructure, but should not be able to view the contents of secrets or other sensitive data from the Spark applications run by their users.
>
> We propose modifying the existing solution to instead automatically create a Kubernetes Secret object containing the SASL encryption key, then use the [secret reference feature in Kubernetes|https://kubernetes.io/docs/concepts/configuration/secret/#using-secrets-as-environment-variables] to populate the environment variable without putting the secret data in the pod spec itself.
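A minimal sketch of the proposed mechanism, assuming a Secret name, key, and environment-variable name chosen for illustration (the actual names would be generated by Spark's Kubernetes backend and are not fixed by this issue):

```yaml
# Hypothetical Secret object holding the generated SASL key.
apiVersion: v1
kind: Secret
metadata:
  name: spark-auth-secret        # illustrative name, not Spark's actual choice
type: Opaque
data:
  sasl-key: c2VjcmV0LWtleQ==     # base64-encoded key material
---
# Executor pod spec fragment: the env var is populated via secretKeyRef,
# so the pod spec itself never contains the key bytes. A user with
# "get pods" but not "get secrets" permission sees only the reference.
apiVersion: v1
kind: Pod
metadata:
  name: spark-executor-example
spec:
  containers:
    - name: spark-executor
      image: spark:latest       # illustrative image
      env:
        - name: SPARK_AUTH_SECRET   # illustrative variable name
          valueFrom:
            secretKeyRef:
              name: spark-auth-secret
              key: sasl-key
```

Compared with writing the value inline (`env: [{name: ..., value: ...}]`), the `valueFrom.secretKeyRef` form keeps the secret readable only to principals with RBAC access to the Secret object, which is exactly the separation of duties the issue describes.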
--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org