tgravescs commented on pull request #30738: URL: https://github.com/apache/spark/pull/30738#issuecomment-747580485
It's a matter of the confs, the environment, what is in the docker image, and keeping things consistent. If I create a docker image that contains Spark and the Spark confs, with a bunch of things set in spark-env.sh, but then when I launch my job I change the configs or explicitly launch it with different config files, it's still going to pick up the spark-env.sh from the docker image. It shouldn't do that; it should honor the launch-time configs so that things stay consistent and you get reliable results.

There is a driver and executor plugin API that runs at launch. It's a developer API, so the docs on it are probably not great: https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/api/plugin/SparkPlugin.java. It does run after startup, though, so I'm not sure exactly whether that is early enough for when you need this to run.
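As a rough illustration of that plugin API, here is a minimal sketch of a `SparkPlugin` implementation; the class name `EnvCheckPlugin` and what it logs are just placeholders for whatever startup logic you actually need, not anything from this PR.

```scala
import java.util.{Collections, Map => JMap}

import org.apache.spark.SparkContext
import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, PluginContext, SparkPlugin}

// Hypothetical plugin that logs the environment seen on the driver and on each
// executor as they come up, to check which confs/env actually took effect.
class EnvCheckPlugin extends SparkPlugin {

  override def driverPlugin(): DriverPlugin = new DriverPlugin {
    override def init(sc: SparkContext, ctx: PluginContext): JMap[String, String] = {
      // Runs once on the driver, after the SparkContext has been created.
      println(s"Driver SPARK_CONF_DIR=${sys.env.getOrElse("SPARK_CONF_DIR", "<unset>")}")
      Collections.emptyMap()
    }
  }

  override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {
    override def init(ctx: PluginContext, extraConf: JMap[String, String]): Unit = {
      // Runs once on each executor as it starts, before tasks are scheduled on it.
      println(s"Executor SPARK_CONF_DIR=${sys.env.getOrElse("SPARK_CONF_DIR", "<unset>")}")
    }
  }
}
```

It would be enabled by listing the class in `spark.plugins`, e.g. `--conf spark.plugins=com.example.EnvCheckPlugin` (package name hypothetical). Note this still runs after the JVM has started, which is the timing caveat mentioned above.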
