Arseniy Tashoyan created SPARK-34345: ----------------------------------------
Summary: Allow several properties files

Key: SPARK-34345
URL: https://issues.apache.org/jira/browse/SPARK-34345
Project: Spark
Issue Type: Improvement
Components: Spark Submit
Affects Versions: 3.0.1
Reporter: Arseniy Tashoyan

Example: we have two applications, A and B. These applications share some common Spark settings and also have some application-specific settings. The idea is to run them like this:

{code:bash}
spark-submit --properties-files common.properties,a.properties A
spark-submit --properties-files common.properties,b.properties B
{code}

Benefits:
- Common settings can be extracted to a shared file _common.properties_; there is no need to duplicate them in _a.properties_ and _b.properties_
- Applications can override common settings in their respective custom properties files

Currently, SparkSubmitArguments.scala provides the following mechanism: console arguments like _--conf key=value_ overwrite settings in the properties file. This is not enough, because console arguments have to be specified in the launcher script; de facto they belong to the binary distribution rather than to the configuration. Consider the following scenario: Spark on Kubernetes, with the configuration provided as ConfigMaps. We could have the following ConfigMaps:
- _a.properties_ // mounted to the Pod with application A
- _b.properties_ // mounted to the Pod with application B
- _common.properties_ // mounted to both Pods, with A and B

Meanwhile the launcher script _app-submit.sh_ is the same for both applications A and B, since it contains no configuration settings:

{code:bash}
spark-submit --properties-files common.properties,${app_name}.properties ...
{code}
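The "later file overrides earlier file" precedence proposed above can be sketched with plain java.util.Properties, since loading several sources into one Properties object in order already gives exactly these semantics. This is only an illustrative sketch of the intended merge behavior, not the actual SparkSubmitArguments implementation; the class and method names are made up, and the keys are just example Spark settings:

{code:java}
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class MergeProps {
    // Load several properties sources in order. Keys loaded later override
    // keys loaded earlier, which matches the intended precedence of
    // "--properties-files common.properties,a.properties".
    static Properties merge(String... sources) {
        Properties merged = new Properties();
        for (String src : sources) {
            try {
                merged.load(new StringReader(src));
            } catch (IOException e) {
                throw new UncheckedIOException(e); // cannot happen for StringReader
            }
        }
        return merged;
    }

    public static void main(String[] args) {
        // common.properties: shared defaults for both applications
        String common = "spark.executor.memory=2g\nspark.eventLog.enabled=true\n";
        // a.properties: application A overrides the executor memory
        String appA = "spark.executor.memory=4g\n";

        Properties effective = merge(common, appA);
        System.out.println(effective.getProperty("spark.executor.memory"));  // 4g (overridden)
        System.out.println(effective.getProperty("spark.eventLog.enabled")); // true (inherited)
    }
}
{code}

Console arguments like _--conf key=value_ would then still be applied last, on top of the merged result, preserving the existing behavior.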