GitHub user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19519#discussion_r145492860
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -235,11 +235,11 @@ object SparkSubmit extends CommandLineUtils with Logging {
private[deploy] def prepareSubmitEnvironment(
args: SparkSubmitArguments,
conf: Option[HadoopConfiguration] = None)
- : (Seq[String], Seq[String], Map[String, String], String) = {
+ : (Seq[String], Seq[String], SparkConf, String) = {
// Return values
val childArgs = new ArrayBuffer[String]()
val childClasspath = new ArrayBuffer[String]()
- val sysProps = new HashMap[String, String]()
+ val sparkConf = new SparkConf()
--- End diff --
Yes. Because this conf will now be exposed to apps (once I change the code to
extend `SparkApplication`), the conf needs to respect system properties.
In fact the previous version should probably have done that too from the
get-go.
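
For context, a minimal sketch of the `SparkConf` behavior this relies on. The
`spark.demo.flag` key is hypothetical, used only for illustration:

```scala
import org.apache.spark.SparkConf

object SparkConfSysPropsDemo {
  def main(args: Array[String]): Unit = {
    // Simulate a user-supplied -Dspark.demo.flag=true on the launch command.
    sys.props("spark.demo.flag") = "true"

    // `new SparkConf()` defaults to loadDefaults = true, which copies every
    // JVM system property whose key starts with "spark." into the conf, so
    // user-set properties are respected automatically.
    val conf = new SparkConf()
    assert(conf.get("spark.demo.flag") == "true")

    // `new SparkConf(false)` skips that step; properties set via -D would
    // be silently ignored, which is what the comment argues against.
    val bare = new SparkConf(false)
    assert(!bare.contains("spark.demo.flag"))
  }
}
```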
---