Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/19519#discussion_r145491446
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -235,11 +235,11 @@ object SparkSubmit extends CommandLineUtils with Logging {
private[deploy] def prepareSubmitEnvironment(
args: SparkSubmitArguments,
conf: Option[HadoopConfiguration] = None)
- : (Seq[String], Seq[String], Map[String, String], String) = {
+ : (Seq[String], Seq[String], SparkConf, String) = {
// Return values
val childArgs = new ArrayBuffer[String]()
val childClasspath = new ArrayBuffer[String]()
- val sysProps = new HashMap[String, String]()
+ val sparkConf = new SparkConf()
--- End diff ---
Hi, @vanzin.
Is it intentional that this now loads the default config? Previously, it didn't, at [line
340](https://github.com/apache/spark/pull/19519/files#diff-4d2ab44195558d5a9d5f15b8803ef39dL340).
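For anyone following along, the difference in question: `new SparkConf()` defaults to `loadDefaults = true`, which copies every `spark.*` JVM system property into the new conf, whereas the old `HashMap[String, String]` started out empty. A minimal sketch of that behavior (the property name below is just an example):

```scala
import org.apache.spark.SparkConf

object SparkConfDefaultsDemo {
  def main(args: Array[String]): Unit = {
    // Simulate a spark.* system property already set in the JVM.
    sys.props("spark.app.name") = "demo"

    // Same as new SparkConf(loadDefaults = true): picks up spark.* system properties.
    val loading = new SparkConf()
    // Starts empty, like the old HashMap-based sysProps.
    val empty = new SparkConf(loadDefaults = false)

    println(loading.contains("spark.app.name")) // true
    println(empty.contains("spark.app.name"))   // false
  }
}
```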
---