Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19616#discussion_r165516803
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
---
@@ -51,33 +52,16 @@ import org.apache.spark.util._
/**
* Common application master functionality for Spark on Yarn.
*/
-private[spark] class ApplicationMaster(args: ApplicationMasterArguments) extends Logging {
+private[spark] class ApplicationMaster(args: ApplicationMasterArguments, sparkConf: SparkConf,
--- End diff ---
This doesn't follow Spark's convention for multi-line arguments (see the sketch below for the expected layout).

This also looks a little odd now, because there are conflicting arguments: `ApplicationMasterArguments` is now only used in cluster mode, and everything else is expected to be provided through the other parameters. So while this is the simpler change, it's also a little ugly.

I don't really have a good suggestion right now, but it's something to think about.
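
For reference, a minimal sketch of the declaration formatted per the multi-line parameter convention, using only the parameters visible in this diff (the real signature may take more):

```scala
// Sketch only: Spark's style puts each parameter on its own line,
// indented four spaces, with the opening paren left hanging on the
// declaration line.
private[spark] class ApplicationMaster(
    args: ApplicationMasterArguments,
    sparkConf: SparkConf) extends Logging {
  // ...
}
```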
---