Github user skonto commented on a diff in the pull request:
https://github.com/apache/spark/pull/19374#discussion_r144947255
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -373,10 +374,16 @@ class SparkContext(config: SparkConf) extends Logging
{
     // log out spark.app.name in the Spark driver logs
     logInfo(s"Submitted application: $appName")
-    // System property spark.yarn.app.id must be set if user code ran by AM on a YARN cluster
-    if (master == "yarn" && deployMode == "cluster" && !_conf.contains("spark.yarn.app.id")) {
-      throw new SparkException("Detected yarn cluster mode, but isn't running on a cluster. " +
-        "Deployment to YARN is not supported directly by SparkContext. Please use spark-submit.")
+    // System property spark.yarn.app.id must be set if user code ran by AM on a YARN cluster or
+    // System property spark.mesos.driver.frameworkId must be set if user code ran by
+    // Mesos Dispatcher on a MESOS cluster
+    if (deployMode == "cluster") {
--- End diff --
Ok, you are probably right: we do a spark-submit again in that case. I will remove this part.
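
For context, a self-contained sketch of the generalized cluster-mode guard this hunk moves toward. The helper name `validateClusterDeploy`, the plain `Map` standing in for `SparkConf`, and `IllegalArgumentException` standing in for `SparkException` are all illustrative assumptions, not code from the PR:

```scala
// Hypothetical sketch: one guard covering both resource managers, as the
// diff's comment suggests. spark.yarn.app.id is set by the YARN AM, and
// spark.mesos.driver.frameworkId by the Mesos Dispatcher, so their absence
// in cluster deploy mode indicates SparkContext was created outside
// spark-submit.
object ClusterModeGuard {
  def validateClusterDeploy(master: String,
                            deployMode: String,
                            conf: Map[String, String]): Unit = {
    if (deployMode == "cluster") {
      val missingYarnId =
        master == "yarn" && !conf.contains("spark.yarn.app.id")
      val missingMesosId =
        master.startsWith("mesos") && !conf.contains("spark.mesos.driver.frameworkId")
      if (missingYarnId || missingMesosId) {
        // The real code throws SparkException; IllegalArgumentException is
        // used here only to keep the sketch dependency-free.
        throw new IllegalArgumentException(
          "Detected cluster deploy mode, but not running on a cluster. " +
            "Please use spark-submit.")
      }
    }
  }
}
```

As the comment above notes, when the driver is relaunched via spark-submit the relevant property is already set, so the check may be redundant and is being dropped from the PR.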
---