holdenk commented on a change in pull request #33211:
URL: https://github.com/apache/spark/pull/33211#discussion_r667095177
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesConf.scala
##########
@@ -225,6 +225,9 @@ private[spark] object KubernetesConf {
new KubernetesExecutorConf(sparkConf.clone(), appId, executorId,
driverPod, resourceProfileId)
}
+ def getKubernetesAppId(): String =
+ s"spark-${UUID.randomUUID().toString.replaceAll("-", "")}"
+
Review comment:
We should still allow the user to specify their own app ID.
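For illustration, a minimal sketch of what honoring a user-supplied ID could look like; the `conf` parameter and the object name below are assumptions for this example, not code from the PR:

```scala
import java.util.UUID

import org.apache.spark.SparkConf

// Illustrative sketch only: prefer a user-supplied spark.app.id and fall back
// to a locally generated Kubernetes-style ID when none is set.
private[spark] object KubernetesAppIdExample {
  def getKubernetesAppId(conf: SparkConf): String =
    conf.getOption("spark.app.id")
      .getOrElse(s"spark-${UUID.randomUUID().toString.replaceAll("-", "")}")
}
```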
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/KubernetesClusterSchedulerBackend.scala
##########
@@ -83,12 +84,12 @@ private[spark] class KubernetesClusterSchedulerBackend(
/**
* Get an application ID associated with the job.
* This returns the string value of spark.app.id if set, otherwise
- * the locally-generated ID from the superclass.
+ * the locally-generated ID.
*
* @return The application ID
*/
override def applicationId(): String = {
-
conf.getOption("spark.app.id").map(_.toString).getOrElse(super.applicationId)
+ conf.getOption("spark.app.id").map(_.toString).getOrElse(appId)
Review comment:
Would it make sense to change the superclass's appId implementation instead?
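A rough sketch of that alternative, assuming the shared fallback lives in a SchedulerBackend-style trait; the trait name and members below are illustrative, not the actual Spark source:

```scala
import org.apache.spark.SparkConf

// Illustrative sketch only: if the shared default consulted spark.app.id
// itself, backends would not need to override applicationId() individually.
private[spark] trait SchedulerBackendLike {
  def conf: SparkConf

  // Locally generated fallback, analogous to the appId that the base
  // scheduler backend already generates.
  private lazy val generatedAppId: String =
    "spark-application-" + System.currentTimeMillis

  def applicationId(): String =
    conf.getOption("spark.app.id").getOrElse(generatedAppId)
}
```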
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.