Github user gaborgsomogyi commented on a diff in the pull request:
https://github.com/apache/spark/pull/20807#discussion_r174032329
--- Diff: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala ---
@@ -496,7 +497,7 @@ private[yarn] class YarnAllocator(
executorIdCounter += 1
val executorHostname = container.getNodeId.getHost
val containerId = container.getId
- val executorId = executorIdCounter.toString
+ val executorId = (initialExecutorIdCounter + executorIdCounter).toString
--- End diff ---
The initial problem was that `initialExecutorIdCounter` comes from the
driver, which is already stopped. Making it lazy solved this. The second
integer is necessary because making it a `lazy var` is not possible.
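
For illustration only, a minimal sketch of the pattern described above: Scala
has `lazy val` but no `lazy var`, so the base value obtained from the driver is
deferred in a `lazy val` and a separate mutable counter holds the increment.
The names and the `fetchBaseIdFromDriver` callback are hypothetical stand-ins,
not the actual Spark / YarnAllocator API.

```scala
// Sketch of the lazy-base + mutable-counter pattern (hypothetical names,
// not the real YarnAllocator code).
class ExecutorIdAllocator(fetchBaseIdFromDriver: () => Int) {

  // Scala allows `lazy val` but not `lazy var`, so the value coming from the
  // driver is deferred: the callback runs only when an id is first requested.
  private lazy val initialExecutorIdCounter: Int = fetchBaseIdFromDriver()

  // The per-allocator increment must be mutable, hence a separate plain var.
  private var executorIdCounter: Int = 0

  def nextExecutorId(): String = {
    executorIdCounter += 1
    (initialExecutorIdCounter + executorIdCounter).toString
  }
}

// Usage example: with a base id of 41, the first id issued is "42".
object ExecutorIdAllocatorExample extends App {
  val allocator = new ExecutorIdAllocator(() => 41)
  println(allocator.nextExecutorId()) // 42
  println(allocator.nextExecutorId()) // 43
}
```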
---