mridulm commented on code in PR #45052:
URL: https://github.com/apache/spark/pull/45052#discussion_r1501375088
##########
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##########
@@ -177,15 +177,17 @@ private[spark] class HostLocalDirManager(
 * Manager running on every node (driver and executors) which provides interfaces for putting and
 * retrieving blocks both locally and remotely into various stores (memory, disk, and off-heap).
 *
- * Note that [[initialize()]] must be called before the BlockManager is usable.
+ * Note that [[initialize()]] must be called before the BlockManager is usable. Also, the
+ * `memoryManager` is initialized at a later stage after DriverPlugin is loaded, to allow the
+ * plugin to overwrite memory configurations.
*/
private[spark] class BlockManager(
val executorId: String,
rpcEnv: RpcEnv,
val master: BlockManagerMaster,
val serializerManager: SerializerManager,
val conf: SparkConf,
- memoryManager: MemoryManager,
+ var memoryManager: MemoryManager,
Review Comment:
If this is a task from some previous test, which has completed, the exception is benign at best and not a concern for non-test code paths, right?
If so, do we need to try to work around this (other than perhaps by fixing the offending tests)?
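
The deferred initialization described in the doc comment of the diff above can be sketched as follows. This is a minimal illustration, not the actual Spark code: `SimpleMemoryManager`, `SimplePlugin`, and `SimpleBlockManager` are hypothetical stand-ins for `MemoryManager`, `DriverPlugin`, and `BlockManager`, and the method names are assumptions.

```scala
// Hypothetical sketch of the pattern in the diff: the memory manager field is
// a var left unassigned at construction, and is only set during initialize()
// after a driver plugin has had a chance to overwrite memory configurations.

// Stand-in for MemoryManager: holds the resolved memory budget.
final case class SimpleMemoryManager(maxMemoryBytes: Long)

// Stand-in for a DriverPlugin that may override memory settings.
trait SimplePlugin {
  def overrideMaxMemory(current: Long): Long
}

// Stand-in for BlockManager: memoryManager is a var, assigned late.
final class SimpleBlockManager(val executorId: String) {
  // Deliberately uninitialized until initialize() runs, mirroring the
  // "initialized at a later stage after DriverPlugin is loaded" note.
  var memoryManager: SimpleMemoryManager = _

  def initialize(plugin: SimplePlugin, defaultMaxMemory: Long): Unit = {
    val resolved = plugin.overrideMaxMemory(defaultMaxMemory)
    memoryManager = SimpleMemoryManager(resolved)
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    val bm = new SimpleBlockManager("driver")
    require(bm.memoryManager == null) // not yet usable, as the doc comment warns

    val plugin = new SimplePlugin {
      def overrideMaxMemory(current: Long): Long = current * 2
    }
    bm.initialize(plugin, 512L * 1024 * 1024)
    println(bm.memoryManager.maxMemoryBytes) // 1073741824
  }
}
```

The trade-off the reviewer is probing is visible here: once the field is a mutable `var` set after construction, any code path (including leftover tasks from earlier tests) that touches the manager before `initialize()` observes a null field.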
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]