mccheah commented on a change in pull request #28618:
URL: https://github.com/apache/spark/pull/28618#discussion_r485994084
##########
File path:
core/src/main/java/org/apache/spark/shuffle/sort/io/LocalDiskShuffleExecutorComponents.java
##########
@@ -17,69 +17,64 @@
package org.apache.spark.shuffle.sort.io;
-import java.util.Map;
import java.util.Optional;
import com.google.common.annotations.VisibleForTesting;
+import com.google.common.base.Supplier;
+import com.google.common.base.Suppliers;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkEnv;
+import org.apache.spark.shuffle.IndexShuffleBlockResolver;
import org.apache.spark.shuffle.api.ShuffleExecutorComponents;
import org.apache.spark.shuffle.api.ShuffleMapOutputWriter;
-import org.apache.spark.shuffle.IndexShuffleBlockResolver;
import org.apache.spark.shuffle.api.SingleSpillShuffleMapOutputWriter;
-import org.apache.spark.storage.BlockManager;
public class LocalDiskShuffleExecutorComponents implements
ShuffleExecutorComponents {
private final SparkConf sparkConf;
- private BlockManager blockManager;
- private IndexShuffleBlockResolver blockResolver;
+ private final Supplier<IndexShuffleBlockResolver> blockResolver;
Review comment:
I believe this is a `Supplier` because, at the time the executor components are initialized, the SparkEnv instance has not been fully initialized yet, so we want the block resolver to be fetched from the SparkEnv lazily on first use.
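To illustrate the pattern being discussed, here is a minimal, self-contained sketch of lazy dependency lookup via a memoized `Supplier`. The class and method names (`Env`, `LazyDep`, `resolver()`) are illustrative stand-ins, not Spark's actual API, and the hand-rolled `memoize` just mirrors what Guava's `Suppliers.memoize` provides in the diff above:

```java
import java.util.function.Supplier;

// Sketch: capture a lookup at construction time, but defer evaluating it
// until first use, so construction can happen before the environment exists.
class LazyDep {
    // Stand-in for SparkEnv (illustrative only).
    static class Env {
        static Env instance; // may still be null when LazyDep is constructed
        String resolver() { return "resolver"; }
    }

    private final Supplier<String> blockResolver;

    LazyDep() {
        // Capture the lookup, not the value: Env.instance is read lazily.
        this.blockResolver = memoize(() -> Env.instance.resolver());
    }

    String resolve() {
        return blockResolver.get();
    }

    // Minimal memoizing wrapper, analogous to Guava's Suppliers.memoize:
    // the delegate runs at most once; later calls return the cached value.
    static <T> Supplier<T> memoize(Supplier<T> delegate) {
        return new Supplier<T>() {
            private T value;
            private boolean done;
            public synchronized T get() {
                if (!done) {
                    value = delegate.get();
                    done = true;
                }
                return value;
            }
        };
    }
}
```

The key point is that constructing `LazyDep` before `Env.instance` is set does not throw; the dereference only happens when `resolve()` is first called, by which time the environment is expected to be ready.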
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]