Ngone51 commented on a change in pull request #28924:
URL: https://github.com/apache/spark/pull/28924#discussion_r447579643
##########
File path: core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
##########
@@ -93,6 +94,7 @@ class BlockManagerSuite extends SparkFunSuite with Matchers with BeforeAndAfterE
.set(MEMORY_STORAGE_FRACTION, 0.999)
.set(Kryo.KRYO_SERIALIZER_BUFFER_SIZE.key, "1m")
.set(STORAGE_UNROLL_MEMORY_THRESHOLD, 512L)
+ .set(Network.RPC_ASK_TIMEOUT, "5s")
Review comment:
In the newly added tests, we need to simulate a timeout error from the
BlockManager. At the same time, we don't want the tests to run too long,
since the default timeout is 120s. Therefore, we choose a fairly short
timeout here. We also avoid an even smaller value, e.g. 1s, which could make
the tests flaky.
Note that ideally the timeout would be set locally for the newly added tests
rather than globally. However, due to limitations of the current test
framework on the Core side, setting it locally would require more changes.
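For illustration, a minimal sketch of what a per-test override could look like if the suite supported it. The `testConf` hook on `makeBlockManager` and the test name are assumptions for the sketch, not part of this PR; today the suite shares one `SparkConf`, which is why the global setting above is used instead.

```scala
// Sketch only: scope the short RPC ask timeout to a single test case,
// assuming a hypothetical per-test SparkConf hook on makeBlockManager.
test("block operation fails fast when the RPC ask times out") {
  // Clone the suite-wide conf and override the timeout just for this test.
  val timeoutConf = conf.clone().set(Network.RPC_ASK_TIMEOUT, "5s")
  val store = makeBlockManager(8000, "timeout-exec", testConf = Some(timeoutConf))
  // ... drive the code path that is expected to hit the 5s ask timeout ...
}
```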