dongjoon-hyun commented on a change in pull request #25333: [SPARK-28597][SS] Add config to retry spark streaming's meta log when it met error
URL: https://github.com/apache/spark/pull/25333#discussion_r313167034
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/MicroBatchExecutionSuite.scala
##########
@@ -68,4 +73,44 @@ class MicroBatchExecutionSuite extends StreamTest with BeforeAndAfter {
      CheckNewAnswer((25, 1), (30, 1)) // This should not throw the error reported in SPARK-24156
)
}
+
+  test("SPARK-28597: Add config to retry spark streaming's meta log when it met") {
+ val s = MemoryStream[Int]
+ val df = s.toDF()
+ // Specified checkpointLocation manually to init metadata file
+    val tmp =
+      new File(System.getProperty("java.io.tmpdir"), UUID.randomUUID().toString).getCanonicalPath
+ testStream(s.toDF())(
+ StartStream(checkpointLocation = tmp)
+ )
+
+ // fail with less retries
+ df.sparkSession.sessionState.conf.setConfString(
+ SQLConf.STREAMING_CHECKPOINT_FILE_MANAGER_CLASS.parent.key,
+ classOf[FakeFileSystemBasedCheckpointFileManager].getName)
+ df.sparkSession.sessionState.conf.setConfString(
+ SQLConf.STREAMING_META_DATA_NUM_RETRIES.key,
+ 1.toString)
+ intercept[Throwable] {
+ testStream(s.toDF())(
+ StartStream(checkpointLocation = tmp),
+ AddData(s, 1),
+ CheckAnswer(1)
+ )
+ }
 Review comment:
   `intercept[Throwable]`? It's unclear what you want to test here.
   It's a good habit to always check the exact exception type and the exact error message.
   Please search for `intercept` use cases in the Spark codebase; they will be a helpful reference.
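   A minimal, self-contained sketch of what the suggestion looks like: catch a specific exception type and assert on its message, rather than `intercept[Throwable]`. The `intercept` helper below is a hypothetical stand-in mirroring ScalaTest's `Assertions.intercept` so the snippet runs standalone; the exception type and message are illustrative, not taken from this PR.

   ```scala
   import scala.reflect.ClassTag

   object InterceptExample {
     // Hypothetical helper mirroring ScalaTest's intercept[T]: runs the body
     // and returns the thrown exception only if it matches the expected type.
     def intercept[T <: Throwable](body: => Any)(implicit ct: ClassTag[T]): T = {
       val thrown: Option[Throwable] =
         try { body; None } catch { case t: Throwable => Some(t) }
       thrown match {
         case Some(t) if ct.runtimeClass.isInstance(t) => t.asInstanceOf[T]
         case Some(t) =>
           throw new AssertionError(
             s"Expected ${ct.runtimeClass.getName} but got ${t.getClass.getName}")
         case None =>
           throw new AssertionError(
             s"Expected ${ct.runtimeClass.getName} but no exception was thrown")
       }
     }

     def main(args: Array[String]): Unit = {
       // Capture the exception, then assert on both the exact type and a
       // stable fragment of the message, instead of intercept[Throwable].
       val e = intercept[IllegalStateException] {
         throw new IllegalStateException("Error writing the streaming metadata log")
       }
       assert(e.getMessage.contains("metadata log"))
       println("ok")
     }
   }
   ```

   Pinning the type and message this way means the test fails if a different, unrelated error starts being thrown, which `intercept[Throwable]` would silently accept.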
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]