gengliangwang commented on a change in pull request #33229:
URL: https://github.com/apache/spark/pull/33229#discussion_r664400203
##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeTestUtils.scala
##########
@@ -54,6 +55,13 @@ object DateTimeTestUtils {
     "Europe/Amsterdam")
   val outstandingZoneIds: Seq[ZoneId] = outstandingTimezonesIds.map(getZoneId)
+  private val random = new Random(System.nanoTime())
+
+  // Take 2 samples from the `outstandingZoneIds`. This is useful when the test case is slow.

Review comment:
Originally, I did this to improve the test case `SPARK-34761,SPARK-35889: add a day-time interval to a timestamp`. But I am fine with your comment.

##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
##########
@@ -122,8 +124,8 @@ class DateExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper {
     (2000 to 2002).foreach { y =>
       (0 to 11 by 11).foreach { m =>
         c.set(y, m, 28)
-        (0 to 5 * 24).foreach { i =>
-          c.add(Calendar.HOUR_OF_DAY, 1)
+        (0 to 12).foreach { i =>

Review comment:
It is not used.

##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
##########
@@ -122,8 +124,8 @@ class DateExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper {
     (2000 to 2002).foreach { y =>
       (0 to 11 by 11).foreach { m =>
         c.set(y, m, 28)
-        (0 to 5 * 24).foreach { i =>
-          c.add(Calendar.HOUR_OF_DAY, 1)
+        (0 to 12).foreach { i =>

Review comment:
> Why? What is the reason to test the same HOUR_OF_DAY multiple times?

The HOUR_OF_DAY is increasing in the loop. I am simply following the original logic.
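As a side note on the HOUR_OF_DAY question above: `Calendar.add(Calendar.HOUR_OF_DAY, 1)` keeps moving the calendar forward, rolling over into the next day once 24 hours have been added, so each pass of the loop exercises a different hour. A minimal, self-contained sketch (illustration only, not code from the suite):

```scala
import java.util.{Calendar, TimeZone}

// Illustration only: each add(HOUR_OF_DAY, 1) advances the instant by one hour,
// so successive iterations see hour 1, 2, 3, ... and eventually the next day.
val c = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
c.clear()
c.set(2000, Calendar.JANUARY, 28, 0, 0, 0)
(0 to 12).foreach { _ =>
  c.add(Calendar.HOUR_OF_DAY, 1)
  println(f"day=${c.get(Calendar.DAY_OF_MONTH)}%02d hour=${c.get(Calendar.HOUR_OF_DAY)}%02d")
}
```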
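And on the first thread (`DateTimeTestUtils`): the hunk shown is truncated and contains only the new `random` field plus the comment, not the sampling helper itself. Below is a hedged sketch of what such a helper could look like; the object and method names (`ZoneIdSampling`, `randomOutstandingZoneIds`) are hypothetical placeholders, not the PR's actual API.

```scala
import java.time.ZoneId
import scala.util.Random

object ZoneIdSampling {
  // Clock-seeded Random, matching `new Random(System.nanoTime())` in the hunk above.
  private val random = new Random(System.nanoTime())

  // Hypothetical helper: shuffle the candidate zone ids and keep only `n` of them,
  // so a slow test case iterates over a small random subset on each run.
  def randomOutstandingZoneIds(all: Seq[ZoneId], n: Int = 2): Seq[ZoneId] =
    random.shuffle(all).take(n)
}
```

A slow test could then iterate over something like `ZoneIdSampling.randomOutstandingZoneIds(DateTimeTestUtils.outstandingZoneIds)` instead of the full `outstandingZoneIds` list.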