HeartSaVioR commented on a change in pull request #23660: [SPARK-26379][SS][FOLLOWUP] Use dummy TimeZoneId to avoid UnresolvedException in CurrentBatchTimestamp
URL: https://github.com/apache/spark/pull/23660#discussion_r251198547
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala
 ##########
 @@ -1081,21 +1081,19 @@ class StreamSuite extends StreamTest {
     }
   }
 
-  test("SPARK-26379 Structured Streaming - Exception on adding 
current_timestamp / current_date" +
+  test("SPARK-26379 Structured Streaming - Exception on adding 
current_timestamp " +
     " to Dataset - use v2 sink") {
     testCurrentTimestampOnStreamingQuery(useV2Sink = true)
   }
 
-  test("SPARK-26379 Structured Streaming - Exception on adding 
current_timestamp / current_date" +
+  test("SPARK-26379 Structured Streaming - Exception on adding 
current_timestamp " +
     " to Dataset - use v1 sink") {
     testCurrentTimestampOnStreamingQuery(useV2Sink = false)
   }
 
    private def testCurrentTimestampOnStreamingQuery(useV2Sink: Boolean): Unit = {
     val input = MemoryStream[Int]
-    val df = input.toDS()
-      .withColumn("cur_timestamp", lit(current_timestamp()))
-      .withColumn("cur_date", lit(current_date()))
+    val df = input.toDS().withColumn("cur_timestamp", lit(current_timestamp()))
 
 Review comment:
   Yeah, adding `cur_date` was the reason the UT passed even without the patch.
   
   I added only `current_timestamp()` first and added `current_date()` afterwards, which apparently let `cur_timestamp` be resolved even without any of the patches.
   (Though I'm not sure about the mechanism behind why that happens...)
   
   Nice finding!
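   
   For reference, here is a minimal sketch (not part of the PR) of the two column combinations being compared, assuming the implicits and `SQLContext` that a `StreamTest` suite provides are in scope:
   
   ```scala
   // Hypothetical reproduction sketch, assuming a StreamTest suite is in scope
   // (it supplies the implicit SQLContext/encoders that MemoryStream needs).
   import org.apache.spark.sql.execution.streaming.MemoryStream
   import org.apache.spark.sql.functions.{current_date, current_timestamp, lit}
   
   val input = MemoryStream[Int]
   
   // Variant kept by this diff: only current_timestamp. Without the dummy
   // TimeZoneId fix, resolving this column in a streaming plan should fail.
   val timestampOnly = input.toDS()
     .withColumn("cur_timestamp", lit(current_timestamp()))
   
   // Original variant: current_timestamp plus current_date. As noted above,
   // adding cur_date apparently let cur_timestamp resolve even without the
   // patch, which is why the old test passed.
   val both = input.toDS()
     .withColumn("cur_timestamp", lit(current_timestamp()))
     .withColumn("cur_date", lit(current_date()))
   ```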
