HyukjinKwon commented on a change in pull request #28858:
URL: https://github.com/apache/spark/pull/28858#discussion_r442765441



##########
File path: 
sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/UnsafeRow.java
##########
@@ -90,7 +90,8 @@ public static int calculateBitSetWidthInBytes(int numFields) {
           FloatType,
           DoubleType,
           DateType,
-          TimestampType
+          TimestampType,
+          TimeType

Review comment:
       The problem is that you also have to define the type mapping for PySpark and SparkR, and that's probably one of the trickier parts.
   I think it's best to loop in the dev mailing list and see if there's broad support for this. Otherwise, I'm not super supportive of it due to the maintenance cost.
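   Just to illustrate the scope of that mapping work, here is a minimal, hypothetical sketch of what a Python-side `TimeType` could look like, assuming a microseconds-since-midnight internal encoding. The class name, the encoding, and the conversions are illustrations only and are not part of `pyspark.sql.types`:

```python
# Hypothetical sketch: a PySpark-side mapping for a TIME type.
# Nothing here exists in pyspark.sql.types; names and encoding are assumptions.
import datetime

from pyspark.sql.types import AtomicType


class TimeType(AtomicType):
    """Time-of-day type, stored internally as microseconds since midnight."""

    def needConversion(self):
        # datetime.time values need converting to/from the internal long.
        return True

    def toInternal(self, t):
        if t is not None:
            return ((t.hour * 60 + t.minute) * 60 + t.second) * 1_000_000 + t.microsecond

    def fromInternal(self, micros):
        if micros is not None:
            seconds, us = divmod(micros, 1_000_000)
            minutes, s = divmod(seconds, 60)
            h, m = divmod(minutes, 60)
            return datetime.time(h, m, s, us)

    def simpleString(self):
        return "time"
```

   Even with something like that, the type would still have to be wired into PySpark's Python-to-Catalyst conversion tables and schema string parsing, and SparkR keeps a similar type mapping on its side that would need a matching entry, which is where the maintenance cost comes from.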



