[GitHub] [spark] bart-samwel commented on a change in pull request #28534: [SPARK-31710][SQL]TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS to timestamp transfer

2020-05-20 Thread GitBox


bart-samwel commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r427967213



##
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##
@@ -401,6 +401,78 @@ case class DayOfYear(child: Expression) extends 
UnaryExpression with ImplicitCas
   }
 }
 
+abstract class NumberToTimestampBase extends UnaryExpression
+  with ImplicitCastInputTypes {

Review comment:
   It does -- it only casts string literals though, not variables. I think 
it's fine to allow this, given that this is Spark semantics. It *will* make it 
harder to add other overloads for these functions later (especially if we want 
string overloads that do something else), but I don't expect that to happen, so 
we're fine.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] bart-samwel commented on a change in pull request #28534: [SPARK-31710][SQL]TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS to timestamp transfer

2020-05-20 Thread GitBox


bart-samwel commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r427884768



##
File path: sql/core/src/test/resources/sql-tests/inputs/datetime.sql
##
@@ -4,6 +4,9 @@
 select 
TIMESTAMP_SECONDS(1230219000),TIMESTAMP_SECONDS(-1230219000),TIMESTAMP_SECONDS(null);
 select 
TIMESTAMP_MILLISECONDS(1230219000123),TIMESTAMP_MILLISECONDS(-1230219000123),TIMESTAMP_MILLISECONDS(null);
 select 
TIMESTAMP_MICROSECONDS(1230219000123123),TIMESTAMP_MICROSECONDS(-1230219000123123),TIMESTAMP_MICROSECONDS(null);
+-- overflow exception:
+select 
TIMESTAMP_SECONDS(1230219000123123),TIMESTAMP_SECONDS(-1230219000123123),TIMESTAMP_SECONDS(null);

Review comment:
   These cases don't need the "null" case. Just one case per SELECT please, 
since you can only test one exception at a time -- once the first one is 
triggered, the remaining operands aren't even evaluated.
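
   A quick JVM-level sketch of why that is (plain Java, not the Spark source; it assumes the conversion overflows via `Math.multiplyExact`, so the constants and helper here are illustrative):

   ```java
   public class OneExceptionAtATime {
       // Hypothetical stand-in for TIMESTAMP_SECONDS' upscale step:
       // Math.multiplyExact throws ArithmeticException on long overflow.
       static long secondsToMicros(long seconds) {
           return Math.multiplyExact(seconds, 1_000_000L);
       }

       public static void main(String[] args) {
           boolean reachedSecond = false;
           try {
               long a = secondsToMicros(1_230_219_000_123_123L);  // overflows first
               long b = secondsToMicros(-1_230_219_000_123_123L); // never evaluated
               reachedSecond = true;
           } catch (ArithmeticException e) {
               // Only the first failure is ever observable in one statement.
               System.out.println("second operand evaluated: " + reachedSecond);
           }
       }
   }
   ```

   So a SELECT with two overflowing operands only ever exercises the first one; the second is dead test code.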








[GitHub] [spark] bart-samwel commented on a change in pull request #28534: [SPARK-31710][SQL]TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS to timestamp transfer

2020-05-20 Thread GitBox


bart-samwel commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r427840937



##
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##
@@ -401,6 +401,78 @@ case class DayOfYear(child: Expression) extends 
UnaryExpression with ImplicitCas
   }
 }
 
+@ExpressionDescription(
+  usage = "_FUNC_(seconds) - Creates timestamp from the number of seconds 
since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(1230219000);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class SecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = MICROS_PER_SECOND

Review comment:
   No, I mean that the type should be Long, just like the base class:
   ```suggestion
 override def upScaleFactor: Long = MICROS_PER_SECOND
   ```
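
   For reference, the scaling itself is just a plain long multiplication per unit (a sketch, not the Spark implementation; the constant names mirror the ones in the diff, and the overflow-checked multiply is an assumption about the base class):

   ```java
   public class UpScale {
       // Factors converting each input unit to microseconds, as plain longs.
       static final long MICROS_PER_SECOND = 1_000_000L;
       static final long MICROS_PER_MILLIS = 1_000L;
       static final long MICROS_PER_MICROS = 1L;

       // Overflow-checked upscale: throws ArithmeticException on long overflow.
       static long toMicros(long value, long upScaleFactor) {
           return Math.multiplyExact(value, upScaleFactor);
       }

       public static void main(String[] args) {
           System.out.println(toMicros(1_230_219_000L, MICROS_PER_SECOND)); // 1230219000000000
       }
   }
   ```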








[GitHub] [spark] bart-samwel commented on a change in pull request #28534: [SPARK-31710][SQL]TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS to timestamp transfer

2020-05-20 Thread GitBox


bart-samwel commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r427840563



##
File path: sql/core/src/test/resources/sql-tests/inputs/datetime.sql
##
@@ -1,5 +1,10 @@
 -- date time functions
 
+-- [SPARK-31710] TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and 
TIMESTAMP_MICROSECONDS to timestamp transfer
+select 
TIMESTAMP_SECONDS(1230219000),TIMESTAMP_SECONDS(-1230219000),TIMESTAMP_SECONDS(null);

Review comment:
   The SQL test suite can also show exceptions, so let's keep it there as 
well.








[GitHub] [spark] bart-samwel commented on a change in pull request #28534: [SPARK-31710][SQL]TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS to timestamp transfer

2020-05-20 Thread GitBox


bart-samwel commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r427818024



##
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##
@@ -401,6 +401,78 @@ case class DayOfYear(child: Expression) extends 
UnaryExpression with ImplicitCas
   }
 }
 
+@ExpressionDescription(
+  usage = "_FUNC_(seconds) - Creates timestamp from the number of seconds 
since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(1230219000);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class SecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = MICROS_PER_SECOND
+
+  override def prettyName: String = "timestamp_seconds"
+}
+
+@ExpressionDescription(
+  usage = "_FUNC_(milliseconds) - " +
+"Creates timestamp from the number of milliseconds since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(123021900);

Review comment:
   Include precision in the example, e.g. 
   ```suggestion
 > SELECT _FUNC_(1230219123);
"2008-12-25 07:30:00.123"
   ```
   
   Same for micros below.
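
   A quick sanity check of the suggested values with plain `java.time` (not Spark; note the wall-clock time in the docs presumably reflects the test suite's session timezone, while `Instant` prints UTC):

   ```java
   import java.time.Instant;

   public class MillisPrecision {
       public static void main(String[] args) {
           // An epoch-millis value keeps its sub-second part:
           Instant t = Instant.ofEpochMilli(1_230_219_000_123L);
           System.out.println(t); // 2008-12-25T15:30:00.123Z
       }
   }
   ```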

##
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##
@@ -401,6 +401,78 @@ case class DayOfYear(child: Expression) extends 
UnaryExpression with ImplicitCas
   }
 }
 
+@ExpressionDescription(
+  usage = "_FUNC_(seconds) - Creates timestamp from the number of seconds 
since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(1230219000);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class SecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = MICROS_PER_SECOND
+
+  override def prettyName: String = "timestamp_seconds"
+}
+
+@ExpressionDescription(
+  usage = "_FUNC_(milliseconds) - " +
+"Creates timestamp from the number of milliseconds since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(123021900);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class MilliSecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = MICROS_PER_MILLIS
+
+  override def prettyName: String = "timestamp_milliseconds"
+}
+
+@ExpressionDescription(
+  usage = "_FUNC_(microseconds) - " +
+"Creates timestamp from the number of microseconds since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(12302190);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class MicroSecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = 1L
+
+  override def prettyName: String = "timestamp_microseconds"
+}
+
+abstract class NumberToTimestampBase extends UnaryExpression

Review comment:
   From a readability perspective it may be better to put the base class 
*before* the subclasses.

##
File path: sql/core/src/test/resources/sql-tests/inputs/datetime.sql
##
@@ -1,5 +1,10 @@
 -- date time functions
 
+-- [SPARK-31710] TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and 
TIMESTAMP_MICROSECONDS to timestamp transfer
+select 
TIMESTAMP_SECONDS(1230219000),TIMESTAMP_SECONDS(-1230219000),TIMESTAMP_SECONDS(null);

Review comment:
   Is it possible to test an out-of-range case, i.e., where the *1000 or 
*1000000 overflows, and to show that it returns an error?
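
   Picking such an input is straightforward, assuming the upscale is an overflow-checked long multiply (a sketch, not the Spark code): the largest seconds value that survives *1000000 is `Long.MAX_VALUE / 1_000_000`, and one more overflows.

   ```java
   public class OverflowBoundary {
       public static void main(String[] args) {
           long maxSeconds = Long.MAX_VALUE / 1_000_000L; // 9223372036854
           // Still fits in a long:
           System.out.println(Math.multiplyExact(maxSeconds, 1_000_000L));
           try {
               // One second more overflows and raises an error:
               Math.multiplyExact(maxSeconds + 1, 1_000_000L);
           } catch (ArithmeticException e) {
               System.out.println("overflow: " + e.getMessage()); // "long overflow"
           }
       }
   }
   ```

   Any input larger than that boundary (such as the micros-magnitude literals already in the test file) would do for the error case.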

##
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##
@@ -401,6 +401,78 @@ case class DayOfYear(child: Expression) extends 
UnaryExpression with ImplicitCas
   }
 }
 
+@ExpressionDescription(
+  usage = "_FUNC_(seconds) - Creates timestamp from the number of seconds 
since UTC epoch.",
+  examples = """
+Examples:
+  > SELECT _FUNC_(1230219000);
+   "2008-12-25 07:30:00"
+  """,
+  group = "datetime_funcs",
+  since = "3.1.0")
+case class SecondsToTimestamp(child: Expression)
+  extends NumberToTimestampBase {
+
+  override def upScaleFactor: SQLTimestamp = MICROS_PER_SECOND

Review comment:
   Is that factor really a timestamp? That seems weird, it's just a Long in 
the base class.




