GitHub user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/7398#discussion_r34765649
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/types/AbstractDataType.scala ---
@@ -114,6 +114,15 @@ private[sql] object TypeCollection {
     BooleanType,
     ByteType, ShortType, IntegerType, LongType)
 
+  /**
+   * Types that include numeric types and interval type. They are only used in unary_minus,
+   * unary_positive, add and subtract operations.
+   */
+  val NumericAndInterval = TypeCollection(
+    ByteType, ShortType, IntegerType, LongType,
+    FloatType, DoubleType, DecimalType,
+    IntervalType)
+
--- End diff ---
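As context for how a collection like this is meant to behave, here is a minimal standalone sketch (plain Scala, not the actual Catalyst classes) of the type-collection idea: an expected type that accepts an input whose type matches any of its members. All names below are illustrative.

```scala
// Standalone model of the TypeCollection idea; not the real Catalyst API.
object NumericAndIntervalSketch {

  sealed trait AbstractType
  case object IntType      extends AbstractType
  case object DoubleType   extends AbstractType
  case object IntervalType extends AbstractType
  case object StringType   extends AbstractType

  // A collection accepts a concrete type iff any of its members matches it.
  final case class TypeCollection(members: AbstractType*) {
    def acceptsType(other: AbstractType): Boolean = members.contains(other)
  }

  // Rough analogue of the collection added in this diff, with fewer members.
  val NumericAndInterval = TypeCollection(IntType, DoubleType, IntervalType)

  def main(args: Array[String]): Unit = {
    println(NumericAndInterval.acceptsType(IntervalType)) // true
    println(NumericAndInterval.acceptsType(StringType))   // false
  }
}
```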
OK. Another problem, though it might be minor: the casting error message
will become `argument 1 is expected to be of type (numeric or interval)`
instead of `...type (tinyint or smallint or int or bigint or float or double or
decimal or interval...`. Is it still informative enough? A sketch of the two
messages follows below.
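To make the trade-off concrete, here is a standalone sketch of the two candidate messages. It assumes the error text is built from a `simpleString` that joins the member type names with " or "; the message template and the joining are assumptions for illustration, not quoted from the source.

```scala
// Standalone sketch of the two candidate error messages; the message
// template and the " or " joining are assumptions for illustration.
object ErrorMessageSketch {
  def render(memberNames: Seq[String]): String =
    memberNames.mkString("(", " or ", ")")

  def main(args: Array[String]): Unit = {
    val enumerated = Seq("tinyint", "smallint", "int", "bigint",
                         "float", "double", "decimal", "interval")
    val abstracted = Seq("numeric", "interval")

    // Long but explicit: names every concrete type the argument may take.
    println(s"argument 1 is expected to be of type ${render(enumerated)}")
    // Short but vaguer: the reader must know which types "numeric" covers.
    println(s"argument 1 is expected to be of type ${render(abstracted)}")
  }
}
```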
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]