Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/23042#discussion_r234091858
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
@@ -138,6 +138,11 @@ object TypeCoercion {
     case (DateType, TimestampType)
       => if (conf.compareDateTimestampInTimestamp) Some(TimestampType) else Some(StringType)
+    // to support a popular use case of tables using Decimal(X, 0) for long IDs instead of strings
+    // see SPARK-26070 for more details
+    case (n: DecimalType, s: StringType) if n.scale == 0 =>
+      Some(DecimalType(n.precision, n.scale))
--- End diff ---
CC @gatorsmile @mgaido91 I think it's time to look at the SQL standard and
other mainstream databases, and see how we should update the type coercion
rules with a safe mode. What do you think?
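
For anyone skimming the thread, here is a minimal, illustrative sketch of the
kind of comparison this rule targets. The `accounts` view, column names and app
name below are made up for the example (not taken from the PR); the point is to
inspect how the analyzed plan casts the string literal in the WHERE clause when
the id column is DECIMAL(X, 0), as in the SPARK-26070 use case.

    import org.apache.spark.sql.SparkSession

    object DecimalIdCoercionSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("decimal-id-coercion-sketch")
          .getOrCreate()

        // Hypothetical table whose ID column is stored as DECIMAL(20, 0)
        // instead of a string.
        spark.range(5)
          .selectExpr("CAST(id AS DECIMAL(20, 0)) AS id", "CONCAT('user_', id) AS name")
          .createOrReplaceTempView("accounts")

        // With the rule in the diff, the string literal on the right-hand side
        // would be coerced to DECIMAL(20, 0), keeping the filter a decimal
        // comparison; the explain output shows which cast the analyzer picks.
        spark.sql("SELECT name FROM accounts WHERE id = '3'").explain(true)

        spark.stop()
      }
    }

Running it and comparing the analyzed plan with and without the proposed rule
is a quick way to see the coercion decision this thread is discussing.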
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]