Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/10283#discussion_r47488206
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala ---
@@ -61,6 +61,7 @@ object ScalaReflection extends ScalaReflection {
case t if t <:< definitions.ByteTpe => ByteType
case t if t <:< definitions.BooleanTpe => BooleanType
case t if t <:< localTypeOf[Array[Byte]] => BinaryType
+ case t if t <:< localTypeOf[Decimal] => DecimalType.SYSTEM_DEFAULT
--- End diff --
Normally `Decimal` should only be used inside Spark SQL as the internal
representation of the decimal type, so we shouldn't need to catch it here. Do we
break that assumption in tests?
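
For reference, user code is expected to pass decimals as `scala.math.BigDecimal`
or `java.math.BigDecimal`, which this file already maps to `DecimalType`. A
minimal sketch of that path (assuming the 1.6-era API in this file, where
`schemaFor` is the reflection entry point and `Schema` its result type):

```scala
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.DecimalType

// schemaFor resolves a Scala type to its Catalyst schema via reflection.
// scala.math.BigDecimal is already handled by an existing case, so the
// internal Decimal class should never reach this mapping from user code.
val schema = ScalaReflection.schemaFor[BigDecimal]
assert(schema.dataType == DecimalType.SYSTEM_DEFAULT)
```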