Github user liancheng commented on a diff in the pull request:
https://github.com/apache/spark/pull/6796#discussion_r33487239
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableSupport.scala ---
@@ -369,9 +371,6 @@ private[parquet] class MutableRowWriteSupport extends RowWriteSupport {
case DateType => writer.addInteger(record.getInt(index))
case TimestampType => writeTimestamp(record.getLong(index))
case d: DecimalType =>
- if (d.precisionInfo == None || d.precisionInfo.get.precision > 18) {
- sys.error(s"Unsupported datatype $d, cannot write to consumer")
- }
--- End diff --
Had an offline discussion with @yhuai but forgot to post a summary here: in the
end we decided not to convert unlimited decimals to `decimal(10, 0)` implicitly
in #6617. First, we would need to confirm that all other parts work
consistently, which might introduce unexpected complexity into #6617; second,
an implicit conversion like this can easily become a huge footgun. So let's
still report an error when `d.precisionInfo == None` (but please throw an
`AnalysisException` instead of using `sys.error`).
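
For concreteness, here is a minimal sketch of what the kept check could look
like with that change. It assumes the 1.4-era `DecimalType.precisionInfo` API
shown in the diff above and that the `AnalysisException` constructor is
reachable from the `org.apache.spark.sql.parquet` package; `DecimalWriteCheck`
and `assertWritable` are hypothetical names used only for illustration:

```scala
package org.apache.spark.sql.parquet

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.types.DecimalType

// Hypothetical helper sketching the suggested behavior: keep rejecting
// unlimited-precision (and wider-than-18-digit) decimals, but raise an
// AnalysisException instead of calling sys.error.
private[parquet] object DecimalWriteCheck {
  def assertWritable(d: DecimalType): Unit = {
    if (d.precisionInfo.isEmpty || d.precisionInfo.get.precision > 18) {
      throw new AnalysisException(s"Unsupported datatype $d, cannot write to consumer")
    }
  }
}
```

The `case d: DecimalType =>` branch in `MutableRowWriteSupport` would then call
`DecimalWriteCheck.assertWritable(d)` before writing the value.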