Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/18655#discussion_r128146875
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/execution/arrow/ArrowConvertersSuite.scala
---
@@ -391,6 +392,85 @@ class ArrowConvertersSuite extends SharedSQLContext
with BeforeAndAfterAll {
collectAndValidate(df, json, "floating_point-double_precision.json")
}
+ ignore("decimal conversion") {
--- End diff ---
Oh, I'm sorry, I should have mentioned it.
It seems that `JsonFileReader` doesn't support `DecimalType`, so I ignored the
test for now.
But now I'm wondering: if Arrow 0.4.0 has a bug in the decimal type as you
said, should I remove decimal type support from this PR and add it in
follow-up PRs?