GitHub user viirya opened a pull request:
https://github.com/apache/spark/pull/4729
[SPARK-5950][SQL] Enable inserting array into Hive table saved as Parquet
using DataSource API
Currently, `ParquetConversions` in `HiveMetastoreCatalog` does not really
work. One reason is that the table is not among the child nodes of
`InsertIntoTable`, so the replacement never takes effect.
When we create a Parquet table in Hive with an ARRAY field, `ArrayType`
defaults to `containsNull = true`, and this is reflected in the table's
schema. However, when data is later inserted into the table, the schema of
the inserted data may have `containsNull` set to either true or false,
which causes the insert or the subsequent read to fail.
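As a rough sketch of the mismatch (field name `arr` and element type are
made up for illustration; `ArrayType` and `StructType` are Spark SQL types):

```scala
import org.apache.spark.sql.types._

// Schema of a table created in Hive: ArrayType defaults to containsNull = true
val tableSchema = StructType(Seq(
  StructField("arr", ArrayType(IntegerType, containsNull = true))))

// Schema of data being inserted may carry containsNull = false instead
val insertSchema = StructType(Seq(
  StructField("arr", ArrayType(IntegerType, containsNull = false))))

// A strict schema comparison treats these as different types,
// so the insert/read path fails even though the data is compatible
println(tableSchema == insertSchema)  // false
```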
A similar problem is reported in
https://issues.apache.org/jira/browse/SPARK-5508.
Hive seems to support only arrays with nullable elements, so this PR
enables the same behavior.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/viirya/spark-1 hive_parquet
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/4729.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #4729
----
commit 4e3bd5568e644bc81e2539a917329486ea968a92
Author: Liang-Chi Hsieh <[email protected]>
Date: 2015-02-23T17:03:30Z
Enable inserting array into hive table saved as parquet using datasource.
----