[ https://issues.apache.org/jira/browse/SPARK-9757?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Lian updated SPARK-9757:
------------------------------
    Description: 
{{ParquetHiveSerDe}} in Hive versions < 1.2.0 doesn't support the decimal type. After SPARK-6923, persisting a Parquet relation to a metastore of such a version (say 0.13.1) throws the following exception:
{code}
Caused by: java.lang.UnsupportedOperationException: Parquet does not support decimal. See HIVE-6384
        at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.getObjectInspector(ArrayWritableObjectInspector.java:102)
        at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.<init>(ArrayWritableObjectInspector.java:60)
        at org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe.initialize(ParquetHiveSerDe.java:113)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:339)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:288)
        at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:194)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:597)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:576)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply$mcV$sp(ClientWrapper.scala:358)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:356)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:356)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:256)
        at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:211)
        at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:248)
        at org.apache.spark.sql.hive.client.ClientWrapper.createTable(ClientWrapper.scala:356)
        at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:351)
        at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:198)
        at org.apache.spark.sql.hive.execution.CreateMetastoreDataSource.run(commands.scala:152)
{code}
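For reference, a minimal reproduction sketch. This is hypothetical: it assumes a Spark 1.5.0 shell where {{sqlContext}} is a {{HiveContext}} backed by a Hive 0.13.1 metastore, and the table name and schema are made up for illustration.

{code}
// Hypothetical repro against a Hive 0.13.1 metastore (Spark 1.5.0 shell).
// Persisting any Parquet data source table with a decimal column goes through
// CreateMetastoreDataSource, which hands the schema to the old
// ParquetHiveSerDe and hits HIVE-6384.
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

val schema = StructType(Seq(StructField("price", DecimalType(10, 2))))
val df = sqlContext.createDataFrame(
  sc.parallelize(Seq(Row(BigDecimal("1.23")))), schema)

// Expected to throw:
// java.lang.UnsupportedOperationException: Parquet does not support decimal.
df.write.format("parquet").saveAsTable("decimal_parquet_table")
{code}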


> Can't create persistent data source tables with decimal
> -------------------------------------------------------
>
>                 Key: SPARK-9757
>                 URL: https://issues.apache.org/jira/browse/SPARK-9757
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: Michael Armbrust
>            Priority: Blocker
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
