[
https://issues.apache.org/jira/browse/HIVE-6425?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15753400#comment-15753400
]
Sangita commented on HIVE-6425:
-------------------------------
I was facing the same issue. It was resolved by making the following change in the
Hive metastore database:
-- log into the Hive metastore DB, then run (MySQL syntax):
-- ALTER TABLE SERDE_PARAMS MODIFY PARAM_VALUE VARCHAR(400000000);
https://community.hortonworks.com/questions/33311/number-column-limitations-in-hive-over-hbase-table.html
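For context on why the INSERT into SERDE_PARAMS fails: the entire HBase column
mapping is stored as a single value for the serde property
hbase.columns.mapping, and the stock metastore schema declares
SERDE_PARAMS.PARAM_VALUE as VARCHAR(4000). A minimal sketch (the
column-family and qualifier names are hypothetical) shows that a mapping for
3000+ columns easily overflows that limit:

```python
# Sketch: estimate the size of an hbase.columns.mapping string for a
# wide table. "cf" and "colN" are made-up names for illustration.
n_columns = 3000
mapping = ":key," + ",".join(f"cf:col{i}" for i in range(n_columns))

# The whole string is stored as ONE row in SERDE_PARAMS, under the
# PARAM_KEY "hbase.columns.mapping" -- so its length must fit in
# PARAM_VALUE, which defaults to VARCHAR(4000).
print(len(mapping))
```

This is why widening PARAM_VALUE (as in the comment above) makes the
CREATE EXTERNAL TABLE succeed.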
> Unable to create external table with 3000+ columns
> --------------------------------------------------
>
> Key: HIVE-6425
> URL: https://issues.apache.org/jira/browse/HIVE-6425
> Project: Hive
> Issue Type: Bug
> Components: Metastore
> Affects Versions: 0.10.0
> Environment: Linux, CDH 4.2.0
> Reporter: Anurag
> Labels: patch
> Attachments: Hive_Script.txt
>
>
> While creating an external table in Hive over an HBase table with 3000+
> columns, Hive reports an error:
> FAILED: Error in metadata:
> MetaException(message:javax.jdo.JDODataStoreException: Put request failed :
> INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES
> (?,?,?)
> NestedThrowables:
> org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO
> "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?) )
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)