That exception is coming from the metastore (trying to write the table 
definition).  Could you dig down into the Hive logs to see if you can get the 
underlying cause?
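
One guess while you dig (an assumption on my part, not something the error 
alone proves): the backticks in the failing INSERT look like a MySQL-backed 
metastore, and the entire hbase.columns.mapping string is stored as a single 
PARAM_VALUE row in SERDE_PARAMS, whose VARCHAR width is fixed by the metastore 
schema. You could rule that out by checking the column definition against the 
metastore database, e.g.:

  -- run against the metastore database (assumes MySQL)
  SHOW COLUMNS FROM SERDE_PARAMS LIKE 'PARAM_VALUE';

If the declared width is smaller than your mapping string, that would explain 
why the INSERT fails.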

You can get the logs to spew to the console by adding "-hiveconf 
hive.root.logger=DEBUG,console" to your Hive CLI invocation.
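
For example (assuming the hive launcher is on your PATH):

  hive -hiveconf hive.root.logger=DEBUG,console

The DEBUG output is noisy, but the stack trace around the failed INSERT should 
show the root cause coming back from the metastore's JDBC layer.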

JVS

On Jun 15, 2010, at 11:57 AM, Ray Duong wrote:

Hi,

I'm trying to map an HBase table in Hive that contains a large number of 
columns.  Since HBase is designed for wide tables, does the Hive/HBase 
integration have any set limit on the number of columns it can map in one 
table?  I seem to hit a limit at 10 columns.

Thanks,
-ray

create external table hbase_t1
(
key string,
f1_a string,
f2_a string,
f1_b string,
f2_b string,
...
...
f1_m string,
f2_m string
)
 STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
 WITH SERDEPROPERTIES ("hbase.columns.mapping" = 
":key,f1:a,f2:a,f1:b,f2:b,f1:c,f2:c,f1:d,f2:d,f1:e,f2:e,f1:f,f2:f,f1:g,f2:g,f1:h,f2:h,f1:i,f2:i,f1:j,f2:j,f1:k,f2:k,f1:l,f2:l,f1:m,f2:m"
 )
 TBLPROPERTIES("hbase.table.name" = "t1");

Error Message:

FAILED: Error in metadata: javax.jdo.JDODataStoreException: Put request failed 
: INSERT INTO `SERDE_PARAMS` (`PARAM_VALUE`,`SERDE_ID`,`PARAM_KEY`) VALUES 
(?,?,?)
NestedThrowables:
org.datanucleus.store.mapped.exceptions.MappedDatastoreException: INSERT INTO 
`SERDE_PARAMS` (`PARAM_VALUE`,`SERDE_ID`,`PARAM_KEY`) VALUES (?,?,?)
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.DDLTask



