Hi,

This is a common error when configuring a MySQL-backed Hive metastore; it is usually caused by the character set of the metastore database. Please go through the links below to resolve the issue:

https://qnalist.com/questions/206026/help-regarding-mysql-setup-for-metastore
http://www.programering.com/a/MTMygDNwATk.html
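The usual remedy discussed in those links is to switch the metastore database to a single-byte character set, so that its indexed VARCHAR columns stay within InnoDB's 767-byte key limit. A minimal sketch follows; the database name "metastore" and the root credentials are assumptions, so adjust them to your installation:

```shell
# Sketch only: "metastore" is an assumed database name; use the one from
# your javax.jdo.option.ConnectionURL in hive-site.xml.
# With utf8 (3 bytes per character), Hive's 255+ character key columns
# exceed InnoDB's 767-byte index limit; latin1 (1 byte per character)
# keeps them within it.
mysql -u root -p -e "ALTER DATABASE metastore CHARACTER SET latin1 COLLATE latin1_bin;"
```

If the metastore schema tables were already created with utf8, dropping and recreating the schema after this change (or altering the existing tables' character set as the links describe) may also be necessary.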
Regards,
Ravindra.

On 30 June 2016 at 13:04, 刘军 <[email protected]> wrote:
> Hi all, I followed https://github.com/HuaweiBigData/carbondata/wiki/Quick-Start
> and everything went fine at first, until this create-table statement:
> scala> cc.sql("create table if not exists table1 (id string, name string, city string, age Int) STORED BY 'org.apache.carbondata.format'")
> It fails with the following error:
> INFO 30-06 15:32:28,408 - main Query [CREATE TABLE IF NOT EXISTS TABLE1 (ID STRING, NAME STRING, CITY STRING, AGE INT) STORED BY 'ORG.APACHE.CARBONDATA.FORMAT']
> INFO 30-06 15:32:28,420 - Parsing command: create table if not exists table1 (id string, name string, city string, age Int) STORED BY 'org.apache.carbondata.format'
> INFO 30-06 15:32:28,421 - Parse Completed
> AUDIT 30-06 15:32:28,426 - [Pro.local]Creating Table with Database name [default] and Table name [table1]
> INFO 30-06 15:32:28,526 - Table table1 for Database default created successfully.
> INFO 30-06 15:32:28,527 - main Table table1 for Database default created successfully.
> INFO 30-06 15:32:28,527 - main Query [CREATE TABLE DEFAULT.TABLE1 USING ORG.APACHE.SPARK.SQL.CARBONSOURCE OPTIONS (TABLENAME "DEFAULT.TABLE1", TABLEPATH "./CARBONDATA/STORE/DEFAULT/TABLE1/METADATA") ]
> WARN 30-06 15:32:28,605 - Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source relation `default`.`table1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
> WARN 30-06 15:32:49,830 - MetaStoreClient lost connection. Attempting to reconnect.
> MetaException(message:javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
>         at sun.reflect.GeneratedConstructorAccessor30.newInstance(Unknown Source)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>         at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
>         at com.mysql.jdbc.Util.getInstance(Util.java:386)
>         at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1053)
>         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4096)
>         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4028)
>         at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2490)
>         at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2651)
>         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2728)
>         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2678)
>         at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:894)
>         at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:732)
>         at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
>         at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
>         at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:711)
>         at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:425)
>         at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:488)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3380)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
>         at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
>         at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
>         at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
>         at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
>         at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
>         at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
>         at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
>         at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
>         at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
>         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
>         at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
>         at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
>         at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
>         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
>         at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:814)
>         at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>         at com.sun.proxy.$Proxy2.createTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1416)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1449)
>         at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>         at com.sun.proxy.$Proxy4.create_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9200)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9184)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
>         at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
>         at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
>         at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
>
> Could everyone please help take a look? Thanks.

-- 
Thanks & Regards,
Ravi
