Yes, I'm using the latest hive-default.xml. What I'm showing is just the contents of my hive-site.xml.
The NUCLEUS_TABLES table and all of its columns listed in the exception exist in the DB, which is what's puzzling me.

On Wed, Aug 5, 2009 at 10:51 AM, Prasad Chakka <[email protected]> wrote:

> Are you using the latest hive-default.xml? It should contain more
> datanucleus properties than below. It is looking for a table called
> 'NUCLEUS_TABLES' which contains the list of tables that got created when
> the original schema was created.
>
> Prasad
>
> ------------------------------
> From: Bill Graham <[email protected]>
> Reply-To: <[email protected]>, <[email protected]>
> Date: Wed, 5 Aug 2009 10:19:24 -0700
> To: <[email protected]>
> Subject: Errors creating MySQL metastore
>
> Hi,
>
> I'm trying to set up a MySQL metastore for Hive and I'm getting the
> exceptions shown below. If anyone could shed some insight as to why this
> is happening, it would be greatly appreciated.
>
> My hive-site.xml is also attached below. This is how it looks when I
> start the Hive client with an empty db. The schema gets created, but the
> errors attached below appear. After the db is created I change
> datanucleus.autoCreateSchema to false before restarting the client. The
> same errors appear whenever I restart the client and run "show tables",
> which takes about 30 seconds to complete.
>
> Any ideas how to fix this? I've experimented with many combinations of
> datanucleus.autoCreateColumns and datanucleus.identifier.case, but
> nothing makes a difference.
>
> <property>
>   <name>hive.metastore.local</name>
>   <value>true</value>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>   <value>jdbc:mysql://xxxxx:11000/hive</value>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionDriverName</name>
>   <value>com.mysql.jdbc.Driver</value>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionUserName</name>
>   <value>xxxx</value>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionPassword</name>
>   <value>xxxx</value>
> </property>
>
> <property>
>   <name>datanucleus.autoCreateSchema</name>
>   <value>true</value>
> </property>
>
>
> 2009-08-05 10:05:46,543 ERROR Datastore.Schema (Log4JLogger.java:error(115)) -
> Failed to validate SchemaTable for Schema "". Either it doesnt exist, or
> doesnt validate : Required columns missing from table "NUCLEUS_TABLES" :
> `TABLE_NAME`, VERSION, CLASS_NAME, INTERFACE_NAME, OWNER, `TYPE`. Perhaps
> your MetaData is incorrect, or you havent enabled
> "datanucleus.autoCreateColumns".
> Required columns missing from table "NUCLEUS_TABLES" : `TABLE_NAME`,
> VERSION, CLASS_NAME, INTERFACE_NAME, OWNER, `TYPE`. Perhaps your MetaData
> is incorrect, or you havent enabled "datanucleus.autoCreateColumns".
> org.datanucleus.store.rdbms.exceptions.MissingColumnException: Required
> columns missing from table "NUCLEUS_TABLES" : `TABLE_NAME`, VERSION,
> CLASS_NAME, INTERFACE_NAME, OWNER, `TYPE`. Perhaps your MetaData is
> incorrect, or you havent enabled "datanucleus.autoCreateColumns".
> at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:280)
> at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:173)
> at org.datanucleus.store.rdbms.SchemaAutoStarter.<init>(SchemaAutoStarter.java:101)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
> at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
> at org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:486)
> at org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
> at org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
> at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
> at org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
> at org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
> at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
> at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
> at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:161)
> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:178)
> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:122)
> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:101)
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:129)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:145)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:117)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:99)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:72)
> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:775)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:786)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:398)
> at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:387)
> at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:347)
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:138)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:357)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:263)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:122)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:173)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:266)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> thanks,
> Bill
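For anyone hitting the same error: one quick sanity check is to compare what MySQL actually reports for NUCLEUS_TABLES against the columns DataNucleus claims are missing. This is only a diagnostic sketch; the column list comes from the error message above, and the database name (`hive`) is taken from the ConnectionURL in the posted config.

```sql
-- Run against the metastore database (`hive` in the JDBC URL above).
-- DataNucleus expects these columns to validate:
--   TABLE_NAME, VERSION, CLASS_NAME, INTERFACE_NAME, OWNER, TYPE
SHOW COLUMNS FROM NUCLEUS_TABLES;
```

If the columns are present but differ in letter case or quoting from what the error lists, that would point at an identifier-case mismatch between what DataNucleus generates and what MySQL stores, which may be why the datanucleus.identifier.case experiments mentioned above are relevant.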
