Thanks, made some progress, but now getting this...
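(Carl's suggestion below, put as commands, was roughly this on my machine; the find line is just my rendering of "locate the old ivy jars and remove them":

    find $HOME/.ant $ANT_HOME/lib -name 'ivy*.jar'   # locate the old copies, then remove them by hand
    ant package -Dhadoop.version=0.20.1              # rebuild

That got me past the Ivy error, so the build itself completes now. The trace below is what the HiveServer console shows.)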

10/02/22 22:19:50 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
10/02/22 22:19:50 INFO DataNucleus.Persistence: Managing Persistence of org.apache.hadoop.hive.metastore.model.MDatabase since it was managed previously
10/02/22 22:19:50 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo" at line 4, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
10/02/22 22:19:50 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/home/training/hive/hive-trunk/build/dist/lib/hive-metastore-0.6.0.jar!/package.jdo" at line 291, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 WARN DataNucleus.Persistence: Unknown Error during auto starter execution. : Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
org.datanucleus.exceptions.NucleusDataStoreException: Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:152)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
    at org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
    at org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
    at org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown Source)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    ... 42 more
Caused by: java.sql.SQLException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 49 more
Nested Throwables StackTrace:
java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown Source)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at org.datanucleus.store.AbstractStoreManager.initialiseAutoStart(AbstractStoreManager.java:609)
    at org.datanucleus.store.rdbms.RDBMSManager.initialiseSchema(RDBMSManager.java:821)
    at org.datanucleus.store.rdbms.RDBMSManager.<init>(RDBMSManager.java:394)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:576)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.store.FederationManager.initialiseStoreManager(FederationManager.java:106)
    at org.datanucleus.store.FederationManager.<init>(FederationManager.java:68)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:152)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:529)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:163)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:180)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:121)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:99)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:145)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.sql.SQLException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 49 more

10/02/22 22:19:51 WARN DataNucleus.Persistence: Illegal state of AutoStart, disabling it. To enable it, resolve earlier errors.
10/02/22 22:19:51 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
10/02/22 22:19:51 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
10/02/22 22:19:51 ERROR server.TThreadPoolServer: Error occurred during processing of message.
java.lang.RuntimeException: javax.jdo.JDODataStoreException: Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
NestedThrowables:
java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:368)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:245)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: javax.jdo.JDODataStoreException: Exception thrown performing schema operation : Add classes to Catalog "", Schema "APP"
NestedThrowables:
java.sql.SQLNonTransientConnectionException: No current connection.
    at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:289)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3741)
    at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
    at org.datanucleus.store.rdbms.query.QueryCompiler.executionCompile(QueryCompiler.java:312)
    at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:174)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1443)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.executeQuery(JDOQLQuery.java:244)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1357)
    at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:242)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:293)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:312)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:163)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:131)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:104)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:79)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:365)
    ... 4 more
Caused by: java.sql.SQLNonTransientConnectionException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.noCurrentConnection(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.checkIfClosed(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.getAutoCommit(Unknown Source)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
    at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:691)
    at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:358)
    at org.datanucleus.store.rdbms.RDBMSManager.getExtent(RDBMSManager.java:1344)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3736)
    ... 19 more
Caused by: java.sql.SQLException: No current connection.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 32 more

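From a bit of searching, the two things in this log seem to be (1) the package.jdo parser warnings, which apparently show up when an older or duplicate DataNucleus/JDO jar gets picked up ahead of the ones Hive ships in build/dist/lib, and (2) the Derby "No current connection" error, which I gather can happen when the embedded metastore_db is already open in another JVM (embedded Derby allows only one) or was left behind in a bad state by an earlier run. This is what I plan to check next; the paths are only guesses from my setup:

    # look for stale or duplicate copies of the persistence jars on the runtime classpath
    # (the dist lib from the log above plus the Hadoop lib dir; adjust for anything else on the classpath)
    find /home/training/hive/hive-trunk/build/dist/lib $HADOOP_HOME/lib \
        -name 'datanucleus*.jar' -o -name 'jdo*.jar'

    # the embedded Derby metastore lives in ./metastore_db under whatever directory
    # HiveServer was started from; check whether it exists and still holds a lock file
    ls -l metastore_db derby.log

    # if nothing in it is worth keeping yet, wipe it and restart HiveServer so the
    # schema gets created from scratch
    rm -rf metastore_db derby.log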

On Mon, Feb 22, 2010 at 5:39 PM, Carl Steinbach <[email protected]> wrote:

> Ant is finding an older version of Ivy and using it instead of the newer
> copy that the Hive build script automatically downloads. The older copy of
> Ivy is probably located somewhere under $HOME/.ant and/or in $ANT_HOME/lib.
> You need to locate these old versions of the ivy jar and remove them. Then
> run the build script again and everything should work. Optionally, you can
> also copy the new version of ivy located in hive-trunk/build/ivy/lib to the
> places where you previously found the old versions of Ivy (do this after
> building Hive).
>
> Carl
>
>
>
>
> On Mon, Feb 22, 2010 at 5:20 PM, Something Something <
> [email protected]> wrote:
>
>> Carl is right.  I have /user/hivee/warehouse and /tmp created under HDFS.
>>
>> Carl - I am trying to follow your instructions.  Getting this...
>>
>> /home/training/hive/hive-trunk/build-common.xml:180: impossible to
>> configure ivy:settings with given file:
>> /home/training/hive/hive-trunk/ivy/ivysettings.xml :
>> java.text.ParseException: failed to load settings from
>> file:/home/training/hive/hive-trunk/ivy/ivysettings.xml: impossible to set
>> defaultTTL to eternal on class
>> org.apache.ivy.core.cache.DefaultRepositoryCacheManager
>>
>>
>> when I run:  ant package -Dhadoop.version=0.20.1
>>
>> Any ideas?
>>
>>
>> On Mon, Feb 22, 2010 at 5:12 PM, Carl Steinbach <[email protected]> wrote:
>>
>>>
>>> I think the problem is that Hive server set hive.metastore.warehouse.dir
>>>> to /user/hive/warehouse. So we have to create the directory before running
>>>> TestJdbcDriver and TestHiveServer.
>>>>
>>>
>>> Based on the output of HiveServer that something posted it looks like
>>> /user/hive/warehouse already exists:
>>>
>>>
>>>
>>>
>>> > 10/02/21 11:25:16 INFO metastore.warehouse: Deleted the diretory hdfs://localhost:9000/user/hive/warehouse/testhivedrivertable
>>>
>>> I also don't see any errors in the HiveServer output, which makes me
>>> think that the problem is due to library skew on the client side.
>>>
>>> Carl
>>>
>>
>>
>
