Have you tried setting "hive.in.test" to "true"? This should get rid of
many of the "table X does not exist" errors you were seeing. I know of two
other projects that have upgraded from Hive 2 to 3 and run embedded
Metastore services and/or HS2 instances; their pull requests might
give you pointers on some common changes needed to get things working:

https://github.com/klarna/HiveRunner/pull/126/files

https://github.com/HotelsDotCom/beeju/pull/50
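
In a hive-site.xml on the test classpath, that flag would look like:

```xml
<property>
    <name>hive.in.test</name>
    <value>true</value>
</property>
```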

Hope that helps!

Adrian

On Sat, 20 Feb 2021 at 05:15, James Baiera <james.bai...@elastic.co> wrote:

> So, last update on my issues. I have reached a point where the embedded
> hive server starts up without any exceptions thrown. It seems that I needed
> to disable direct sql. It's not entirely clear to me what that setting
> actually does, but it seems to clear up the SQL execution warnings in
> embedded mode.
>
> So final result: even when embedding HiveServer2 into a Java application
> for testing, you should specify your configurations in a hive-site.xml file
> in the resource dir. Here was mine in the end:
>
> <configuration>
>     <property>
>         <name>hive.metastore.local</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.metastore.schema.verification</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>hive.metastore.schema.verification.record.version</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>datanucleus.schema.autoCreateAll</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.metastore.fastpath</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>metastore.try.direct.sql</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>hive.exec.scratchdir</name>
>         <value>/tmp/hive</value>
>     </property>
>     <property>
>         <name>fs.permissions.umask-mode</name>
>         <value>022</value>
>     </property>
>     <property>
>         <name>hive.metastore.warehouse.dir</name>
>         <value>/tmp/hive/warehouse_${testSeed}</value>
>     </property>
>     <property>
>         <name>hive.metastore.metadb.dir</name>
>         <value>/tmp/hive/metastore_db_${testSeed}</value>
>     </property>
>     <property>
>         <name>javax.jdo.option.ConnectionURL</name>
>         <value>jdbc:derby:;databaseName=/tmp/hive/metastore_db_${testSeed};create=true</value>
>     </property>
> </configuration>
>
> I just expand the settings file with my build tool to force a clean
> environment every run. My tests ended up all passing with this config, so
> hopefully it's a helpful resolution for anyone else who might have run into
> these issues.
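
(For illustration, the ${testSeed} expansion above could be sketched outside of a build tool like this; the template string and seed scheme are assumptions:)

```python
# Sketch: expand the ${testSeed} placeholder in a hive-site.xml template so
# each test run gets a fresh metastore directory. The template string and
# the seed scheme here are assumptions for illustration.
import uuid
from string import Template

template = Template(
    "<value>jdbc:derby:;databaseName=/tmp/hive/metastore_db_${testSeed};create=true</value>"
)

seed = uuid.uuid4().hex  # any value unique per run works
expanded = template.substitute(testSeed=seed)
print(expanded)
```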
>
> On Fri, Feb 19, 2021 at 4:00 PM James Baiera <james.bai...@elastic.co>
> wrote:
>
>> Correction: there are still exceptions related to the metastore not
>> having schemas created, but they are not keeping the service from starting.
>> Things still seem a little sketchy, though: this many exceptions on each
>> startup makes me worried. I'd love to hear if anyone has other
>> ideas about what might be going on.
>>
>> On Fri, Feb 19, 2021 at 3:51 PM James Baiera <james.bai...@elastic.co>
>> wrote:
>>
>>> Hi Stamatis,
>>>
>>> Thanks for the input. I just tried using an in-memory database within
>>> Derby, but it didn't address the core problem: I'm still getting errors
>>> that the self-test query is failing because the tables within the metastore
>>> do not exist.
>>>
>>> I took a look around your project, and clearly things are working for
>>> you there. I had a thought that maybe some settings weren't making it into
>>> the embedded Hive server because they were not in a config file (I'm
>>> passing all the configs into the embedded instance in a locally created
>>> HiveConf object). I placed a hive-site.xml file with my configurations in
>>> it on the classpath, and now it does get past the initial test query. It's
>>> a little frustrating that the configuration object passed into HiveServer2
>>> is not used consistently through the application (my guess is that a fresh
>>> HiveConf is being constructed somewhere).
>>>
>>> I'm still getting exceptions in my output that it cannot read the
>>> current tables, but it says right afterward that the exception doesn't
>>> indicate an error. There are parts of the startup process that have been
>>> throwing quite a few exceptions aside from this, but it seems that testing
>>> is not halted outright because of it.
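
To keep the classpath workaround reproducible, generating the hive-site.xml from test setup could be sketched like this (the output path and the property set are assumptions for illustration):

```python
# Sketch: generate a hive-site.xml into the test resources directory so it
# ends up on the classpath. The output path and property values mirror the
# settings discussed in this thread, but are assumptions for illustration.
import os
import xml.etree.ElementTree as ET

props = {
    "hive.metastore.schema.verification": "false",
    "datanucleus.schema.autoCreateAll": "true",
    "metastore.try.direct.sql": "false",
}

conf = ET.Element("configuration")
for name, value in props.items():
    prop = ET.SubElement(conf, "property")
    ET.SubElement(prop, "name").text = name
    ET.SubElement(prop, "value").text = value

out_dir = "src/test/resources"
os.makedirs(out_dir, exist_ok=True)
ET.ElementTree(conf).write(os.path.join(out_dir, "hive-site.xml"),
                           xml_declaration=True, encoding="utf-8")
```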
>>>
>>>     [2021-02-19T13:41:01,206][WARN
>>> ][org.apache.hadoop.hive.metastore.MetaStoreDirectSql] Failed to execute
>>> [SELECT "TBLS"."TBL_NAME" FROM "TBLS"  INNER JOIN "DBS" ON "TBLS"."DB_ID" =
>>> "DBS"."DB_ID"  WHERE "DBS"."NAME" = ? AND "DBS"."CTLG_NAME" = ? AND
>>> "TBLS"."TBL_TYPE" = ? ] with parameters [default, hive, MATERIALIZED_VIEW]
>>>     javax.jdo.JDODataStoreException: Error executing SQL query "SELECT
>>> "TBLS"."TBL_NAME" FROM "TBLS"  INNER JOIN "DBS" ON "TBLS"."DB_ID" =
>>> "DBS"."DB_ID"  WHERE "DBS"."NAME" = ? AND "DBS"."CTLG_NAME" = ? AND
>>> "TBLS"."TBL_TYPE" = ?".
>>>         at
>>> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>         at
>>> org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391)
>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>         at
>>> org.datanucleus.api.jdo.JDOQuery.executeWithArray(JDOQuery.java:267)
>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>         at
>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:2003)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getTables(MetaStoreDirectSql.java:420)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.ObjectStore$5.getSqlResult(ObjectStore.java:1640)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.ObjectStore$5.getSqlResult(ObjectStore.java:1636)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.run(ObjectStore.java:3577)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.ObjectStore.getTablesInternal(ObjectStore.java:1648)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.ObjectStore.getTables(ObjectStore.java:1534)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> ~[?:1.8.0_171]
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> ~[?:1.8.0_171]
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> ~[?:1.8.0_171]
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>> ~[?:1.8.0_171]
>>>         at
>>> org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at com.sun.proxy.$Proxy43.getTables(Unknown Source) [?:?]
>>>         at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_tables_by_type(HiveMetaStore.java:5082)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> ~[?:1.8.0_171]
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> ~[?:1.8.0_171]
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> ~[?:1.8.0_171]
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>> ~[?:1.8.0_171]
>>>         at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at com.sun.proxy.$Proxy45.get_tables_by_type(Unknown Source)
>>> [?:?]
>>>         at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTables(HiveMetaStoreClient.java:1676)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTables(HiveMetaStoreClient.java:1665)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.ql.metadata.Hive.getTablesByType(Hive.java:1310)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.ql.metadata.Hive.getTableObjects(Hive.java:1222)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.ql.metadata.Hive.getAllMaterializedViewObjects(Hive.java:1217)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> org.apache.hadoop.hive.ql.metadata.HiveMaterializedViewsRegistry$Loader.run(HiveMaterializedViewsRegistry.java:166)
>>> [hive-exec-3.1.2.jar:3.1.2]
>>>         at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> [?:1.8.0_171]
>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> [?:1.8.0_171]
>>>         at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> [?:1.8.0_171]
>>>         at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> [?:1.8.0_171]
>>>         at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
>>>     Caused by: java.sql.SQLSyntaxErrorException: Table/View 'TBLS' does
>>> not exist.
>>>         at
>>> org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement.<init>(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement42.<init>(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.jdbc.Driver42.newEmbedPreparedStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> com.zaxxer.hikari.pool.ProxyConnection.prepareStatement(ProxyConnection.java:325)
>>> ~[HikariCP-2.6.1.jar:?]
>>>         at
>>> com.zaxxer.hikari.pool.HikariProxyConnection.prepareStatement(HikariProxyConnection.java)
>>> ~[HikariCP-2.6.1.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:345)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getPreparedStatementForQuery(RDBMSQueryUtils.java:211)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:633)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
>>> ~[datanucleus-core-4.1.17.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:807)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>         ... 33 more
>>>     Caused by: org.apache.derby.iapi.error.StandardException: Table/View
>>> 'TBLS' does not exist.
>>>         at
>>> org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.FromBaseTable.bindTableDescriptor(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.FromBaseTable.bindNonVTITables(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.TableOperatorNode.bindNonVTITables(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at org.apache.derby.impl.sql.compile.FromList.bindTables(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.SelectNode.bindNonVTITables(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.DMLStatementNode.bindTables(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.DMLStatementNode.bind(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.compile.CursorNode.bindStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at org.apache.derby.impl.sql.GenericStatement.prepMinion(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at org.apache.derby.impl.sql.GenericStatement.prepare(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.sql.conn.GenericLanguageConnectionContext.prepareInternalStatement(Unknown
>>> Source) ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement.<init>(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement42.<init>(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.jdbc.Driver42.newEmbedPreparedStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
>>> ~[derby-10.14.1.0.jar:?]
>>>         at
>>> com.zaxxer.hikari.pool.ProxyConnection.prepareStatement(ProxyConnection.java:325)
>>> ~[HikariCP-2.6.1.jar:?]
>>>         at
>>> com.zaxxer.hikari.pool.HikariProxyConnection.prepareStatement(HikariProxyConnection.java)
>>> ~[HikariCP-2.6.1.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:345)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getPreparedStatementForQuery(RDBMSQueryUtils.java:211)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:633)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
>>> ~[datanucleus-core-4.1.17.jar:?]
>>>         at
>>> org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:807)
>>> ~[datanucleus-rdbms-4.1.19.jar:?]
>>>         at
>>> org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>         ... 33 more
>>>     [2021-02-19T13:41:01,218][WARN
>>> ][org.apache.hadoop.hive.metastore.ObjectStore] Falling back to ORM path
>>> due to direct SQL failure (this is not an error): See previous errors;
>>> Error executing SQL query "SELECT "TBLS"."TBL_NAME" FROM "TBLS"  INNER JOIN
>>> "DBS" ON "TBLS"."DB_ID" = "DBS"."DB_ID"  WHERE "DBS"."NAME" = ? AND
>>> "DBS"."CTLG_NAME" = ? AND "TBLS"."TBL_TYPE" = ?". at
>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:2015)
>>> at
>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getTables(MetaStoreDirectSql.java:420)
>>> at
>>> org.apache.hadoop.hive.metastore.ObjectStore$5.getSqlResult(ObjectStore.java:1640)
>>>
>>> On Fri, Feb 19, 2021 at 6:34 AM Stamatis Zampetakis <zabe...@gmail.com>
>>> wrote:
>>>
>>>> Hi James,
>>>>
>>>> I am doing something similar with the difference that everything runs
>>>> on docker [1].
>>>> I am using Hive 3.1 (HDP though) but things work fine at least with
>>>> in-memory derby.
>>>>
>>>>     <property>
>>>>         <name>javax.jdo.option.ConnectionURL</name>
>>>>         <value>jdbc:derby:memory:metastore;create=true</value>
>>>>     </property>
>>>>
>>>> Best,
>>>> Stamatis
>>>>
>>>> [1] https://github.com/zabetak/hs2-embedded
>>>>
>>>> On Wed, Feb 17, 2021 at 10:55 PM James Baiera <james.bai...@elastic.co>
>>>> wrote:
>>>>
>>>>> Hey folks,
>>>>>
>>>>> I have a project where I test with Hive using an embedded HiveServer2
>>>>> instance within a JVM running integration tests. This has worked for Hive
>>>>> 1.2.2 in the past, and I've been able to get it to work with Hive 2.3.8,
>>>>> but I have been having trouble getting it working on Hive 3.0+.
>>>>>
>>>>> The error I keep running into is that the metastore tables are not
>>>>> present in the local embedded metastore. I have set both
>>>>> "hive.metastore.schema.verification" to "false" and
>>>>> "datanucleus.schema.autoCreateAll" to "true", but it seems like the
>>>>> latter setting is being ignored. Instead of starting up, HiveServer2
>>>>> fails while trying to read from the DBS table:
>>>>>
>>>>> Self-test query [select "DB_ID" from "DBS"] failed; direct SQL is
>>>>> disabled
>>>>> javax.jdo.JDODataStoreException: Error executing SQL query "select
>>>>> "DB_ID" from "DBS"".
>>>>>      at
>>>>> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
>>>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>>>      at
>>>>> org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391)
>>>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>>>      at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
>>>>> ~[datanucleus-api-jdo-4.2.4.jar:?]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:276)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:184)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:498)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:420)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:375)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
>>>>> [hadoop-common-3.1.2.jar:?]
>>>>>      at
>>>>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
>>>>> [hadoop-common-3.1.2.jar:?]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:59)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:718)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:696)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:690)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:767)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:538)
>>>>> [hive-exec-3.1.2.jar:3.1.2]
>>>>>      <....>
>>>>>
>>>>> Looking into the documentation, it seems that many sources mention
>>>>> using the schematool to set up the metastore the first time, but this is
>>>>> an embedded use case, and there is no local Hive installation to use for
>>>>> this.
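
(For anyone following along with a full installation, the initialization step the docs describe is the schematool CLI; a typical invocation looks roughly like the following, where the HIVE_HOME path is an assumption. It requires a Hive distribution on disk, so it does not apply to the pure-embedded case here.)

```shell
$HIVE_HOME/bin/schematool -dbType derby -initSchema
```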
>>>>>
>>>>> I've also tried using the Hive JDBC driver with "jdbc:hive2:///" as the
>>>>> URL to run in embedded mode, and I am getting the same errors.
>>>>>
>>>>> Is this use case not supported anymore in Hive 3? Am I missing
>>>>> something here?
>>>>>
>>>>
