Oh, thanks for reporting this. This looks like a bug: since SPARK_HIVE was
deprecated, we shouldn't rely on it any more.
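In the meantime, a minimal workaround sketch, assuming the current make-distribution.sh still guards the datanucleus copy with $SPARK_HIVE as quoted below:

```shell
# Hypothetical workaround: export SPARK_HIVE before invoking
# make-distribution.sh so the guarded copy step actually runs.
export SPARK_HIVE=true

# This mirrors the guard inside make-distribution.sh; with the variable
# exported, the branch is taken and the jars would be copied.
if [ "$SPARK_HIVE" == "true" ]; then
  echo "copying datanucleus jars to DISTDIR/lib/"
fi
```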


On Wed, Aug 13, 2014 at 1:23 PM, ZHENG, Xu-dong <dong...@gmail.com> wrote:

> Just found that this is because the lines below in make-distribution.sh don't take effect:
>
> if [ "$SPARK_HIVE" == "true" ]; then
>   cp "$FWDIR"/lib_managed/jars/datanucleus*.jar "$DISTDIR/lib/"
> fi
>
> $SPARK_HIVE is never defined in make-distribution.sh, so I have to set it
> explicitly.
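A quick way to confirm whether a given build picked up the jars; this is a sketch, assuming `dist` is the distribution output directory:

```shell
# Hypothetical check: after running make-distribution.sh, confirm the
# datanucleus jars actually landed in the distribution's lib/ directory.
DISTDIR=${DISTDIR:-dist}
if ls "$DISTDIR"/lib/datanucleus*.jar >/dev/null 2>&1; then
  echo "datanucleus jars present"
else
  echo "datanucleus jars missing - Hive metastore startup will fail with ClassNotFoundException"
fi
```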
>
>
>
> On Wed, Aug 13, 2014 at 1:10 PM, ZHENG, Xu-dong <dong...@gmail.com> wrote:
>
>> Hi Cheng,
>>
>> I also hit some issues when I try to start the ThriftServer from a build
>> of the master branch (I could successfully run it from the branch-1.0-jdbc
>> branch). Below is my build command:
>>
>> ./make-distribution.sh --skip-java-test -Phadoop-2.4 -Phive -Pyarn
>> -Dyarn.version=2.4.0 -Dhadoop.version=2.4.0 -Phive-thriftserver
>>
>> And below are the printed errors:
>>
>> ERROR CompositeService: Error starting services HiveServer2
>> org.apache.hive.service.ServiceException: Unable to connect to MetaStore!
>>         at
>> org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
>>         at
>> org.apache.hive.service.CompositeService.start(CompositeService.java:70)
>>         at
>> org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
>>         at
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:71)
>>         at
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:314)
>>          at
>> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: javax.jdo.JDOFatalUserException: Class
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
>> NestedThrowables:
>> java.lang.ClassNotFoundException:
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory
>>         at
>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:104)
>>         at
>> org.apache.hive.service.cli.CLIService.start(CLIService.java:82)
>>         ... 11 more
>> Caused by: java.lang.ClassNotFoundException:
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:270)
>>         at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
>>         at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
>>         at
>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
>>         ... 32 more
>> 14/08/13 13:08:48 INFO AbstractService: Service:OperationManager is
>> stopped.
>> 14/08/13 13:08:48 INFO AbstractService: Service:SessionManager is stopped.
>> 14/08/13 13:08:48 INFO AbstractService: Service:CLIService is stopped.
>> 14/08/13 13:08:48 ERROR HiveThriftServer2: Error starting
>> HiveThriftServer2
>> org.apache.hive.service.ServiceException: Failed to Start HiveServer2
>>         at
>> org.apache.hive.service.CompositeService.start(CompositeService.java:80)
>>         at
>> org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
>>         at
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:71)
>>         at
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:314)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: org.apache.hive.service.ServiceException: Unable to connect to
>> MetaStore!
>>         at
>> org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
>>         at
>> org.apache.hive.service.CompositeService.start(CompositeService.java:70)
>>         ... 10 more
>> Caused by: javax.jdo.JDOFatalUserException: Class
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
>> NestedThrowables:
>> java.lang.ClassNotFoundException:
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory
>>         at
>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:104)
>>         at
>> org.apache.hive.service.cli.CLIService.start(CLIService.java:82)
>>         ... 11 more
>> Caused by: java.lang.ClassNotFoundException:
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:270)
>>         at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
>>         at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
>>         at
>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
>>         ... 32 more
>>
>>
>> Any suggestions?
>>
>>
>> On Tue, Aug 12, 2014 at 12:47 AM, Cheng Lian <lian.cs....@gmail.com>
>> wrote:
>>
>>> Hi John, the JDBC Thrift server resides in its own build profile and
>>> needs to be enabled explicitly with ./sbt/sbt -Phive-thriftserver assembly.
>>>
>>>
>>>
>>> On Tue, Aug 5, 2014 at 4:54 AM, John Omernik <j...@omernik.com> wrote:
>>>
>>>> I am using spark-1.1.0-SNAPSHOT right now and trying to get familiar
>>>> with the JDBC thrift server.  I have everything compiled correctly, I can
>>>> access data in spark-shell on yarn from my hive installation. Cached
>>>> tables, etc all work.
>>>>
>>>> When I execute ./sbin/start-thriftserver.sh
>>>>
>>>> I get the error below. Shouldn't it just read my spark-env? I guess I
>>>> am lost on how to make this work.
>>>>
>>>> Thanks!
>>>>
>>>> $ ./start-thriftserver.sh
>>>>
>>>>
>>>> Spark assembly has been built with Hive, including Datanucleus jars on
>>>> classpath
>>>>
>>>> Exception in thread "main" java.lang.ClassNotFoundException:
>>>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>>>>
>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>>
>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>
>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>
>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>
>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>
>>>> at java.lang.Class.forName0(Native Method)
>>>>
>>>> at java.lang.Class.forName(Class.java:270)
>>>>
>>>> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:311)
>>>>
>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>>>>
>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>
>>>
>>>
>>
>>
>> --
>> 郑旭东
>> ZHENG, Xu-dong
>>
>>
>
>
> --
> 郑旭东
> ZHENG, Xu-dong
>
>
