Hi John, the JDBC Thrift server resides in its own build profile and needs
to be enabled explicitly with ./sbt/sbt -Phive-thriftserver assembly.
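For example, a minimal sketch of the steps (run from the Spark source root;
whether you also need -Phive, and the beeline URL/port below, depend on your
setup and are only illustrative):

  # Rebuild the assembly with the Thrift server profile enabled
  ./sbt/sbt -Phive -Phive-thriftserver assembly

  # Then start the JDBC/ODBC server and connect to it, e.g. with beeline
  ./sbin/start-thriftserver.sh
  ./bin/beeline -u jdbc:hive2://localhost:10000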


On Tue, Aug 5, 2014 at 4:54 AM, John Omernik <j...@omernik.com> wrote:

> I am using spark-1.1.0-SNAPSHOT right now and trying to get familiar with
> the JDBC thrift server.  I have everything compiled correctly, I can access
> data in spark-shell on yarn from my hive installation. Cached tables, etc
> all work.
>
> When I execute ./sbin/start-thriftserver.sh
>
> I get the error below. Shouldn't it just read my spark-env? I guess I am
> lost on how to make this work.
>
> Thanks!
>
> $ ./start-thriftserver.sh
>
> Spark assembly has been built with Hive, including Datanucleus jars on
> classpath
>
> Exception in thread "main" java.lang.ClassNotFoundException:
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:270)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:311)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
