Hive pulls in a ton of dependencies that we were afraid would break
existing Spark applications. For this reason, all Hive submodules are
optional.
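
For example, to produce an assembly with both Hive support and the Thrift
server enabled (a sketch; the profile names below match the current 1.1
branch, and the exact invocation may differ in your checkout):

    ./sbt/sbt -Phive -Phive-thriftserver assembly

or, with Maven:

    mvn -Phive -Phive-thriftserver -DskipTests clean package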


On Tue, Aug 12, 2014 at 7:43 AM, John Omernik <j...@omernik.com> wrote:

> Yin helped me with that, and I appreciate the on-list follow-up.  A few
> questions: Why is this the case? Does building with the Thrift server add
> much more time/size to the final build? It seems that unless this is
> documented well, people will miss it and run into this situation, so why
> not just build the Thrift server in? (I am not a programming expert, and
> I'm not trying to judge the decision to have it in a separate profile; I
> would just like to understand why it's done that way.)
>
> On Mon, Aug 11, 2014 at 11:47 AM, Cheng Lian <lian.cs....@gmail.com>
> wrote:
>
>> Hi John, the JDBC Thrift server resides in its own build profile and needs
>> to be enabled explicitly with ./sbt/sbt -Phive-thriftserver assembly.
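>>
>> Once the build finishes, a quick sanity check (the jar path below is
>> illustrative; the actual name depends on your Scala and Spark versions):
>>
>>     jar tf assembly/target/scala-2.10/spark-assembly-*.jar | grep HiveThriftServer2
>>
>> If that prints nothing, the Thrift server classes did not make it into
>> the assembly.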
>>
>> On Tue, Aug 5, 2014 at 4:54 AM, John Omernik <j...@omernik.com> wrote:
>>
>>> I am using spark-1.1.0-SNAPSHOT right now and trying to get familiar
>>> with the JDBC Thrift server. I have everything compiled correctly, and I
>>> can access data in spark-shell on YARN from my Hive installation. Cached
>>> tables, etc. all work.
>>>
>>> When I execute ./sbin/start-thriftserver.sh
>>>
>>> I get the error below. Shouldn't it just read my spark-env? I guess I
>>> am lost on how to make this work.
>>>
>>> Thanks!
>>>
>>> $ ./start-thriftserver.sh
>>>
>>> Spark assembly has been built with Hive, including Datanucleus jars on
>>> classpath
>>>
>>> Exception in thread "main" java.lang.ClassNotFoundException:
>>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>         at java.lang.Class.forName0(Native Method)
>>>         at java.lang.Class.forName(Class.java:270)
>>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:311)
>>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>
>
