sbt assembly with hive

2014-12-12 Thread Stephen Boesch
What is the proper way to build with Hive from sbt?  The SPARK_HIVE
environment variable is deprecated. However, after running the following:

   sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly

And then
  bin/pyspark

   hivectx = HiveContext(sc)

   hivectx.hiveql("select * from my_table")

Exception: (You must build Spark with Hive. Export 'SPARK_HIVE=true' and
run sbt/sbt assembly, Py4JError(u'Trying to call a package.',))
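For reference, the end-to-end sequence the message describes can be sketched as
follows. This is only a sketch, assuming a Spark 1.x source checkout; the
profile flags are the ones quoted above, and the `sbt/sbt` launcher path is the
one the exception message itself refers to.

```shell
# From the Spark source root: build the assembly jar with Hive support.
# Spark's sbt launcher forwards the Maven-style -P profile flags to the build.
sbt/sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly

# Start the PySpark shell against the freshly built assembly; the shell
# creates the SparkContext (sc) that HiveContext(sc) wraps.
bin/pyspark
```

Note that inside the shell the HiveQL statement must be passed as a Python
string, e.g. hivectx.hiveql("select * from my_table"); an unquoted statement is
a Python syntax error before Spark is ever reached.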


Re: sbt assembly with hive

2014-12-12 Thread Abhi Basu
I am getting the same message when trying to create a HiveContext in CDH 5.1
after enabling Spark. I think Spark should ship with Hive enabled by default,
since the Hive metastore is a common way to share data, given the popularity
of Hive and other SQL-over-Hadoop technologies like Impala.

Thanks,

Abhi

On Fri, Dec 12, 2014 at 6:40 PM, Stephen Boesch java...@gmail.com wrote:


 What is the proper way to build with Hive from sbt?  The SPARK_HIVE
 environment variable is deprecated. However, after running the following:

    sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly

 And then
   bin/pyspark

hivectx = HiveContext(sc)

hivectx.hiveql("select * from my_table")

 Exception: (You must build Spark with Hive. Export 'SPARK_HIVE=true' and
 run sbt/sbt assembly, Py4JError(u'Trying to call a package.',))



-- 
Abhi Basu