Thanks for the (super) quick replies.  My bad - I was looking under
spark/sql/*catalyst* instead of spark/sql/*hive*.
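
For anyone else who hits this, here is roughly the end-to-end flow I ended
up with, based on the snippet quoted below and the Spark 1.0 SQL guide. The
build command and the kv1.txt sample path reflect my setup, so treat them as
illustrative rather than canonical:

// Minimal sketch: assumes an assembly built with SPARK_HIVE=true
// (e.g. SPARK_HIVE=true sbt/sbt assembly) and an existing SparkContext sc.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
import hiveContext._

hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
// kv1.txt is the sample file under examples/; substitute your own data.
hql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")

// hql(...) returns a SchemaRDD; collect() pulls the rows back to the driver.
hql("SELECT key, value FROM src").collect().foreach(println)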


2014-06-11 17:40 GMT-07:00 Mark Hamstra <m...@clearstorydata.com>:

> And the code is right here:
> https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala
>
>
> On Wed, Jun 11, 2014 at 5:38 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> You will need to compile Spark with SPARK_HIVE=true.
>>
>>
>> On Wed, Jun 11, 2014 at 5:37 PM, Stephen Boesch <java...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>   The documentation of Catalyst describes using HiveContext; however,
>>> the Scala classes do not exist in master or the 1.0.0 branch.  What is
>>> the replacement/equivalent in master?
>>>
>>> Package does not exist:
>>> org.apache.spark.sql.hive
>>>
>>> Here is the code from the SQL on Spark meetup slides that references
>>> that package/classes:
>>>
>>> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
>>> import hiveContext._
>>>
>>> hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
>>>
>>>
>>>
>>>
>>
>
