Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Gavin Yue
This sqlContext is an instance of HiveContext; don't be confused by the
name.
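
For example (a minimal sketch; the table name is made up and this assumes
spark-shell was built with Hive support), Hive-backed operations can go
through the existing sqlContext directly, so there is no need to construct a
second HiveContext:

scala> // the shell's sqlContext is already a HiveContext, so HiveQL works on it
scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS demo_tbl (key INT, value STRING)")  // hypothetical table name
scala> sqlContext.sql("SHOW TABLES").show()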



> On Feb 16, 2016, at 12:51, Prabhu Joseph <prabhujose.ga...@gmail.com> wrote:
> 
> Hi All,
> 
> Creating a HiveContext in spark-shell fails with
> 
> Caused by: ERROR XSDB6: Another instance of Derby may have already booted the 
> database /SPARK/metastore_db.
> 
> Spark-shell has already created a metastore_db for the SQLContext:
> 
> Spark context available as sc.
> SQL context available as sqlContext.
> 
> But without a HiveContext, I am able to query the data using the SQLContext:
> 
> scala> var df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").option("inferSchema", "true").load("/SPARK/abc")
> df: org.apache.spark.sql.DataFrame = [Prabhu: string, Joseph: string]
> 
> So is there any real need for a HiveContext inside spark-shell? Is everything
> that can be done with a HiveContext achievable with the SQLContext inside
> spark-shell?
> 
> 
> 
> Thanks,
> Prabhu Joseph



Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Prabhu Joseph
Thanks Mark, that answers my question.



Re: Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Mark Hamstra
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
      /_/

Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_72)

Type in expressions to have them evaluated.

Type :help for more information.


scala> sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext]

res0: Boolean = true
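
If HiveContext-specific methods are ever needed, casting the shell's existing
context should work as well, rather than constructing a second one (a minimal
sketch):

scala> // reuse the context the shell already created instead of constructing a new one
scala> val hc = sqlContext.asInstanceOf[org.apache.spark.sql.hive.HiveContext]
scala> hc.sql("SHOW DATABASES").show()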





Creating HiveContext in Spark-Shell fails

2016-02-15 Thread Prabhu Joseph
Hi All,

Creating a HiveContext in spark-shell fails with

Caused by: ERROR XSDB6: Another instance of Derby may have already booted
the database /SPARK/metastore_db.
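
The HiveContext is created along these lines, for example:

scala> // a second context tries to boot another embedded Derby metastore
scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)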

Spark-shell has already created a metastore_db for the SQLContext:

Spark context available as sc.
SQL context available as sqlContext.

But without a HiveContext, I am able to query the data using the SQLContext:

scala> var df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").option("inferSchema", "true").load("/SPARK/abc")
df: org.apache.spark.sql.DataFrame = [Prabhu: string, Joseph: string]

So is there any real need for a HiveContext inside spark-shell? Is everything
that can be done with a HiveContext achievable with the SQLContext inside
spark-shell?



Thanks,
Prabhu Joseph