Sorry Reynold, I want to triple-check this with you. I'm looking at the 
`SparkSession.sqlContext` field in the latest 2.0 branch, and it appears that 
the val is set specifically to an instance of the `SQLContext` class. A cast 
to `HiveContext` will fail. Maybe there's a misunderstanding here. This is what 
I'm looking at:

https://github.com/apache/spark/blob/24ea875198ffcef4a4c3ba28aba128d6d7d9a395/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala#L122
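To make the type issue concrete, here is a minimal plain-Scala sketch (the class names are stand-ins, not Spark's actual classes): an object constructed as the base class can never be downcast to a subclass, which is why a `HiveContext` cast on this val would fail.

```scala
// Stand-in classes mirroring the SQLContext / HiveContext relationship.
class SQLCtx
class HiveCtx extends SQLCtx

object CastSketch extends App {
  // Analogous to SparkSession.sqlContext on the 2.0 branch: the val is
  // constructed as the base class itself, not as the subclass.
  val ctx: SQLCtx = new SQLCtx

  // Downcasting an object that was never a HiveCtx fails at runtime:
  val failed =
    try { ctx.asInstanceOf[HiveCtx]; false }
    catch { case _: ClassCastException => true }
  println(s"cast failed: $failed")  // prints: cast failed: true
}
```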

Michael


> On Jul 19, 2016, at 10:01 AM, Reynold Xin <r...@databricks.com> wrote:
> 
> Yes. But in order to access methods available only in HiveContext, a user 
> cast is required. 
> 
> On Tuesday, July 19, 2016, Maciej Bryński <mac...@brynski.pl> wrote:
> @Reynold Xin,
> How will this work with Hive support?
> Will SparkSession.sqlContext return a HiveContext?
> 
> 2016-07-19 0:26 GMT+02:00 Reynold Xin <r...@databricks.com>:
> > Good idea.
> >
> > https://github.com/apache/spark/pull/14252
> >
> >
> >
> > On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust <mich...@databricks.com>
> > wrote:
> >>
> >> + dev, reynold
> >>
> >> Yeah, that's a good point. I wonder if SparkSession.sqlContext should be
> >> public/deprecated?
> >>
> >> On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers <ko...@tresata.com> wrote:
> >>>
> >>> In my codebase I would like to gradually transition to SparkSession, so
> >>> while I start using SparkSession I also want a SQLContext to be available
> >>> as before (but with a deprecation warning when I use it). This should be
> >>> easy since SQLContext is now a wrapper around SparkSession.
> >>>
> >>> so basically:
> >>> val session = SparkSession.builder.set(..., ...).getOrCreate()
> >>> val sqlc = new SQLContext(session)
> >>>
> >>> However, this doesn't work: the SQLContext constructor I am trying to use
> >>> is private, and SparkSession.sqlContext is also private.
> >>>
> >>> Am I missing something?
> >>>
> >>> A non-gradual switch is not very realistic in any significant codebase,
> >>> and I do not want to create a SparkSession and a SQLContext independently
> >>> (both from the same SparkContext), since that can only lead to confusion
> >>> and inconsistent settings.
> >>
> >>
> >
> 
> 
> 
> --
> Maciek Bryński
