Re: transition SQLContext to SparkSession

2016-07-19 Thread Reynold Xin
Yes. But in order to access methods that are available only in HiveContext,
a user cast is required.
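
A minimal sketch of that cast (app name is hypothetical, and it assumes SparkSession.sqlContext is public as discussed later in the thread; whether the returned context is actually a HiveContext at runtime depends on the build, so the cast is guarded):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.HiveContext

// Sketch only: build a session with Hive support, then try the cast described
// above. The match guards against sqlContext being a plain SQLContext.
val spark = SparkSession.builder()
  .appName("hive-cast-sketch")   // hypothetical app name
  .enableHiveSupport()
  .getOrCreate()

spark.sqlContext match {
  case hc: HiveContext =>
    // HiveContext-only methods become reachable through hc after the cast.
    println(s"Got a HiveContext: ${hc.getClass.getName}")
  case other =>
    println(s"Plain SQLContext only: ${other.getClass.getName}")
}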

On Tuesday, July 19, 2016, Maciej Bryński wrote:

> @Reynold Xin,
> How will this work with Hive support?
> Will SparkSession.sqlContext return a HiveContext?
>
> 2016-07-19 0:26 GMT+02:00 Reynold Xin:
> > Good idea.
> >
> > https://github.com/apache/spark/pull/14252
> >
> >
> >
> > On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust <mich...@databricks.com> wrote:
> >>
> >> + dev, reynold
> >>
> >> Yeah, that's a good point. I wonder if SparkSession.sqlContext should be
> >> public/deprecated?
> >>
> >> On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers wrote:
> >>>
> >>> In my codebase I would like to gradually transition to SparkSession, so
> >>> while I start using SparkSession I also want a SQLContext to be available
> >>> as before (but with a deprecation warning when I use it). This should be
> >>> easy since SQLContext is now a wrapper for SparkSession.
> >>>
> >>> So basically:
> >>> val session = SparkSession.builder.set(..., ...).getOrCreate()
> >>> val sqlc = new SQLContext(session)
> >>>
> >>> However, this doesn't work: the SQLContext constructor I am trying to use
> >>> is private, and SparkSession.sqlContext is also private.
> >>>
> >>> Am I missing something?
> >>>
> >>> A non-gradual switch is not very realistic in any significant codebase,
> >>> and I do not want to create SparkSession and SQLContext independently
> >>> (both from the same SparkContext) since that can only lead to confusion
> >>> and inconsistent settings.
> >>
> >>
> >
>
>
>
> --
> Maciek Bryński
>


Re: transition SQLContext to SparkSession

2016-07-19 Thread Maciej Bryński
@Reynold Xin,
How will this work with Hive support?
Will SparkSession.sqlContext return a HiveContext?

2016-07-19 0:26 GMT+02:00 Reynold Xin:
> Good idea.
>
> https://github.com/apache/spark/pull/14252
>
>
>
> On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust wrote:
>>
>> + dev, reynold
>>
>> Yeah, that's a good point. I wonder if SparkSession.sqlContext should be
>> public/deprecated?
>>
>> On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers wrote:
>>>
>>> In my codebase I would like to gradually transition to SparkSession, so
>>> while I start using SparkSession I also want a SQLContext to be available
>>> as before (but with a deprecation warning when I use it). This should be
>>> easy since SQLContext is now a wrapper for SparkSession.
>>>
>>> So basically:
>>> val session = SparkSession.builder.set(..., ...).getOrCreate()
>>> val sqlc = new SQLContext(session)
>>>
>>> However, this doesn't work: the SQLContext constructor I am trying to use
>>> is private, and SparkSession.sqlContext is also private.
>>>
>>> Am I missing something?
>>>
>>> A non-gradual switch is not very realistic in any significant codebase,
>>> and I do not want to create SparkSession and SQLContext independently
>>> (both from the same SparkContext) since that can only lead to confusion
>>> and inconsistent settings.
>>
>>
>



-- 
Maciek Bryński




Re: transition SQLContext to SparkSession

2016-07-18 Thread Reynold Xin
Good idea.

https://github.com/apache/spark/pull/14252
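
For reference, a sketch of the gradual migration this would enable, assuming the change makes SparkSession.sqlContext public (the app name and queries are made up for illustration):

import org.apache.spark.sql.{SQLContext, SparkSession}

// Sketch only: new code holds the SparkSession, legacy code keeps receiving a
// SQLContext that wraps the very same session (shared conf and catalog).
val spark = SparkSession.builder().appName("migration-sketch").getOrCreate()
val sqlc: SQLContext = spark.sqlContext

// Old API path still compiles unchanged...
def legacyCount(ctx: SQLContext): Long = ctx.sql("SELECT 1 AS ok").count()
// ...while new code moves to the session directly.
def newCount(session: SparkSession): Long = session.sql("SELECT 1 AS ok").count()

println(legacyCount(sqlc) == newCount(spark))  // same underlying engine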



On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust wrote:

> + dev, reynold
>
> Yeah, that's a good point. I wonder if SparkSession.sqlContext should be
> public/deprecated?
>
> On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers wrote:
>
>> In my codebase I would like to gradually transition to SparkSession, so
>> while I start using SparkSession I also want a SQLContext to be available
>> as before (but with a deprecation warning when I use it). This should be
>> easy since SQLContext is now a wrapper for SparkSession.
>>
>> So basically:
>> val session = SparkSession.builder.set(..., ...).getOrCreate()
>> val sqlc = new SQLContext(session)
>>
>> However, this doesn't work: the SQLContext constructor I am trying to use
>> is private, and SparkSession.sqlContext is also private.
>>
>> Am I missing something?
>>
>> A non-gradual switch is not very realistic in any significant codebase,
>> and I do not want to create SparkSession and SQLContext independently
>> (both from the same SparkContext) since that can only lead to confusion
>> and inconsistent settings.
>>
>
>


Re: transition SQLContext to SparkSession

2016-07-18 Thread Benjamin Kim
From what I read, there are no more separate contexts.

"SparkContext, SQLContext, HiveContext merged into SparkSession"

I have not tested it, so I don’t know if it’s true.

Cheers,
Ben
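
One way to sanity-check that claim against a 2.0 build (names are illustrative): the session becomes the single entry point, yet the old SparkContext stays reachable, and Hive support is enabled on the builder instead of constructing a HiveContext by hand.

import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession

// Sketch only: one entry point instead of separate contexts.
val spark = SparkSession.builder()
  .appName("entry-point-check")    // illustrative name
  .enableHiveSupport()             // replaces creating a HiveContext by hand
  .getOrCreate()

val sc: SparkContext = spark.sparkContext   // the old SparkContext is still available
spark.sql("SHOW DATABASES").show()          // SQL without a separate SQLContext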


> On Jul 18, 2016, at 8:37 AM, Koert Kuipers wrote:
>
> In my codebase I would like to gradually transition to SparkSession, so
> while I start using SparkSession I also want a SQLContext to be available
> as before (but with a deprecation warning when I use it). This should be
> easy since SQLContext is now a wrapper for SparkSession.
>
> So basically:
> val session = SparkSession.builder.set(..., ...).getOrCreate()
> val sqlc = new SQLContext(session)
>
> However, this doesn't work: the SQLContext constructor I am trying to use
> is private, and SparkSession.sqlContext is also private.
>
> Am I missing something?
>
> A non-gradual switch is not very realistic in any significant codebase,
> and I do not want to create SparkSession and SQLContext independently
> (both from the same SparkContext) since that can only lead to confusion
> and inconsistent settings.



Re: transition SQLContext to SparkSession

2016-07-18 Thread Michael Armbrust
+ dev, reynold

Yeah, that's a good point. I wonder if SparkSession.sqlContext should be
public/deprecated?
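
Purely as a hypothetical sketch (not the actual Spark source), "public/deprecated" could look roughly like this: a public accessor that keeps legacy callers compiling while emitting a compile-time nudge toward the new API.

import org.apache.spark.sql.SQLContext

// Hypothetical shape only, for discussion: a public accessor carrying a
// deprecation notice so existing SQLContext-based code keeps working.
trait LegacySqlContextAccess {
  @deprecated("Use the SparkSession API directly", "2.0.0")
  def sqlContext: SQLContext
}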

On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers wrote:

> In my codebase I would like to gradually transition to SparkSession, so
> while I start using SparkSession I also want a SQLContext to be available
> as before (but with a deprecation warning when I use it). This should be
> easy since SQLContext is now a wrapper for SparkSession.
>
> So basically:
> val session = SparkSession.builder.set(..., ...).getOrCreate()
> val sqlc = new SQLContext(session)
>
> However, this doesn't work: the SQLContext constructor I am trying to use
> is private, and SparkSession.sqlContext is also private.
>
> Am I missing something?
>
> A non-gradual switch is not very realistic in any significant codebase,
> and I do not want to create SparkSession and SQLContext independently
> (both from the same SparkContext) since that can only lead to confusion
> and inconsistent settings.
>


transition SQLContext to SparkSession

2016-07-18 Thread Koert Kuipers
In my codebase I would like to gradually transition to SparkSession, so
while I start using SparkSession I also want a SQLContext to be available
as before (but with a deprecation warning when I use it). This should be
easy since SQLContext is now a wrapper for SparkSession.

So basically:
val session = SparkSession.builder.set(..., ...).getOrCreate()
val sqlc = new SQLContext(session)

However, this doesn't work: the SQLContext constructor I am trying to use
is private, and SparkSession.sqlContext is also private.

Am I missing something?

A non-gradual switch is not very realistic in any significant codebase,
and I do not want to create SparkSession and SQLContext independently
(both from the same SparkContext) since that can only lead to confusion
and inconsistent settings.
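
A possible interim sketch while sqlContext stays private, under the assumption that the deprecated SQLContext(SparkContext) constructor is still public in 2.0 and resolves to the same underlying session as the builder; worth verifying against the actual release before relying on it:

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}

// Assumption: new SQLContext(sc) is deprecated but still public in 2.0 and
// shares state with the session created below; verify before relying on this.
val session = SparkSession.builder().appName("gradual-migration").getOrCreate()
val sc: SparkContext = session.sparkContext
val sqlc: SQLContext = new SQLContext(sc)   // compiles with a deprecation warning

// Legacy code keeps taking a SQLContext, new code takes the SparkSession.
sqlc.sql("SELECT 1 AS legacy_path").show()
session.sql("SELECT 1 AS new_path").show()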