Re: How to create SparkSession using SparkConf?

2017-04-28 Thread madhu phatak
SparkSession.builder.config() accepts a SparkConf as a parameter, so you can
pass your existing SparkConf through as-is.

https://spark.apache.org/docs/2.1.0/api/java/org/apache/spark/sql/SparkSession.Builder.html#config(org.apache.spark.SparkConf)
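A minimal sketch of that pattern (the app name and jar paths are illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val conf = new SparkConf()
  .setAppName("my-application")
  .setJars(Seq("a.jar", "b.jar"))  // setJars still applies when the conf is passed through

// The whole SparkConf is handed to the builder in one call
val spark = SparkSession.builder.config(conf).getOrCreate()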



-- 
Regards,
Madhukara Phatak
http://datamantra.io/


Re: How to create SparkSession using SparkConf?

2017-04-28 Thread Yanbo Liang
StreamingContext is an older API; if you want to process streaming data, you
can use SparkSession directly with Structured Streaming.
FYI:
http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
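A minimal Structured Streaming sketch against a SparkSession, along the lines of that guide (the socket source, host, and port are illustrative):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("structured-streaming-demo").getOrCreate()

// Read lines from a socket as an unbounded DataFrame
val lines = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()

// Print each micro-batch to the console
val query = lines.writeStream
  .outputMode("append")
  .format("console")
  .start()

query.awaitTermination()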

Thanks
Yanbo



Re: How to create SparkSession using SparkConf?

2017-04-27 Thread kant kodali
Actually, one more question along the same line, this time about .getOrCreate().

JavaStreamingContext doesn't seem to have a way to accept a SparkSession
object, so does that mean a streaming context is not required? If so, how do
I pass a lambda to .getOrCreate using SparkSession, like the lambda we
normally pass when we call StreamingContext.getOrCreate?
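For reference, the checkpoint-recovery pattern in question looks roughly like this in Scala (the checkpoint path, app name, and batch interval are illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Factory invoked only when no checkpoint exists
def createContext(): StreamingContext = {
  val conf = new SparkConf().setAppName("my-application")
  val ssc = new StreamingContext(conf, Seconds(10))
  ssc.checkpoint("/tmp/checkpoint")
  ssc
}

val ssc = StreamingContext.getOrCreate("/tmp/checkpoint", createContext _)

// A SparkSession can still be obtained alongside it, sharing the same SparkContext
val spark = SparkSession.builder.config(ssc.sparkContext.getConf).getOrCreate()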



Re: How to create SparkSession using SparkConf?

2017-04-27 Thread kant kodali
Ahhh, thanks much! I miss my sparkConf.setJars function; this comma-separated
string of jar names feels hacky.



Re: How to create SparkSession using SparkConf?

2017-04-27 Thread Yanbo Liang
Could you try the following way?

val spark = SparkSession.builder
  .appName("my-application")
  .config("spark.jars", "a.jar,b.jar")
  .getOrCreate()


Thanks

Yanbo




Re: How to create SparkSession using SparkConf?

2017-04-26 Thread kant kodali
I am using Spark 2.1 BTW.



How to create SparkSession using SparkConf?

2017-04-26 Thread kant kodali
Hi All,

I am wondering how to create a SparkSession from a SparkConf object. I can
see that most of the key-value pairs we set on SparkConf can also be set on
SparkSession or SparkSession.Builder, but I don't see an equivalent of
sparkConf.setJars, which is required, right? We want the driver jar to be
distributed across the cluster whether we run in client mode or cluster
mode, so I am wondering how this is possible.

Thanks!