Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-21 Thread kant kodali
I have the following, but I am unable to successfully submit a job through
Livy in cluster mode.

Here are my settings:

# spark-defaults.conf

spark.master yarn

# livy.conf

livy.spark.master = yarn
livy.spark.deploy-mode = cluster
livy.server.recovery.mode = recovery
livy.server.recovery.state-store = zookeeper
livy.server.recovery.state-store.url = localhost:2181

Anything wrong with this conf?
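In case it helps, this is roughly how I am submitting the job, through the
Livy REST batches endpoint (a sketch only: the jar path and class name are
placeholders, and I am assuming the default Livy port 8998):

curl -s -X POST http://localhost:8998/batches \
  -H 'Content-Type: application/json' \
  -d '{
        "file": "hdfs:///jobs/my-streaming-job.jar",
        "className": "com.example.MyStreamingJob"
      }'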


Thanks!



Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-20 Thread kant kodali
Got it! Is it livy.spark.deploy-mode=yarn-cluster or livy.spark.deploy-mode
= cluster? Sorry to ask this question; I couldn't find it in the docs or in
the comments in livy.conf, and I am using Livy 0.4.0.



Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-20 Thread Meisam Fathi
If you are running in cluster mode, the application should keep running on
YARN.
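For completeness, the combination of settings I have in mind is roughly the
following livy.conf (a sketch; the ZooKeeper quorum address is a placeholder
for your own):

# livy.conf -- cluster-mode sessions with recovery (sketch)
livy.spark.master = yarn
livy.spark.deploy-mode = cluster
livy.server.recovery.mode = recovery
livy.server.recovery.state-store = zookeeper
livy.server.recovery.state-store.url = zk1:2181,zk2:2181,zk3:2181

With deploy-mode set to cluster the driver runs inside YARN rather than as a
child process of the Livy server, which is what lets the application outlive
a Livy crash.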



Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-20 Thread kant kodali
@Meisam Fathi I am running with YARN and ZooKeeper as a state store. I
spawned a job via Livy that reads from Kafka and writes to Kafka, but the
moment I kill the Livy server the job is also getting killed, and I am not
sure why. I believe once the Livy server crashes the Spark context also gets
killed, so do I need to set livy.spark.deploy-mode? If so, what value should
I set it to?




Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-12 Thread kant kodali
Can someone please explain how YARN helps here? And why not the Spark master?



Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-12 Thread Matteo Durighetto
Hello,
to enable recovery in Livy you need Spark on YARN
(https://spark.apache.org/docs/latest/running-on-yarn.html).
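One quick way to see this, assuming a working YARN setup: submit a job
through Livy, kill the Livy server, and check that the application is still
running on YARN:

yarn application -list -appStates RUNNING

In cluster mode the driver is not a child of the Livy process, so the
application keeps running, and after a restart Livy's recovery can reconnect
to it.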



Kind Regards


Re: What happens if Livy server crashes ? All the spark jobs are gone?

2018-03-12 Thread kant kodali
Sorry, I see there is a recovery mode and that I can also set the state
store to zookeeper, but it looks like I need YARN, because I get the error
message below:

"requirement failed: Session recovery requires YARN"

I am using Spark standalone and I don't use YARN anywhere in my cluster. Is
there any other option for recovery in this case?


On Sun, Mar 11, 2018 at 11:57 AM, kant kodali  wrote:

> Hi All,
>
> When my Livy server crashes it looks like all my Spark jobs are gone. I am
> trying to see how I can make it more resilient. In other words, I would
> like the Spark jobs that were spawned by Livy to keep running even if my
> Livy server crashes, because in theory the Livy server can crash at any
> time, while my Spark jobs should run for weeks or months. How can I
> achieve this?
>
> Thanks!