Re: Do we need to kill a spark job every time we change and deploy it?

2018-11-30 Thread 965
I think that if your job is running and you want to deploy a new jar that is a newer version of it, Spark will treat the new jar as a different job, since jobs are distinguished by Job ID. So if you want to replace the jar, you have to kill the running job every time.

Re: Do we need to kill a spark job every time we change and deploy it?

2018-11-28 Thread Irving Duran
Are you referring to having Spark pick up a new jar build? If so, you can probably script that in bash. Thank You, Irving Duran On Wed, Nov 28, 2018 at 12:44 PM Mina Aslani wrote: > Hi, > > I have a question for you. > Do we need to kill a spark job every time we change and deploy it to >
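[A minimal sketch of the bash-scripting idea suggested above, assuming a YARN cluster manager and cluster deploy mode. The app name, main class, and jar path are placeholders, not names from this thread; adapt them to your own job. The script kills the currently running application and resubmits it with the freshly built jar.]

  #!/usr/bin/env bash
  # Placeholder values -- replace with your own job's settings.
  APP_NAME="my-streaming-job"
  MAIN_CLASS="com.example.Main"
  NEW_JAR="/deploy/my-streaming-job-assembly.jar"

  # Find the running application by name and kill it, if one exists.
  APP_ID=$(yarn application -list -appStates RUNNING 2>/dev/null \
    | awk -v name="$APP_NAME" '$2 == name {print $1}')
  if [ -n "$APP_ID" ]; then
    yarn application -kill "$APP_ID"
  fi

  # Resubmit the job with the new jar build.
  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --name "$APP_NAME" \
    --class "$MAIN_CLASS" \
    "$NEW_JAR"

[This only automates the kill-and-resubmit cycle; Spark itself still starts a new application (with a new application ID) rather than hot-swapping the jar into the running one.]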

Do we need to kill a spark job every time we change and deploy it?

2018-11-28 Thread Mina Aslani
Hi, I have a question for you. Do we need to kill a Spark job every time we change it and deploy it to the cluster? Or is there a way for Spark to automatically pick up the most recent jar version? Best regards, Mina