Are you referring to having Spark pick up a newly built jar? If so, you can
probably script that in bash.
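
For example, something along these lines could work. This is only a minimal
sketch assuming YARN cluster mode; APP_NAME, MAIN_CLASS, and JAR_PATH are
placeholders you would fill in for your own job, and the polling interval is
arbitrary:

#!/usr/bin/env bash
# Minimal sketch: kill and resubmit a Spark job whenever its jar is rebuilt.
# Assumptions: YARN cluster mode, GNU stat, and an application name that
# contains no whitespace. All names below are placeholders.

APP_NAME="my-spark-job"               # placeholder
MAIN_CLASS="com.example.Main"         # placeholder
JAR_PATH="/deploy/my-spark-job.jar"   # placeholder

last_mtime=""
while true; do
  mtime=$(stat -c %Y "$JAR_PATH" 2>/dev/null)
  if [[ -n "$mtime" && "$mtime" != "$last_mtime" ]]; then
    # Look up the running application by name and kill it.
    app_id=$(yarn application -list 2>/dev/null \
      | awk -v name="$APP_NAME" '$2 == name {print $1}')
    [[ -n "$app_id" ]] && yarn application -kill "$app_id"

    # Resubmit with the freshly built jar; return immediately instead of
    # waiting for the application to finish.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --name "$APP_NAME" \
      --class "$MAIN_CLASS" \
      --conf spark.yarn.submit.waitAppCompletion=false \
      "$JAR_PATH"

    last_mtime="$mtime"
  fi
  sleep 30   # polling interval; adjust to taste
done

On a standalone cluster you would use spark-submit --kill <submissionId>
instead of yarn application -kill. Either way, the key point is that Spark
itself will not reload the jar; the running job has to be stopped and
resubmitted.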

Thank You,

Irving Duran


On Wed, Nov 28, 2018 at 12:44 PM Mina Aslani <aslanim...@gmail.com> wrote:

> Hi,
>
> I have a question for you.
> Do we need to kill a Spark job every time we change it and deploy it to the
> cluster? Or is there a way for Spark to automatically pick up the most
> recent jar version?
>
> Best regards,
> Mina
>
