Hi,

What is the correct way to fully stop a Spark job that was launched with
spark-submit in yarn-client mode?

We call sc.stop() in the code, but the job still shows as running in the
YARN ResourceManager after the final Hive insert completes.

The code flow is:

start context
do some work
insert into Hive
sc.stop()
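
For reference, a common pattern is to call sc.stop() in a finally block so the context is torn down even if an earlier step throws. Here is a minimal sketch of that flow; the SparkContext is stubbed out so the snippet runs standalone (in the real job it would be pyspark.SparkContext):

```python
# Sketch of the shutdown flow; SparkContext is stubbed so this snippet
# runs without a Spark installation. In the real job, use
# pyspark.SparkContext instead of the stub.

class StubSparkContext:
    """Stand-in for pyspark.SparkContext, tracking only stop()."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


def run_job(sc):
    try:
        # do some work ...
        # insert results into Hive ...
        pass
    finally:
        # Always stop the context, even if a step above raised.
        sc.stop()


sc = StubSparkContext()
run_job(sc)
print(sc.stopped)  # True
```

This only guards against sc.stop() being skipped on failure. If the driver JVM is kept alive by non-daemon threads after the stop (which, for example, H2O threads in a Sparkling Water job can do), that is a separate issue from the context itself.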

This is a Sparkling Water job, if that matters.

Is there anything else needed?

Thanks,

J
