The title says it all: I upload a JAR and run a job via client.run(Job<T> job).get(), and I do get a result back - everything is computed correctly. However, the application is never marked as "completed" in the Spark UI; it stays there indefinitely and I have to kill it myself.
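For reference, here is a minimal sketch of the flow I mean. I am assuming the Apache Livy Java client here, since its LivyClient.run(Job<T>) signature matches my call; the endpoint URL, job class, and JAR path are placeholders, not my real setup:

    import java.io.File;
    import java.net.URI;
    import java.util.Arrays;

    import org.apache.livy.Job;
    import org.apache.livy.JobContext;
    import org.apache.livy.LivyClient;
    import org.apache.livy.LivyClientBuilder;

    public class SumJobExample {

        // Placeholder job: sums a small range on the cluster and returns the total.
        static class SumJob implements Job<Long> {
            @Override
            public Long call(JobContext ctx) throws Exception {
                return ctx.sc()
                          .parallelize(Arrays.asList(1, 2, 3, 4, 5), 2)
                          .fold(0, (a, b) -> a + b)
                          .longValue();
            }
        }

        public static void main(String[] args) throws Exception {
            LivyClient client = new LivyClientBuilder()
                    .setURI(new URI("http://livy-host:8998"))   // placeholder endpoint
                    .build();

            // Upload the JAR containing the job class, then run it and block for the result.
            client.uploadJar(new File("/path/to/job.jar")).get();
            Long result = client.run(new SumJob()).get();
            System.out.println("result = " + result);

            // The result comes back fine; after this point the application is still
            // listed as RUNNING in the Spark UI until I kill it manually.
        }
    }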

What should I do to have a successfully finished application marked as completed, so that it is no longer shown as running or idle?

Thanks!

-- 
Stefan Miklosovic
