Livy's programmatic API (the Java/Scala client) uses interactive sessions, not
batch sessions. When you submit a batch job to Livy, it spins up a Spark
application, runs your job (jar), and then shuts down. In an interactive
session (like the client), a Spark application is spun up and you then submit
jobs to it until you shut it down yourself. In your example you would call
client.stop() once you've finished using it (this will shut down the Spark
context and cancel any unfinished jobs). You can find a short example of how
to use the client at
https://livy.incubator.apache.org/docs/latest/programmatic-api.html
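
For reference, here is a minimal sketch of that flow with the Java client. The
Livy URL, jar path, and SumJob class are just placeholders for your own values,
and note that LivyClient.stop takes a boolean indicating whether to also shut
down the remote Spark context:

    import java.io.File;
    import java.net.URI;
    import java.util.Arrays;

    import org.apache.livy.Job;
    import org.apache.livy.JobContext;
    import org.apache.livy.LivyClient;
    import org.apache.livy.LivyClientBuilder;

    public class LivyClientExample {

        // Trivial Job used for illustration; JobContext exposes the session's
        // SparkContext, so the work runs inside the interactive session.
        public static class SumJob implements Job<Integer> {
            @Override
            public Integer call(JobContext jc) throws Exception {
                return jc.sc().parallelize(Arrays.asList(1, 2, 3)).reduce(Integer::sum);
            }
        }

        public static void main(String[] args) throws Exception {
            // Placeholder Livy URL; point this at your Livy server.
            LivyClient client = new LivyClientBuilder()
                    .setURI(new URI("http://livy-host:8998"))
                    .build();
            try {
                // Upload the jar that contains your Job classes to the session.
                client.uploadJar(new File("/path/to/your-jobs.jar")).get();

                // Submit the job and block until the result comes back.
                Integer result = client.run(new SumJob()).get();
                System.out.println("Sum = " + result);
            } finally {
                // Passing true also shuts down the remote Spark context, so the
                // application finishes in the Spark UI instead of lingering as
                // an idle session.
                client.stop(true);
            }
        }
    }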
                                                                                
   
Alex Bozarth
Software Engineer
Spark Technology Center

E-mail: ajboz...@us.ibm.com
GitHub: github.com/ajbozarth

505 Howard Street
San Francisco, CA 94105
United States

From:   Stefan Miklosovic <mikloso...@gmail.com>
To:     user@livy.incubator.apache.org
Date:   10/29/2017 03:19 AM
Subject:        After a job submitted programmatically completes successfully, it is still running in the Spark UI and marked as idle in the Livy UI



The title says it all: I upload a JAR and run a job via client.run(Job<T>
job).get(), and I do get a result - everything is computed correctly.
However, the application is not marked as "completed" in the Spark UI; it
hangs there indefinitely and I have to kill it myself.

What should I do if I want to mark a successfully run application as
completed, so that it is no longer running or idle?

Thanks!

--
Stefan Miklosovic


