Hello,

I would like to understand the following.

1. What happens if the Livy server dies? What happens to existing jobs?
Will they still be running on the Spark cluster? If so, how can I track
their status?
2. Is there a High Availability deployment mode for the Apache Livy
server, such as a secondary/standby Livy server?
3. Can we submit more than one job using the batches API? If yes, is
there an upper limit on the number of jobs? How can more than one job be
submitted in a single batches API call?
4. If multiple job submissions in a single batches API call are allowed:
    1. Do those jobs run in parallel or sequentially?
    2. Can I define dependencies between these jobs?
5. How can I debug a job that I've submitted using the batches API?
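For context, this is roughly how I submit a single job today. A minimal sketch: the Livy host, jar path, main class, and arguments below are placeholders, and the actual HTTP call is shown only as a comment.

```python
import json

# Hypothetical Livy endpoint (placeholder host and port).
LIVY_URL = "http://livy-server:8998"


def build_batch_payload(file, class_name, args=None):
    """Build the JSON body for a single POST /batches submission."""
    payload = {"file": file, "className": class_name}
    if args:
        payload["args"] = args
    return payload


payload = build_batch_payload(
    "hdfs:///jobs/my-job.jar",   # application jar (placeholder path)
    "com.example.MyJob",         # main class (placeholder)
    ["--date", "2020-01-01"],    # job arguments (placeholder)
)

# The actual submission would look something like:
#   requests.post(f"{LIVY_URL}/batches",
#                 data=json.dumps(payload),
#                 headers={"Content-Type": "application/json"})
print(json.dumps(payload))
```

My question 3 is essentially whether one such POST /batches call can carry more than one job, or whether each job always needs its own call.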


Thanks,
Ravindra Chandrakar
