That seems the best option as of now, since the URL can be used to access the actual Spark UI, which should have the most information.
Bikas

________________________________
From: Vivek Suvarna <vikk...@gmail.com>
Sent: Friday, August 11, 2017 3:20:46 AM
To: user@livy.incubator.apache.org
Subject: Re: Propagating pyspark errors to Livy

I'm currently getting the tracking URL and giving it back to the calling program. Guess that's the best option then. Thanks

On 11 Aug 2017, at 2:55 PM, Saisai Shao <sai.sai.s...@gmail.com> wrote:

I think you should check the Spark application log to see the details; it is hard for Livy to get the actual error from Spark.

On Fri, Aug 11, 2017 at 12:03 PM, Vivek <viveksuva...@yahoo.co.in> wrote:

Hi,

Is there any way to propagate errors from PySpark back to the calling program via Livy? Currently the Livy logs only tell me that the batch job has failed. How do I get the actual error on the Spark side?

Regards,
Vivek
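Since the thread settles on handing the tracking URL back to the caller, here is a minimal sketch of pulling the application id and UI/log links out of a Livy batch status response. It assumes the JSON shape returned by Livy's `GET /batches/{batchId}` endpoint, where `appId` and an `appInfo` map (with `driverLogUrl` and `sparkUiUrl`) are populated once YARN has accepted the application; the sample response below is hypothetical.

```python
import json


def extract_app_links(batch_json):
    """Pull the YARN app id and UI/log URLs from a Livy batch response.

    Assumes the GET /batches/{batchId} JSON shape with "appId" and an
    "appInfo" map; fields may be null before the app is running.
    """
    info = batch_json.get("appInfo") or {}
    return {
        "appId": batch_json.get("appId"),
        "sparkUiUrl": info.get("sparkUiUrl"),
        "driverLogUrl": info.get("driverLogUrl"),
    }


# Hypothetical sample of a failed batch's status response, for illustration.
sample = json.loads("""{
  "id": 7,
  "state": "dead",
  "appId": "application_1502400000000_0042",
  "appInfo": {
    "driverLogUrl": "http://node1:8042/node/containerlogs/container_0042_01/livy",
    "sparkUiUrl": "http://rm:8088/proxy/application_1502400000000_0042/"
  },
  "log": ["..."]
}""")

links = extract_app_links(sample)
print(links["appId"])
print(links["sparkUiUrl"])
```

The calling program can then follow `sparkUiUrl` (or the YARN history server after the app finishes) to see the actual Spark-side error, since Livy itself only reports the batch state.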