Hi,

Is there any way to propagate errors from PySpark back to the calling program 
via Livy?
Currently the Livy log only tells me that the batch job has failed. How do I 
get the actual error from the Spark side?
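The closest I have found so far is pulling the driver log back over Livy's REST API (GET /batches/{batchId}/log) and scanning it for the traceback. A rough sketch, assuming the usual JSON response shape with a "log" array of lines (the sample response and keyword list below are just illustrative):

```python
def extract_error_lines(log_response, keywords=("Traceback", "Error", "Exception")):
    """Pick out likely error lines from a Livy GET /batches/{id}/log response.

    `log_response` is the parsed JSON body, assumed to look like
    {"id": 3, "from": 0, "total": N, "log": ["line1", "line2", ...]}.
    """
    return [line for line in log_response.get("log", [])
            if any(keyword in line for keyword in keywords)]

# Hypothetical example of what the log endpoint might return for a failed batch:
sample = {
    "id": 3,
    "from": 0,
    "total": 4,
    "log": [
        "stdout:",
        "Traceback (most recent call last):",
        '  File "job.py", line 10, in <module>',
        "ZeroDivisionError: division by zero",
    ],
}
print(extract_error_lines(sample))
```

This is brittle, though, so I would still prefer a supported way to get the failure reason back through Livy itself.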

Regards
Vivek


Sent from my iPhone
