Hi,

I have been using Spark 2.4.5 for the past month. When a structured
streaming query fails, it shows up in the UI as a failed job, but after a
while these failed jobs expire (disappear) from the UI. Is there a setting
that expires failed jobs? I was using Spark 2.2 before this and never saw
this behavior.
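In case it is relevant, spark.ui.retainedJobs and spark.ui.retainedStages
(both default to 1000) are the UI retention configs documented for Spark. I
am not sure they explain what I am seeing, but here is a minimal sketch of
how they would be raised (the app name and values are just illustrative):

    import org.apache.spark.sql.SparkSession

    // Sketch: raise the UI retention limits so old jobs/stages are not
    // evicted as quickly. spark.ui.retainedJobs and spark.ui.retainedStages
    // are documented Spark configs; 5000 is an example value, not a
    // recommendation.
    val spark = SparkSession.builder()
      .appName("my-streaming-app")  // placeholder name
      .config("spark.ui.retainedJobs", "5000")
      .config("spark.ui.retainedStages", "5000")
      .getOrCreate()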

I also tried calling the Spark REST API to list failed jobs, but it
returns an empty result.
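Concretely, the call I am making looks roughly like this (the app id and
driver host/port are placeholders for my actual values; the
/applications/[app-id]/jobs endpoint with the status filter is the one
documented in the Spark monitoring docs):

    import scala.io.Source

    // Sketch: query the driver's monitoring REST API for failed jobs.
    // Replace the app id and host/port with your own.
    val appId = "app-20200401120000-0001"  // placeholder
    val url =
      s"http://localhost:4040/api/v1/applications/$appId/jobs?status=failed"
    val failedJobs = Source.fromURL(url).mkString
    println(failedJobs)  // returns an empty list for me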

Thanks,
Puneet



