Hi,

I ran a task on Spark on YARN and it failed.
All I see is an "executor lost" message from YARNClientScheduler, with no
further details.
(I have read that this error can be related to the
spark.yarn.executor.memoryOverhead setting, and I have already experimented
with that parameter.)
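For reference, this is the kind of adjustment I mean (the value shown is just an example, not necessarily what my job needs):

```shell
# Increase the off-heap overhead YARN reserves per executor (value in MB;
# 1024 here is only an illustrative example)
spark-submit \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --master yarn-client \
  my_job.jar
```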

How can I dig deeper into the log files and find the exact reason? How can I
examine the logs of the failed task?

Unfortunately I don't have access to the Spark UI; I can only use the command
line.
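Is fetching the aggregated container logs with something like the following the right approach? (This assumes log aggregation is enabled on the cluster; the application id is just a placeholder.)

```shell
# Fetch aggregated container logs for a finished application
# (requires yarn.log-aggregation-enable=true on the cluster);
# the id below is a placeholder -- use the real one from "yarn application -list"
yarn logs -applicationId application_1234567890123_0001
```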

Thanks!

Serg.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/log-files-of-failed-task-tp22183.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
