Hello all, I have noticed what I think is erroneous behavior when using the WebUI:
1. Launching the app from Eclipse to a cluster (with 3 workers).
2. Using Spark 0.9.0 (Cloudera distribution 5.0.1).
3. The application makes the workers write to stdout using System.out.println(...).

When the application finishes, the executor summary table looks like:

Executor Summary
ExecutorID  Worker                              Cores  Memory  State   Logs
5           worker-20140717121825-p...com-7078  12     5120    KILLED  stdout stderr
4           worker-20140717121833-f...com-7078  12     5120    KILLED  stdout stderr
3           worker-20140717121833-p...com-7078  12     5120    KILLED  stdout stderr

In this case the ExecutorIDs are 3, 4, and 5, and I cannot see the stdout. When I click stdout, the page is empty because the link points to the wrong executor number. The URL looks like:

http://p....com:18081/logPage/?appId=app-20140723144514-0176&executorId=5&logType=stdout

If I manually change the URL to:

http://p....com:18081/logPage/?appId=app-20140723144514-0176&executorId=2&logType=stdout

then I can see the stdout.

However, I don't see this error if I cancel the Spark application from Eclipse while it is running. In that case the executor summary table looks like:

Executor Summary
ExecutorID  Worker                                                 Cores  Memory  State   Logs
2           worker-20140717121825-phineas-edca.us.oracle.com-7078  12     5120    KILLED  stdout stderr
1           worker-20140717121833-perry-edca.us.oracle.com-7078    12     5120    KILLED  stdout stderr
0           worker-20140717121833-ferb-edca.us.oracle.com-7078     12     5120    KILLED  stdout stderr

In this case the ExecutorIDs are 0, 1, and 2, and I can see the stdout correctly.

Does anyone know what is going on? A related question has been asked on this list already:
http://apache-spark-user-list.1001560.n3.nabble.com/ghost-executor-messing-up-UI-s-stdout-stderr-links-td122.html

Thank you,
Bruno

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-in-History-UI-Seeing-stdout-stderr-tp10540.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
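For anyone hitting the same symptom, the manual workaround described above (editing the executorId query parameter in the log URL by hand) can be scripted. This is only a rough sketch: the hostname below is a placeholder, and you still have to discover the correct executor number yourself (here the working ID happened to be 3 lower than the one in the link, but that offset is just what was observed in this one run, not a general rule):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def rewrite_executor_id(log_url: str, new_id: int) -> str:
    """Return log_url with its executorId query parameter replaced by new_id."""
    parts = urlsplit(log_url)
    # parse_qsl keeps parameter order; dict() preserves it on Python 3.7+
    params = dict(parse_qsl(parts.query))
    params["executorId"] = str(new_id)
    return urlunsplit(parts._replace(query=urlencode(params)))

# Hypothetical worker host, stand-in for the truncated one in the post.
broken = ("http://worker.example.com:18081/logPage/"
          "?appId=app-20140723144514-0176&executorId=5&logType=stdout")
print(rewrite_executor_id(broken, 2))
```

Opening the rewritten URL in a browser (or fetching it with urllib) then shows the stdout that the broken link hid.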