On 27 Apr 2015, at 07:51, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

Spark 1.3

1. View stderr/stdout from the executor in the Web UI: while the job was running
I located the executor I was supposed to look at, but its two log links just
show four special characters in the browser.

2. Tail on Yarn logs:


/apache/hadoop/bin/yarn logs -applicationId application_1429087638744_151059 | less

This threw: "Application has not completed. Logs are only available after an
application completes."


Any other ideas that I can try?


Hadoop 2.6+ has support for streaming the logs of running applications to HDFS.
I don't know whether Spark supports that (I haven't tested it), so I won't give
the details right now.
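If you want to experiment anyway, I believe the NodeManager-side property below
is the one that turns on periodic ("rolling") aggregation for running apps.
Treat it as an untested sketch and check the yarn-default.xml of your Hadoop
version first:

<!-- upload logs of still-running apps roughly every hour (untested sketch) -->
<property>
  <name>yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds</name>
  <value>3600</value>
</property>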

You can go from the RM to the node managers running the containers, and view 
the logs that way.
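As a concrete sketch of that route: the RM's application page tells you the
container ID and the node it ran on, and the NodeManager web UI serves the logs
directly. The host, user, and IDs below are made-up placeholders, and 8042 is
only the default NodeManager web port:

# list the log files of one container via the NodeManager web UI
curl http://some-nodemanager-host:8042/node/containerlogs/container_1429087638744_151059_01_000002/yourusername

# or fetch stderr directly (works in a browser too)
curl http://some-nodemanager-host:8042/node/containerlogs/container_1429087638744_151059_01_000002/yourusername/stderr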


From some other notes of mine:




One configuration option to aid debugging is to tell the node managers to keep
container data around for a short period after containers finish:

<!-- 10 minutes after a failure to see what is left in the directory-->
<property>
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>


You can then retrieve the logs either through the web UI, or by connecting to
the server (usually over ssh) and reading them from the container log directory.
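For the ssh route, something like the following. The hostname is a placeholder,
and the log directory is whatever yarn.nodemanager.log-dirs points at on your
cluster; /var/log/hadoop-yarn/containers is just an assumed example:

# ssh to the node that ran the container (shown in the RM UI)
ssh some-worker-node

# container logs live under <yarn.nodemanager.log-dirs>/<application id>/<container id>/
cd /var/log/hadoop-yarn/containers/application_1429087638744_151059
ls                                                  # one directory per container
tail container_1429087638744_151059_01_000002/stderr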

We also recommend making sure that YARN kills off processes that don't exit
when first asked to:

<!--time before the process gets a -9 -->
<property>
  <name>yarn.nodemanager.sleep-delay-before-sigkill.ms</name>
  <value>30000</value>
</property>
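(That's 30 seconds between the initial SIGTERM and the follow-up SIGKILL, long
enough for a JVM to run its shutdown hooks; the out-of-the-box delay is much
shorter.)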

