Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Lucene-hadoop Wiki" for 
change notification.

The following page has been changed by Amareshwari:
http://wiki.apache.org/lucene-hadoop/HowToDebugMapReducePrograms

------------------------------------------------------------------------------
  
[http://lucene.apache.org/hadoop/api/org/apache/hadoop/filecache/DistributedCache.html#addCacheFile(java.net.URI,%20org.apache.hadoop.conf.Configuration)
 DistributedCache.addCacheFile(URI,conf)] and 
[http://lucene.apache.org/hadoop/api/org/apache/hadoop/filecache/DistributedCache.html#setCacheFiles
 DistributedCache.setCacheFiles(URIs,conf)] where URI is of the form 
"hdfs://host:port/<absolutepath>#<script-name>".
  For Streaming, the file can be added through command line option -cacheFile.
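  As a rough sketch, a streaming job might ship a debug script this way (all paths, host/port, and script names below are illustrative, not from this page):

{{{
# Hypothetical streaming invocation: the fragment after '#' is the
# symlink name the task sees in its working directory.
hadoop jar $HADOOP_HOME/hadoop-streaming.jar \
  -input /user/me/input \
  -output /user/me/output \
  -mapper myMapper.sh \
  -reducer myReducer.sh \
  -cacheFile 'hdfs://host:port/user/me/debug.sh#debugscript'
}}}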
  
+ == Default Behavior ==
+ 
+ For Java programs:
+ Stdout and stderr are shown on the job UI. The stack trace is printed on the 
task diagnostics.
+ 
+ For Pipes:
+ Stdout and stderr are shown on the job UI.
+ A default gdb script is run, which prints info about threads (the thread id 
and the function each thread was running when the task failed) and the stack 
trace at the point of failure.
+ 
+ For Streaming:
+ Stdout and stderr are shown on the job UI.
+ The exception details are shown on task diagnostics.
+ 