I have been using Spark 1.4 for 6 months. Thanks to all the members of this community for your great work. I have a question about a logging issue that I hope can be solved.
The program runs under this configuration: YARN cluster, yarn-client mode.
In Scala, writing code like:

    rdd.map( a => println(a) )

prints the value of a to our console.
However, in Java (1.7), writing:

    rdd.map(new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            System.out.println(a);
            return a;
        }
    });

produces no output in our console.
The configuration is the same.
I have tried this code, but it does not work either:

    rdd.map(new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            org.apache.log4j.Logger log = Logger.getLogger(this.getClass());
            log.info(a);
            log.warn(a);
            log.error(a);
            log.fatal(a);
            return a;
        }
    });
No output from this either:

    final org.apache.log4j.Logger log = Logger.getLogger(this.getClass());
    rdd.map(new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            log.info(a);
            log.warn(a);
            log.error(a);
            log.fatal(a);
            return a;
        }
    });
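
One detail that may matter: map is a lazy transformation, so the function above never runs unless an action is invoked on the result. A minimal sketch using foreach (an action) instead, assuming rdd is an existing JavaRDD<Integer>:

    import org.apache.spark.api.java.function.VoidFunction;

    // foreach is an action, so it actually triggers execution.
    // The println still runs on the executors, though, so in yarn-client
    // mode the output lands in the executors' stdout logs (visible in the
    // YARN web UI), not in the driver console.
    rdd.foreach(new VoidFunction<Integer>() {
        @Override
        public void call(Integer a) throws Exception {
            System.out.println(a);
        }
    });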
It seems that stdout output on the workers is not sent back to our driver. I wonder why this works in Scala but not in Java. Is there a simple way to make the Java version behave like the Scala one?
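
If the goal is just to see the values in the driver console, one workaround would be to bring the data back to the driver first and print there; a minimal sketch, assuming the RDD is small enough to collect:

    import java.util.List;

    // collect() is an action that ships every element back to the driver,
    // so this println runs in the driver JVM and shows up in the console.
    List<Integer> values = rdd.collect();
    for (Integer a : values) {
        System.out.println(a);
    }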
Thanks.
