Oops, the job names have year, month, and day, plus hour and minute. So something like job_201505121130_0001.
Young

On Tue, May 12, 2015 at 12:24 PM, Young Han <[email protected]> wrote:

> Suppose you've created HDFS in ~/. Then the log files are in
> ~/hadoop_data/hadoop_local-YOURUSER/userlogs/job_yyyymmdd_xxxx/attempt_yyyymmdd_xxxx_*/log-file
> where:
>
> - yyyymmdd is the date you started Hadoop
> - xxxx is the job number
> - attempt_* will be different for different workers
> - log-file can be stderr, stdout, or syslog
>
> Typically you want to look at syslog for anything printed using log4j.
> Metrics, if enabled, will show up in stderr.
>
> If you're running distributed, each machine will log things to its local
> disk. You can also get to the logs by going to (assuming your Hadoop files
> are in ~/) ~/hadoop-x.y.z/logs/userlogs: the folders in here contain
> symlinks to the directories above.
>
> Young
>
> On Tue, May 12, 2015 at 11:26 AM, Cheng Wang <[email protected]> wrote:
>
>> Hello,
>>
>> I'm not sure if this is the right place to ask... I am new to Giraph.
>> I found that in the example code provided by the Giraph package, much of
>> the output is written via log4j, e.g. in SimplePageRankComputation.java.
>>
>> I am wondering where the log4j log files are stored.
>>
>> Thanks
>> Cheng
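The per-attempt path layout described above can be assembled mechanically. A minimal stdlib-only Java sketch: the helper `syslogPath` is hypothetical, and the home directory, user name, and attempt ID are made-up illustration values (real attempt IDs are assigned by Hadoop):

```java
public class UserlogPath {
    // Hypothetical helper: joins the path components described in the
    // thread above. All argument values in main() are made up.
    static String syslogPath(String home, String user,
                             String jobId, String attemptId) {
        return String.join("/",
                home,
                "hadoop_data",
                "hadoop_local-" + user,
                "userlogs",
                jobId,
                attemptId,
                "syslog");
    }

    public static void main(String[] args) {
        System.out.println(syslogPath("/home/alice", "alice",
                "job_201505121130_0001",
                "attempt_201505121130_0001_m_000000_0"));
        // -> /home/alice/hadoop_data/hadoop_local-alice/userlogs/job_201505121130_0001/attempt_201505121130_0001_m_000000_0/syslog
    }
}
```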

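The job-name timestamp from the first message can be sketched with plain Java; the assumption here is that the embedded stamp is simply the start time formatted as yyyyMMddHHmm, which matches the example ID job_201505121130_0001:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Locale;

public class JobIdTimestamp {
    public static void main(String[] args) {
        // Assumption: the job ID embeds the start time formatted as
        // yyyyMMddHHmm (year, month, day, hour, minute).
        Calendar started = Calendar.getInstance();
        started.set(2015, Calendar.MAY, 12, 11, 30);
        String stamp = new SimpleDateFormat("yyyyMMddHHmm", Locale.US)
                .format(started.getTime());
        System.out.println("job_" + stamp + "_0001"); // job_201505121130_0001
    }
}
```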