Thank you Mike,

In fact, running any sample code gives the same error. Yes, I had already
realized that the output folder MUST be deleted before running the job, and I
delete it on every run. As for the /tmp directory, I issue the following
command and get this output:

ah...@aai:/usr/local/share/hadoop-0.20.1$ bin/hadoop dfs -ls
Found 1 items
drwxr-xr-x   - ahmad supergroup          0 2009-11-12 12:50
/user/ahmad/input
ah...@aai:/usr/local/share/hadoop-0.20.1$

I can also see a /tmp directory with subdirectories in it; please let me know
if I am not checking it correctly. As I understand it, the job tries to write
some log files but cannot find them under
/home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000004_0/

Is there any parameter I can set to fix this, or is it a directory-permissions
issue?
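If it helps, here is the quick sanity check I am running (the path is taken
from the error message above; the write-test file name is just illustrative):

```shell
# Check that the task log directory exists and is writable by the job user.
# LOGDIR matches the path in the error message; override it as needed.
LOGDIR="${LOGDIR:-$HOME/hadoop-dev/logs/userlogs}"
mkdir -p "$LOGDIR"                        # the TaskTracker must be able to create subdirs here
ls -ld "$LOGDIR"                          # confirm owner and rwx bits
touch "$LOGDIR/.write-test" && rm "$LOGDIR/.write-test" && echo writable
df -h "$LOGDIR" | tail -1                 # rule out a full disk
```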

Thanks,

--
Ahmad


> 09/11/11 14:52:15 INFO mapred.JobClient:  map 0% reduce 0%
> 09/11/11 14:52:22 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000004_0, Status : FAILED
> java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000004_0/log.tmp does not exist.
>    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)

On Thu, Nov 12, 2009 at 12:40 PM, Mike Kendall <[email protected]> wrote:

> My first guess is that your tmp directory isn't set up correctly.  Also, I
> don't know about WordCountv2 but the original wordcount needed an output
> directory passed along with an input directory (the directory has to not
> exist when you start the job).
>
> -mike
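
Following up on the tmp-directory guess: in 0.20, hadoop.tmp.dir defaults to
/tmp/hadoop-${user.name} and several other paths derive from it, so pointing it
at a directory the job user owns is a common fix. A sketch (the /home/ahmad
path is only an example, not a recommendation):

```shell
# Write an example core-site.xml overriding hadoop.tmp.dir (a real 0.20
# property; the value shown is illustrative).
cat > /tmp/core-site.xml.example <<'EOF'
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/ahmad/hadoop-tmp</value>
  </property>
</configuration>
EOF
cat /tmp/core-site.xml.example
```

After editing conf/core-site.xml this way, the daemons need a restart to pick
up the change.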
>
> On Wed, Nov 11, 2009 at 5:33 PM, Ahmad Ali Iqbal
> <[email protected]>wrote:
>
> > Hi all,
> >
> > I am a new user of hadoop and trying to run the WordCount v2 example in a
> > pseudo distributed operation given at
> > http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3A+WordCount+v2.0
> > but
> > when I run it I get errors saying log.tmp does not exist. In order to
> > set up pseudo-distributed mode, I have followed the instructions at
> > http://hadoop.apache.org/common/docs/current/quickstart.html and modified
> > the configuration files. ssh localhost works fine (to be safe, I tested
> > running the program both with an ssh connection to localhost and without
> > one). Please see the output below:
> >
> > ah...@aai:~/hadoop-dev$ ls
> > bin  dfs  logs  wcv1  wcv2  WordCountv1.java  WordCountv2.java
> > ah...@aai:~/hadoop-dev$ ls -l wcv2/
> > total 24
> > drwxr-xr-x 2 ahmad ahmad 4096 2009-11-11 14:47 input
> > drwxr-xr-x 2 ahmad ahmad 4096 2009-11-09 16:05 output
> > -rw-r--r-- 1 ahmad ahmad   12 2009-11-03 15:08 patterns.txt
> > drwxr-xr-x 2 ahmad ahmad 4096 2009-11-03 13:47 wordcount_classes
> > -rw-r--r-- 1 ahmad ahmad 5536 2009-11-06 10:37 wordcount.jar
> > ah...@aai:~/hadoop-dev$ bin/hadoop namenode -format
> > 09/11/11 14:49:13 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = aai/127.0.1.1
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 0.20.1
> > STARTUP_MSG:   build =
> > http://svn.apache.org/repos/asf/hadoop/common/tags/release-0.20.1-rc1 -r
> > 810220; compiled by 'oom' on Tue Sep  1 20:55:56 UTC 2009
> > ************************************************************/
> > 09/11/11 14:49:14 INFO namenode.FSNamesystem:
> > fsOwner=ahmad,ahmad,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> > 09/11/11 14:49:14 INFO namenode.FSNamesystem: supergroup=supergroup
> > 09/11/11 14:49:14 INFO namenode.FSNamesystem: isPermissionEnabled=true
> > 09/11/11 14:49:14 INFO common.Storage: Image file of size 95 saved in 0
> > seconds.
> > 09/11/11 14:49:14 INFO common.Storage: Storage directory
> > dfs/hadoop-ahmad/dfs/name has been successfully formatted.
> > 09/11/11 14:49:14 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at aai/127.0.1.1
> > ************************************************************/
> > ah...@aai:~/hadoop-dev$ bin/start-all.sh
> > starting namenode, logging to
> > /home/ahmad/hadoop-dev/bin/../logs/hadoop-ahmad-namenode-aai.out
> > ah...@localhost's password:
> > localhost: starting datanode, logging to
> > /home/ahmad/hadoop-dev/bin/../logs/hadoop-ahmad-datanode-aai.out
> > ah...@localhost's password:
> > localhost: starting secondarynamenode, logging to
> > /home/ahmad/hadoop-dev/bin/../logs/hadoop-ahmad-secondarynamenode-aai.out
> > starting jobtracker, logging to
> > /home/ahmad/hadoop-dev/bin/../logs/hadoop-ahmad-jobtracker-aai.out
> > ah...@localhost's password:
> > localhost: starting tasktracker, logging to
> > /home/ahmad/hadoop-dev/bin/../logs/hadoop-ahmad-tasktracker-aai.out
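
(Side note on the password prompts above: the quickstart's passwordless-ssh
setup avoids them. A sketch using stock OpenSSH commands; it runs against a
scratch directory here so it is side-effect free, whereas in practice the
files go under ~/.ssh:)

```shell
# Generate a passphrase-less key and authorize it -- the standard
# passwordless-ssh setup from the Hadoop quickstart.
D=$(mktemp -d)                              # stand-in for ~/.ssh in this sketch
ssh-keygen -q -t rsa -N '' -f "$D/id_rsa"   # no passphrase
cat "$D/id_rsa.pub" >> "$D/authorized_keys"
chmod 600 "$D/authorized_keys"
ls -l "$D"
```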
> > ah...@aai:~/hadoop-dev$ bin/hadoop fs -put wcv2/input input
> > ah...@aai:~/hadoop-dev$ bin/hadoop jar wcv2/wordcount.jar WordCountv2 input output
> > 09/11/11 14:52:14 INFO mapred.FileInputFormat: Total input paths to process : 2
> > 09/11/11 14:52:14 INFO mapred.JobClient: Running job: job_200911111450_0001
> > 09/11/11 14:52:15 INFO mapred.JobClient:  map 0% reduce 0%
> > 09/11/11 14:52:22 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000004_0, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000004_0/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:23 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_0&filter=stdout
> > 09/11/11 14:52:23 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_0&filter=stderr
> > 09/11/11 14:52:29 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000004_1, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000004_1/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:29 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_1&filter=stdout
> > 09/11/11 14:52:29 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_1&filter=stderr
> > 09/11/11 14:52:35 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000004_2, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000004_2/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:35 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_2&filter=stdout
> > 09/11/11 14:52:35 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000004_2&filter=stderr
> > 09/11/11 14:52:47 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000003_0, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000003_0/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:47 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_0&filter=stdout
> > 09/11/11 14:52:47 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_0&filter=stderr
> > 09/11/11 14:52:53 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000003_1, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000003_1/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:53 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_1&filter=stdout
> > 09/11/11 14:52:53 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_1&filter=stderr
> > 09/11/11 14:52:59 INFO mapred.JobClient: Task Id : attempt_200911111450_0001_m_000003_2, Status : FAILED
> > java.io.FileNotFoundException: File /home/ahmad/hadoop-dev/logs/userlogs/attempt_200911111450_0001_m_000003_2/log.tmp does not exist.
> >    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
> >    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
> >    at org.apache.hadoop.fs.RawLocalFileSystem.rename(RawLocalFileSystem.java:253)
> >    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:406)
> >    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:193)
> >    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:230)
> >    at org.apache.hadoop.mapred.Child.main(Child.java:138)
> >
> > 09/11/11 14:52:59 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_2&filter=stdout
> > 09/11/11 14:52:59 WARN mapred.JobClient: Error reading task output http://aai:50060/tasklog?plaintext=true&taskid=attempt_200911111450_0001_m_000003_2&filter=stderr
> > 09/11/11 14:53:04 INFO mapred.JobClient: Job complete: job_200911111450_0001
> > 09/11/11 14:53:04 INFO mapred.JobClient: Counters: 0
> > Exception in thread "main" java.io.IOException: Job failed!
> >    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
> >    at WordCountv2.run(WordCountv2.java:116)
> >    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >    at WordCountv2.main(WordCountv2.java:121)
> >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >    at java.lang.reflect.Method.invoke(Method.java:597)
> >    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > ah...@aai:~/hadoop-dev$
> >
> >
> > Would be grateful for your help.
> >
> >
> > Regards,
> > Ahmad
> >
>
