Hi,

I am a new user to Hadoop. Can anyone tell me how to make the basic
configuration settings to run Hadoop on a single machine?

I have copied the contents of my hadoop-site.xml below. Apart from the
changes in hadoop-site.xml, I haven't changed anything else.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>
    <name>fs.default.name</name>
    <value>(here I have given the IP address of my machine):9000</value>
    <description>
        The name of the default file system. Either the literal string
        "local" or a host:port for NDFS.
    </description>
</property>

<property>
    <name>mapred.job.tracker</name>
    <value>(here I have given the IP address of my machine):9001</value>
    <description>
        The host and port that the MapReduce job tracker runs at. If
        "local", then jobs are run in-process as a single map and
        reduce task.
    </description>
</property>

<property>
    <name>mapred.tasktracker.tasks.maximum</name>
    <value>2</value>
    <description>
        The maximum number of tasks that will be run simultaneously by
        a task tracker. This should be adjusted according to the heap size
        per task, the amount of RAM available, and CPU consumption of each
task.
    </description>
</property>

<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx200m</value>
    <description>
        You can specify other Java options for each map or reduce task here,
        but most likely you will want to adjust the heap size.
    </description>
</property>

<property>
    <name>dfs.name.dir</name>
    <value>/home/Hadoop/hadoop-install/hadoop-0.13.1/filesystem</value>
</property>

<property>
    <name>dfs.data.dir</name>
    <value>/home/Hadoop/hadoop-install/hadoop-0.13.1/filesystem/userdata</value>
</property>

<property>
    <name>mapred.system.dir</name>
    <value>/home/Hadoop/hadoop-install/hadoop-0.13.1/filesystem/mapreduce/system</value>
</property>

<property>
    <name>mapred.local.dir</name>
    <value>/home/Hadoop/hadoop-install/hadoop-0.13.1/filesystem/mapreduce/local</value>
</property>

<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
</configuration>
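For reference, after editing the config I initialised and started things
roughly as follows. The script and command names below are my understanding
of the standard 0.13.1 layout (bin/ scripts shipped with the release), so
please correct me if I have any of this wrong:

```shell
# Run from the hadoop-0.13.1 install directory (paths assumed from my setup above)
bin/hadoop namenode -format        # initialise the HDFS name directory (dfs.name.dir)
bin/start-all.sh                   # start namenode, datanode, jobtracker and tasktracker
bin/hadoop dfs -put input input    # copy a local 'input' directory into HDFS
bin/hadoop jar hadoop-0.13.1-examples.jar wordcount input output
```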

When I try to run the wordcount example in 'hadoop-0.13.1-examples.jar', I
get the following output and stack trace:

07/09/05 15:38:25 INFO mapred.FileInputFormat: Total input paths to process : 1
07/09/05 15:38:25 INFO mapred.JobClient: Running job: job_0002
07/09/05 15:38:26 INFO mapred.JobClient:  map 0% reduce 0%
07/09/05 15:38:29 INFO mapred.JobClient: Task Id : task_0002_m_000000_0, Status : FAILED
07/09/05 15:38:29 INFO mapred.JobClient: Task Id : task_0002_m_000001_0, Status : FAILED
07/09/05 15:38:32 INFO mapred.JobClient: Task Id : task_0002_m_000000_1, Status : FAILED
07/09/05 15:38:32 INFO mapred.JobClient: Task Id : task_0002_m_000001_1, Status : FAILED
07/09/05 15:38:35 INFO mapred.JobClient: Task Id : task_0002_m_000000_2, Status : FAILED
07/09/05 15:38:35 INFO mapred.JobClient: Task Id : task_0002_m_000001_2, Status : FAILED
07/09/05 15:38:38 INFO mapred.JobClient:  map 100% reduce 100%
java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
        at org.apache.hadoop.examples.WordCount.main(WordCount.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:69)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:140)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:40)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
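The JobClient output above only says that the map tasks failed, not why. If
I understand the log layout correctly, the daemon and per-task logs should
be under logs/ in the install directory; this is where I have been looking
(the exact paths are my guess for 0.13.1, so please correct me if they are
wrong):

```shell
# Guessed log locations for 0.13.1 -- correct me if these are wrong
tail logs/hadoop-*-tasktracker-*.log            # tasktracker log may show why tasks fail
cat logs/userlogs/task_0002_m_000000_0/stderr   # per-task stderr, if present
```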


Please let me know where I have gone wrong.