Two nodes: "ibmT43" and "gentoo1".

ibmT43 = NameNode + JobTracker   +   TaskTracker + DataNode
gentoo1 = TaskTracker + DataNode
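
For this layout, the standard start scripts read conf/masters and conf/slaves; assuming the usual convention (these files are not shown in the original post), they would look like:

```
conf/masters:
ibmT43

conf/slaves:
ibmT43
gentoo1
```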

===conf/hadoop-site.xml=== (identical on both hosts):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
        <property>
                <name>fs.default.name</name>
                <value>hdfs://ibmT43:9000/</value>
        </property>
        <property>
                <name>mapred.job.tracker</name>
                <value>ibmT43:9001</value>
        </property>
        <property>
                <name>dfs.replication</name>
                <value>2</value>
        </property>
        <property>
                <name>mapred.task.timeout</name>
                <value>20000</value>
        </property>

        <property>
                <name>hadoop.pipes.executable</name>
                <value>/bin/logparser</value>
        </property>

        <property>
          <name>mapred.map.tasks</name>
          <value>10</value>
        </property>
        <property>
          <name>mapred.reduce.tasks</name>
          <value>1</value>
        </property>
</configuration>
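
Note that mapred.task.timeout is in milliseconds, so 20000 means the TaskTracker kills any task that stays silent for 20 seconds, which matches the "failed to report status for 22 seconds. Killing!" messages in the log below. A pipes executable therefore has to report progress from inside long-running map() calls. A minimal sketch of such a mapper, assuming the standard Hadoop Pipes C++ API (it needs the libhadooppipes headers to build; LogMapper/LogReducer are hypothetical names, not the actual logparser source):

```cpp
#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"

// Hypothetical mapper: the key point is the context.progress() call,
// which pings the TaskTracker so the task is not killed for
// exceeding mapred.task.timeout.
class LogMapper : public HadoopPipes::Mapper {
public:
  LogMapper(HadoopPipes::TaskContext& context) {}
  void map(HadoopPipes::MapContext& context) {
    // ... potentially slow per-record work here ...
    context.emit(context.getInputKey(), context.getInputValue());
    context.progress();  // report liveness to the framework
  }
};

// Hypothetical identity reducer, just to make the factory complete.
class LogReducer : public HadoopPipes::Reducer {
public:
  LogReducer(HadoopPipes::TaskContext& context) {}
  void reduce(HadoopPipes::ReduceContext& context) {
    while (context.nextValue()) {
      context.emit(context.getInputKey(), context.getInputValue());
    }
  }
};

int main(int argc, char** argv) {
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<LogMapper, LogReducer>());
}
```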

---------------------------

/etc/hosts is also identical on both hosts:

192.168.2.1    ibmT43
192.168.2.22   gentoo1
# 127.0.0.1     localhost
# uncommenting "localhost" makes no difference
--------------------------

The binary hdfs://bin/logparser is correct; it has worked in the past.

--------------------------

bin/hadoop pipes -input /input -output /output1 -conf 123test.xml

===123test.xml===:
<?xml version="1.0"?>
<configuration>

<!--
<property>
  <name>mapred.reduce.tasks</name>
  <value>2</value>
</property>
-->

<property>
  <name>hadoop.pipes.java.recordreader</name>
  <value>true</value>
</property>

<property>
  <name>hadoop.pipes.java.recordwriter</name>
  <value>true</value>
</property>

</configuration>

-------------------------------------

Running job:
had...@ibmt43 ~/hadoop-0.18.3 $ bin/hadoop pipes -input /input -output /output1 -conf 123test.xml
09/06/18 22:38:50 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
09/06/18 22:38:50 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
09/06/18 22:38:50 INFO mapred.FileInputFormat: Total input paths to process : 4
09/06/18 22:38:50 INFO mapred.FileInputFormat: Total input paths to process : 4
09/06/18 22:38:51 INFO mapred.JobClient: Running job: job_200906182236_0001
09/06/18 22:38:52 INFO mapred.JobClient:  map 0% reduce 0%
09/06/18 22:39:03 INFO mapred.JobClient:  map 5% reduce 0%
09/06/18 22:39:12 INFO mapred.JobClient:  map 11% reduce 0%
09/06/18 22:39:23 INFO mapred.JobClient:  map 5% reduce 0%
09/06/18 22:39:23 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000002_0, Status : FAILED
Task attempt_200906182236_0001_m_000002_0 failed to report status for 22 seconds. Killing!
09/06/18 22:39:24 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000003_0, Status : FAILED
Task attempt_200906182236_0001_m_000003_0 failed to report status for 22 seconds. Killing!
09/06/18 22:39:33 INFO mapred.JobClient:  map 0% reduce 0%
09/06/18 22:39:33 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000000_0, Status : FAILED
Task attempt_200906182236_0001_m_000000_0 failed to report status for 23 seconds. Killing!
attempt_200906182236_0001_m_000000_0: Hadoop Pipes Exception: write error to file: Connection reset by peer at SerialUtils.cc:129 in virtual void HadoopUtils::FileOutStream::write(const void*, size_t)
09/06/18 22:39:33 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000001_0, Status : FAILED
Task attempt_200906182236_0001_m_000001_0 failed to report status for 23 seconds. Killing!
09/06/18 22:39:34 INFO mapred.JobClient:  map 2% reduce 0%
09/06/18 22:39:38 INFO mapred.JobClient:  map 5% reduce 0%
09/06/18 22:39:43 INFO mapred.JobClient:  map 8% reduce 0%
09/06/18 22:39:48 INFO mapred.JobClient:  map 12% reduce 0%
09/06/18 22:39:54 INFO mapred.JobClient:  map 9% reduce 0%
09/06/18 22:39:54 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000004_0, Status : FAILED
Task attempt_200906182236_0001_m_000004_0 failed to report status for 22 seconds. Killing!
attempt_200906182236_0001_m_000004_0: Hadoop Pipes Exception: write error to file: Connection reset by peer at SerialUtils.cc:129 in virtual void HadoopUtils::FileOutStream::write(const void*, size_t)
09/06/18 22:39:59 INFO mapred.JobClient:  map 6% reduce 0%
09/06/18 22:39:59 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000005_0, Status : FAILED
Task attempt_200906182236_0001_m_000005_0 failed to report status for 22 seconds. Killing!
attempt_200906182236_0001_m_000005_0: Hadoop Pipes Exception: write error to file: Connection reset by peer at SerialUtils.cc:129 in virtual void HadoopUtils::FileOutStream::write(const void*, size_t)
09/06/18 22:40:03 INFO mapred.JobClient: Task Id : attempt_200906182236_0001_m_000003_1, Status : FAILED
Task attempt_200906182236_0001_m_000003_1 failed to report status for 20 seconds. Killing!

-------------------------
Setting mapred.task.timeout to 20 sec, 50 sec, or even 999999 sec makes no difference; the tasks still fail the same way.
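
The per-attempt stderr usually shows why the child process died before it could report status. Assuming the default 0.18 log layout (paths and attempt id here are illustrative; adjust for your HADOOP_LOG_DIR), on the TaskTracker host that ran the failing attempt:

```
# stdout/stderr/syslog of each task attempt live under the TaskTracker's log dir
ls logs/userlogs/attempt_200906182236_0001_m_000000_0/
cat logs/userlogs/attempt_200906182236_0001_m_000000_0/stderr
```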
----------------------------

Thank you.
