thanks much!

Checked out the latest trunk, regenerated non-sparse (dense) matrices, and
performed the addition; works like a charm, nice, it's an interesting
framework... can't wait for the BSP-based graph framework :)

I assume that in C = A + B, C is discarded at the end of the computation
since this is a POC. I'm re-testing to see whether C = A + B is in fact
B = A + B; a scan of the HBase table data should reveal this (see the
decoding sketch further down).

In fact, the structure in HBase before the addition is:


HBase Shell; enter 'help<RETURN>' for list of supported commands.
Version: 0.20.3, rUnknown, Sat Feb  6 19:18:26 EST 2010
hbase(main):001:0> list
DenseMatrix_randfppzt

DenseMatrix_randihcnl

DenseMatrix_randiwgls

DenseMatrix_randnamfs

SparseMatrix_randrjyas

SparseMatrix_randzvyfi

hama.admin.table

7 row(s) in 0.1300 seconds
hbase(main):002:0>

with the latest matrices being:

hbase(main):002:0> scan "hama.admin.table"
ROW                          COLUMN+CELL

 matrixA                     column=path:, timestamp=1266338252994,
value=DenseMatrix_randfppzt
 matrixB                     column=path:, timestamp=1266338349346,
value=DenseMatrix_randnamfs
2 row(s) in 0.0420 seconds
hbase(main):003:0>

Dumping DenseMatrix_randfppzt and DenseMatrix_randnamfs before the
computation (they were generated with the commands below):

#$HAMA_HOME/bin/hama examples rand -m 10 -r 10 5 5 30.5% matrixA
$HAMA_HOME/bin/hama examples rand -m 10 -r 10 5 5 100% matrixA

#$HAMA_HOME/bin/hama examples rand -m 10 -r 10 5 5 30.5% matrixB
$HAMA_HOME/bin/hama examples rand -m 10 -r 10 5 5 100% matrixB

hbase(main):004:0> scan "DenseMatrix_randfppzt"
ROW                          COLUMN+CELL

 000000000000000             column=column:0, timestamp=1266338206942,
value=\x3F\xB5\xD4\xB2u>\xF9\x40
 000000000000000             column=column:1, timestamp=1266338206942,
value=\x3F\xE7\x5D\xA3\x95\x80\x89E
 000000000000000             column=column:2, timestamp=1266338206942,
value=\x3F\xE6 m\xC7\xFE\x81:
 000000000000000             column=column:3, timestamp=1266338206942,
value=\x3F\xE8\x88\x3BXT\xBC\xB6
 000000000000000             column=column:4, timestamp=1266338206942,
value=\x3F\x90\xA8\xCE\xC8\x86\xEA\xC0
 000000000000001             column=column:0, timestamp=1266338204247,
value=\x3F\xEEm\xE7\xE5So\x5B
 000000000000001             column=column:1, timestamp=1266338204247,
value=\x3F\xEA\x80\x25\xFCb\x82\xF8
 000000000000001             column=column:2, timestamp=1266338204247,
value=\x3F\xD2sxc\xDC\x9B\x92
 000000000000001             column=column:3, timestamp=1266338204247,
value=\x3F\xEC\xA5\xC3\x8A\xFE\xBE\xE1
 000000000000001             column=column:4, timestamp=1266338204247,
value=\x3F\xE9C\xE7\x9D\xA0\xB5\xD1
 000000000000002             column=column:0, timestamp=1266338211747,
value=\x3F\xE6l\xF0\xDE\xF2\xDC\x1F
 000000000000002             column=column:1, timestamp=1266338211747,
value=\x3F\xE2\xC9\x40\x40\x23Q\x95
 000000000000002             column=column:2, timestamp=1266338211747,
value=\x3F\xDA\xA1L-fr\x18
 000000000000002             column=column:3, timestamp=1266338211747,
value=\x3F\xEAJI2\xA3q\x80
 000000000000002             column=column:4, timestamp=1266338211747,
value=\x3F\xD4\xEBw\x10\xE3\xE0\xF2
 000000000000003             column=column:0, timestamp=1266338219666,
value=\x3F\xD1j\x04\xC4D\x00T
 000000000000003             column=column:1, timestamp=1266338219666,
value=\x3F\xDA\xC1\x07 \x87\x82\x24
 000000000000003             column=column:2, timestamp=1266338219666,
value=\x3F\x9Ag\x1Ae\x8F\xA8\xE0
 000000000000003             column=column:3, timestamp=1266338219666,
value=\x3F\xE7\xC7\x1E\xEF8\x8F\x2B
 000000000000003             column=column:4, timestamp=1266338219666,
value=\x3F\xDF\x97\xEB\xF4\x90=\xA6
 000000000000004             column=column:0, timestamp=1266338221964,
value=\x3F\xEAp\x8Ew\xC3\x85\x0A
 000000000000004             column=column:1, timestamp=1266338221964,
value=\x3F\xD4\x84\x28\xDD\xEC\x13\x16
 000000000000004             column=column:2, timestamp=1266338221964,
value=\x3F\xD1\xFCX\xC9-\xEF\xB0
 000000000000004             column=column:3, timestamp=1266338221964,
value=\x3F\xCA\x8F\xA1\x0A\x86\x0FT
 000000000000004             column=column:4, timestamp=1266338221964,
value=\x3F\xD1:\xBF\xA2\x00\xF2\x98
 metadata                    column=aliase:name, timestamp=1266338252991,
value=matrixA
 metadata                    column=attribute:columns,
timestamp=1266338159915, value=\x00\x00\x00\x05
 metadata                    column=attribute:reference,
timestamp=1266338159914, value=\x00\x00\x00\x01
 metadata                    column=attribute:rows, timestamp=1266338159915,
value=\x00\x00\x00\x05
 metadata                    column=attribute:type, timestamp=1266338252991,
value=DenseMatrix
6 row(s) in 0.2040 seconds

and

hbase(main):006:0> scan "DenseMatrix_randnamfs"
ROW                          COLUMN+CELL

 000000000000000             column=column:0, timestamp=1266338303225,
value=\x3F\xC2\xEB\xE3\xE6X/\x94
 000000000000000             column=column:1, timestamp=1266338303225,
value=\x3F\xC9\xA4\x24\x24\x10I\x24
 000000000000000             column=column:2, timestamp=1266338303225,
value=\x3F\xD2\x9C\x88\x28\xD4\xB4\xE4
 000000000000000             column=column:3, timestamp=1266338303225,
value=\x3F\xAB\xF7\xC0E\x8E8\xD0
 000000000000000             column=column:4, timestamp=1266338303225,
value=\x3F\xD9\xA8\xA7p\x1F\xE0\x94
 000000000000001             column=column:0, timestamp=1266338301140,
value=\x3F\xAA\xC2ryG\xAA\x60
 000000000000001             column=column:1, timestamp=1266338301140,
value=\x3F\xEF\x0B\x13\xB3l\x27\x88
 000000000000001             column=column:2, timestamp=1266338301140,
value=\x3F\xEF<\x81\x2Bv\x9F\x18
 000000000000001             column=column:3, timestamp=1266338301140,
value=\x3F\xEB\xB7B\x12\xA6\xA9p
 000000000000001             column=column:4, timestamp=1266338301140,
value=\x3F\xED\xB0\x12\x0EI\xF7Q
 000000000000002             column=column:0, timestamp=1266338312605,
value=\x3F\xEA\x19:\x94\xA0g\x92
 000000000000002             column=column:1, timestamp=1266338312605,
value=\x3F\xE5Ac\xF2\xB7\x8C\xE6
 000000000000002             column=column:2, timestamp=1266338312605,
value=\x3F\xE03\x0C\xFD\x0BP\xF1
 000000000000002             column=column:3, timestamp=1266338312605,
value=\x3F\xEA\xB6W\xDC\x9F\x81\xA5
 000000000000002             column=column:4, timestamp=1266338312605,
value=\x3F\xE4\xB8i\x81KAD
 000000000000003             column=column:0, timestamp=1266338311654,
value=\x3F\xEB\xF1\xAC\x03\x7F\xF0\x99
 000000000000003             column=column:1, timestamp=1266338311654,
value=\x3F\xA6\xF1N\xD1\xE6\xA8
 000000000000003             column=column:2, timestamp=1266338311654,
value=\x3F\xEE\xFD\xEA\xB13\x88
 000000000000003             column=column:3, timestamp=1266338311654,
value=\x3F\xEEo\x3F\x9B\x3BZ\xCB
 000000000000003             column=column:4, timestamp=1266338311654,
value=\x3F\xB67\x00\xEAH\xE1p
 000000000000004             column=column:0, timestamp=1266338317482,
value=\x3F\xC8\xF9\x06\xF4\x7B\x0BL
 000000000000004             column=column:1, timestamp=1266338317482,
value=\x3F\xD0\xB4:rl\x83\xD4
 000000000000004             column=column:2, timestamp=1266338317482,
value=\x3F\xE6\xD3\xE7\x01\xCDB\xF3
 000000000000004             column=column:3, timestamp=1266338317482,
value=\x3F\xE0\xF8\xF7\x3B\xA2\xF7\xB9
 000000000000004             column=column:4, timestamp=1266338317482,
value=\x3F\xE2b\x9C\x8B\x17\xC6\xF6
 metadata                    column=aliase:name, timestamp=1266338349344,
value=matrixB
 metadata                    column=attribute:columns,
timestamp=1266338257506, value=\x00\x00\x00\x05
 metadata                    column=attribute:reference,
timestamp=1266338257505, value=\x00\x00\x00\x01
 metadata                    column=attribute:rows, timestamp=1266338257506,
value=\x00\x00\x00\x05
 metadata                    column=attribute:type, timestamp=1266338349344,
value=DenseMatrix

and the listing after the computation shows the same table structure,
although I don't know whether B = A + B (that would require decoding the
byte representation; a decoding sketch follows the listing below):

$HAMA_HOME/bin/hama examples add matrixA matrixB

hbase(main):001:0> list
DenseMatrix_randfppzt

DenseMatrix_randihcnl

DenseMatrix_randiwgls

DenseMatrix_randnamfs

SparseMatrix_randrjyas

SparseMatrix_randzvyfi

hama.admin.table

7 row(s) in 0.1390 seconds
hbase(main):002:0>
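
To actually check whether matrixB (DenseMatrix_randnamfs) now holds A + B,
the column:* cells can be decoded back into doubles. Below is a minimal
sketch against the HBase 0.20 client API; it assumes the cell values are
8-byte big-endian IEEE 754 doubles (which the \x3F... prefixes above suggest
for random values in [0, 1)), and the DumpMatrix class is just an
illustrative helper, not something in the Hama tree:

import java.nio.ByteBuffer;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class DumpMatrix {
  public static void main(String[] args) throws Exception {
    // args[0] = backing table name, e.g. "DenseMatrix_randnamfs"
    HTable table = new HTable(new HBaseConfiguration(), args[0]);
    ResultScanner scanner = table.getScanner(new Scan());
    try {
      for (Result row : scanner) {
        String key = Bytes.toString(row.getRow());
        if ("metadata".equals(key)) {
          continue; // skip the aliase:/attribute: bookkeeping row
        }
        StringBuilder line = new StringBuilder(key);
        for (int j = 0; j < 5; j++) { // 5 columns in this test matrix
          byte[] cell = row.getValue(Bytes.toBytes("column"),
              Bytes.toBytes(Integer.toString(j)));
          // cells look like 8-byte big-endian IEEE 754 doubles
          line.append('\t').append(
              cell == null ? 0.0 : ByteBuffer.wrap(cell).getDouble());
        }
        System.out.println(line);
      }
    } finally {
      scanner.close();
    }
  }
}

Running this against DenseMatrix_randnamfs before and after the add, and
diffing the two outputs against the decoded DenseMatrix_randfppzt values,
should show whether B ended up holding A + B.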

Also, no result output was generated on the console, and the result table
was purged at the job's end...

C = A + B
10/02/16 11:44:28 INFO matrix.AbstractMatrix: Initializing the matrix
storage.
10/02/16 11:44:31 INFO matrix.AbstractMatrix: Create Matrix
DenseMatrix_randtonup
10/02/16 11:44:31 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
10/02/16 11:44:31 WARN mapred.JobClient: No job jar file set.  User classes
may not be found. See JobConf(Class) or JobConf#setJar(String).
10/02/16 11:44:32 INFO mapred.JobClient: Running job: job_201002161111_0008
10/02/16 11:44:33 INFO mapred.JobClient:  map 0% reduce 0%
10/02/16 11:44:43 INFO mapred.JobClient:  map 100% reduce 0%
10/02/16 11:44:55 INFO mapred.JobClient:  map 100% reduce 100%
10/02/16 11:44:57 INFO mapred.JobClient: Job complete: job_201002161111_0008
10/02/16 11:44:57 INFO mapred.JobClient: Counters: 15
10/02/16 11:44:57 INFO mapred.JobClient:   Job Counters
10/02/16 11:44:57 INFO mapred.JobClient:     Launched reduce tasks=1
10/02/16 11:44:57 INFO mapred.JobClient:     Launched map tasks=1
10/02/16 11:44:57 INFO mapred.JobClient:     Data-local map tasks=1
10/02/16 11:44:57 INFO mapred.JobClient:   FileSystemCounters
10/02/16 11:44:57 INFO mapred.JobClient:     FILE_BYTES_READ=601
10/02/16 11:44:57 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=1234
10/02/16 11:44:57 INFO mapred.JobClient:   Map-Reduce Framework
10/02/16 11:44:57 INFO mapred.JobClient:     Reduce input groups=5
10/02/16 11:44:57 INFO mapred.JobClient:     Combine output records=0
10/02/16 11:44:57 INFO mapred.JobClient:     Map input records=5
10/02/16 11:44:57 INFO mapred.JobClient:     Reduce shuffle bytes=0
10/02/16 11:44:57 INFO mapred.JobClient:     Reduce output records=5
10/02/16 11:44:57 INFO mapred.JobClient:     Spilled Records=10
10/02/16 11:44:57 INFO mapred.JobClient:     Map output bytes=585
10/02/16 11:44:57 INFO mapred.JobClient:     Combine input records=0
10/02/16 11:44:57 INFO mapred.JobClient:     Map output records=5
10/02/16 11:44:57 INFO mapred.JobClient:     Reduce input records=5
10/02/16 11:45:00 INFO client.HBaseAdmin: Disabled DenseMatrix_randtonup
10/02/16 11:45:00 INFO client.HBaseAdmin: Deleted DenseMatrix_randtonup
10/02/16 11:45:00 INFO zookeeper.ZooKeeper: Closing session:
0x126d78c7fa5004c
10/02/16 11:45:00 INFO zookeeper.ClientCnxn: Closing ClientCnxn for session:
0x126d78c7fa5004c
10/02/16 11:45:00 INFO zookeeper.ClientCnxn: Exception while closing send
thread for session 0x126d78c7fa5004c : Read error rc = -1
java.nio.DirectByteBuffer[pos=0 lim=4 cap=4]
10/02/16 11:45:00 WARN zookeeper.ClientCnxn: Ignoring exception during
shutdown input
java.net.SocketException: Socket is not connected
at sun.nio.ch.SocketChannelImpl.shutdown(Native Method)
at sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:640)
at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
at org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:984)
10/02/16 11:45:00 INFO zookeeper.ClientCnxn: Disconnecting ClientCnxn for
session: 0x126d78c7fa5004c
10/02/16 11:45:00 INFO zookeeper.ZooKeeper: Session: 0x126d78c7fa5004c
closed
10/02/16 11:45:00 INFO zookeeper.ClientCnxn: EventThread shut down
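
If the result should survive instead of being dropped like
DenseMatrix_randtonup above, one option might be to drive the addition from
the Java matrix API and register an alias for the result. This is only a
rough sketch, assuming the org.apache.hama.matrix API exposes
DenseMatrix.random, add and save roughly as in the project examples of that
era; exact signatures may differ on trunk:

import org.apache.hama.HamaConfiguration;
import org.apache.hama.matrix.DenseMatrix;
import org.apache.hama.matrix.Matrix;

public class AddAndKeep {
  public static void main(String[] args) throws Exception {
    HamaConfiguration conf = new HamaConfiguration();

    // two 5 x 5 random dense operands, analogous to matrixA / matrixB above
    DenseMatrix a = DenseMatrix.random(conf, 5, 5);
    DenseMatrix b = DenseMatrix.random(conf, 5, 5);

    // runs the MapReduce addition job over the backing HBase tables
    Matrix c = a.add(b);

    // registering an alias should keep the result table around instead of
    // letting the example driver disable/delete it at the end of the job
    c.save("matrixC");
  }
}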




Best regards



On Tue, Feb 16, 2010 at 10:25 AM, Edward J. Yoon <[email protected]> wrote:

> Sorry, I just noticed that the addition of sparse matrices is not
> implemented yet. I'll fix it soon.
>
> Currently, only the addition of dense matrices will run.
>
> # bin/hama examples rand 100 100 100% testA
> # bin/hama examples rand 100 100 100% testB
> # bin/hama examples add testA testB
>
> On Tue, Feb 16, 2010 at 11:07 PM, Edward J. Yoon <[email protected]>
> wrote:
> > Hi, Andrew
> >
> >>> java.lang.IndexOutOfBoundsException: v1.size != v2.size (25 != 23)
> >
> > Eh, it seems to be an unexpected exception. Let me check it out.
> >
> >>> Another question is about lack of "breadth first search" sample,
> >>> is this a sub-project of next release?
> >
> > Using M/R iterations, you can simply implement it --
> > http://blog.udanax.org/2009/02/breadth-first-search-mapreduce.html
> >
> > But an M/R iterative job is *awfully* slow, so I think it's not worth
> > implementing.
> >
> > Instead, we are trying to implement a BSP-based graph computing framework
> > to improve the performance of graph algorithms on Hadoop.
> >
> > On Tue, Feb 16, 2010 at 10:41 PM, Andrew Milkowski
> > <[email protected]> wrote:
> >> Hi there
> >>
> >> Have following working setup:
> >>
> >> hadoop 0.20.1
> >> hbase 0.20.3
> >> hama latest SVN trunk version
> >>
> >> Notes:
> >>
> >> had to replace and rebuild hama with new hbase lib (upgraded to version
> >> 0.20.3)
> >>
> >> 1. successfully ran the hama rand example to generate a random matrix
> >>
> >> $HAMA_HOME/bin/hama examples rand -m 10 -r 10 100 100 30.5% matrixA
> >>
> >> Note: verified hbase table was generated (via shell list and scan)
> >>
> >> 2. moved on to other examples, generated 2 random matrices:
> >>
> >> $HAMA_HOME/bin/hama examples rand -m 10 -r 10 100 100 30.5% matrixA
> >> $HAMA_HOME/bin/hama examples rand -m 10 -r 10 100 100 30.5% matrixB
> >>
> >> then an attempt at distributed addition:
> >>
> >> $HAMA_HOME/bin/hama examples add1  matrixA matrixB
> >>
> >>
> >> 3. in a similar vein, tried multi, norms and similarity (all generated
> >> exceptions)
> >>
> >> Exceptions appear to be hama code related (and not environmentally
> specific)
> >>
> >> The culprit could be improper usage of the input matrix data for add,
> >> multi, norms and similarity.
> >>
> >> Another question is about the lack of a "breadth first search" sample;
> >> is this a sub-project of the next release?
> >>
> >> thanks
> >>
> >> Below is the console stack trace of bash-3.2$ ./run_examples.sh (Note:
> >> ignore the "INFO zookeeper.ClientCnxn: Exception while closing send thread
> >> for session..." exception; it is a known INFO-level exception in the
> >> zookeeper client/server socket code)
> >>
> >>
> >> --------------------------------------------
> >>
> >> run_examples.sh:
> >>
> >> #!/bin/bash
> >>
> >> export HAMA_JAR="${HAMA_HOME}/hama-0.2.0-dev-examples.jar"
> >>
> >> # cygwin path translation
> >> #if $cygwin; then
> >> #  HAMA_CLASSPATH=`cygpath -p -w "$HAMA_CLASSPATH"`
> >> #fi
> >>
> >> $HAMA_HOME/bin/hama examples rand -m 10 -r 10 100 100 30.5% matrixA
> >>
> >> $HAMA_HOME/bin/hama examples rand -m 10 -r 10 100 100 30.5% matrixB
> >>
> >> $HAMA_HOME/bin/hama examples add matrixA matrixB
> >>
> >> exit $?
> >>
> >>
> >>
> >>
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:zookeeper.version=3.2.1-808558, built on 08/27/2009 18:48
> GMT
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client environment:
> host.name
> >> =192.168.0.11
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:java.version=1.6.0_17
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:java.vendor=Apple Inc.
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.home=/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.class.path=/opt/local/src/hama/current/bin/../conf:/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home/lib/tools.jar:/opt/local/src/hama/current/bin/../build/classes:/opt/local/src/hama/current/bin/../build/test:/opt/local/src/hama/current/bin/../build:/opt/local/src/hama/current/bin/../hama-0.2.0-dev-examples.jar:/opt/local/src/hama/current/bin/../hama-0.2.0-dev.jar:/opt/local/src/hama/current/bin/../lib/AgileJSON-2009-03-30.jar:/opt/local/src/hama/current/bin/../lib/commons-cli-2.0-SNAPSHOT.jar:/opt/local/src/hama/current/bin/../lib/commons-el-from-jetty-5.1.4.jar:/opt/local/src/hama/current/bin/../lib/commons-httpclient-3.0.1.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-api-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-math-1.1.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-core.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3.jar:/opt/local/src/hama/current/bin/../lib/jasper-compiler-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/jasper-runtime-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/javacc.jar:/opt/local/src/hama/current/bin/../lib/jetty-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jetty-util-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jruby-complete-1.2.0.jar:/opt/local/src/hama/current/bin/../lib/json.jar:/opt/local/src/hama/current/bin/../lib/junit-3.8.1.jar:/opt/local/src/hama/current/bin/../lib/libthrift-r771587.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.13.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.15.jar:/opt/local/src/hama/current/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/xmlenc-0.52.jar:/opt/local/src/hama/current/bin/../lib/zookeeper-3.2.1.jar:/opt/local/src/hama/current/bin/../lib/jetty-ext/*.jar:/opt/local/src/hama/current/bin/../lib/findbugs/annotations.jar:/opt/local/src/hama/current/bin/../lib/findbugs/ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-analysis-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-commons-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-tree-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-util-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-xml-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/bcel.jar:/opt/local/src/hama/current/bin/../lib/findbugs/dom4j-full.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs-ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugsGUI.jar:/opt/local/src/hama/current/bin/../lib/findbugs/jsr305.jar:/opt/local/src/hama/current/bin/../lib/findbugs/plugin/coreplugin.jar:/opt/local/src/hadoop/current/conf:/opt/local/src/hadoop/current/conf
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.library.path=.:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.io.tmpdir=/var/folders/pM/pMTPzO70Hc4Gd2FB24T-dU+++TM/-Tmp-/
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:java.compiler=<NA>
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client environment:os.name
> =Mac
> >> OS X
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:os.arch=x86_64
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:os.version=10.6.2
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client environment:
> user.name
> >> =hadoop
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:user.home=/Users/hadoop
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Client
> >> environment:user.dir=/opt/local/hadoop/jobs/hama
> >> 10/02/16 08:28:29 INFO zookeeper.ZooKeeper: Initiating client
> connection,
> >> connectString=localhost:2181 sessionTimeout=60000
> >>
> watcher=org.apache.hadoop.hbase.client.hconnectionmanager$clientzkwatc...@2efb56b1
> >> 10/02/16 08:28:29 INFO zookeeper.ClientCnxn:
> zookeeper.disableAutoWatchReset
> >> is false
> >> 10/02/16 08:28:29 INFO zookeeper.ClientCnxn: Attempting connection to
> server
> >> localhost/127.0.0.1:2181
> >> 10/02/16 08:28:29 INFO zookeeper.ClientCnxn: Priming connection to
> >> java.nio.channels.SocketChannel[connected
> >> local=/127.0.0.1:58238remote=localhost/
> >> 127.0.0.1:2181]
> >> 10/02/16 08:28:29 INFO zookeeper.ClientCnxn: Server connection
> successful
> >> 10/02/16 08:28:30 INFO matrix.AbstractMatrix: Initializing the matrix
> >> storage.
> >> 10/02/16 08:28:32 INFO matrix.AbstractMatrix: Create Matrix
> >> SparseMatrix_randliqns
> >> 10/02/16 08:28:32 INFO matrix.AbstractMatrix: Create the 100 * 100
> random
> >> matrix : SparseMatrix_randliqns
> >> Wrote input for Map #0
> >> Wrote input for Map #1
> >> Wrote input for Map #2
> >> Wrote input for Map #3
> >> Wrote input for Map #4
> >> Wrote input for Map #5
> >> Wrote input for Map #6
> >> Wrote input for Map #7
> >> Wrote input for Map #8
> >> Wrote input for Map #9
> >> 10/02/16 08:28:33 WARN mapred.JobClient: Use GenericOptionsParser for
> >> parsing the arguments. Applications should implement Tool for the same.
> >> 10/02/16 08:28:33 WARN mapred.JobClient: No job jar file set.  User
> classes
> >> may not be found. See JobConf(Class) or JobConf#setJar(String).
> >> 10/02/16 08:28:33 INFO input.FileInputFormat: Total input paths to
> process :
> >> 10
> >> 10/02/16 08:28:34 INFO mapred.JobClient: Running job:
> job_201002160823_0001
> >> 10/02/16 08:28:35 INFO mapred.JobClient:  map 0% reduce 0%
> >> 10/02/16 08:28:51 INFO mapred.JobClient:  map 20% reduce 0%
> >> 10/02/16 08:29:00 INFO mapred.JobClient:  map 40% reduce 0%
> >> 10/02/16 08:29:06 INFO mapred.JobClient:  map 40% reduce 1%
> >> 10/02/16 08:29:09 INFO mapred.JobClient:  map 60% reduce 2%
> >> 10/02/16 08:29:15 INFO mapred.JobClient:  map 80% reduce 2%
> >> 10/02/16 08:29:18 INFO mapred.JobClient:  map 80% reduce 3%
> >> 10/02/16 08:29:21 INFO mapred.JobClient:  map 100% reduce 4%
> >> 10/02/16 08:29:24 INFO mapred.JobClient:  map 100% reduce 5%
> >> 10/02/16 08:29:27 INFO mapred.JobClient:  map 100% reduce 12%
> >> 10/02/16 08:29:30 INFO mapred.JobClient:  map 100% reduce 20%
> >> 10/02/16 08:29:36 INFO mapred.JobClient:  map 100% reduce 23%
> >> 10/02/16 08:29:39 INFO mapred.JobClient:  map 100% reduce 33%
> >> 10/02/16 08:29:42 INFO mapred.JobClient:  map 100% reduce 40%
> >> 10/02/16 08:29:48 INFO mapred.JobClient:  map 100% reduce 43%
> >> 10/02/16 08:29:51 INFO mapred.JobClient:  map 100% reduce 60%
> >> 10/02/16 08:30:01 INFO mapred.JobClient:  map 100% reduce 70%
> >> 10/02/16 08:30:04 INFO mapred.JobClient:  map 100% reduce 73%
> >> 10/02/16 08:30:07 INFO mapred.JobClient:  map 100% reduce 80%
> >> 10/02/16 08:30:12 INFO mapred.JobClient:  map 100% reduce 100%
> >> 10/02/16 08:30:14 INFO mapred.JobClient: Job complete:
> job_201002160823_0001
> >> 10/02/16 08:30:14 INFO mapred.JobClient: Counters: 16
> >> 10/02/16 08:30:14 INFO mapred.JobClient:   Job Counters
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Launched reduce tasks=10
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Launched map tasks=10
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Data-local map tasks=10
> >> 10/02/16 08:30:14 INFO mapred.JobClient:   FileSystemCounters
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     FILE_BYTES_READ=42372
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     HDFS_BYTES_READ=1080
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=87764
> >> 10/02/16 08:30:14 INFO mapred.JobClient:   Map-Reduce Framework
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Reduce input groups=100
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Combine output records=0
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Map input records=10
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Reduce shuffle bytes=42463
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Reduce output records=100
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Spilled Records=200
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Map output bytes=41912
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Combine input records=0
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Map output records=100
> >> 10/02/16 08:30:14 INFO mapred.JobClient:     Reduce input records=100
> >> 10/02/16 08:30:14 INFO zookeeper.ZooKeeper: Closing session:
> >> 0x126d6f5b8a90003
> >> 10/02/16 08:30:14 INFO zookeeper.ClientCnxn: Closing ClientCnxn for
> session:
> >> 0x126d6f5b8a90003
> >> 10/02/16 08:30:14 INFO zookeeper.ClientCnxn: Exception while closing
> send
> >> thread for session 0x126d6f5b8a90003 : Read error rc = -1
> >> java.nio.DirectByteBuffer[pos=0 lim=4 cap=4]
> >> 10/02/16 08:30:14 WARN zookeeper.ClientCnxn: Ignoring exception during
> >> shutdown input
> >> java.net.SocketException: Socket is not connected
> >> at sun.nio.ch.SocketChannelImpl.shutdown(Native Method)
> >>  at
> sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:640)
> >> at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
> >>  at
> org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
> >> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:984)
> >> 10/02/16 08:30:14 INFO zookeeper.ClientCnxn: Disconnecting ClientCnxn
> for
> >> session: 0x126d6f5b8a90003
> >> 10/02/16 08:30:14 INFO zookeeper.ZooKeeper: Session: 0x126d6f5b8a90003
> >> closed
> >> 10/02/16 08:30:14 INFO zookeeper.ClientCnxn: EventThread shut down
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:zookeeper.version=3.2.1-808558, built on 08/27/2009 18:48
> GMT
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client environment:
> host.name
> >> =192.168.0.11
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:java.version=1.6.0_17
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:java.vendor=Apple Inc.
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.home=/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.class.path=/opt/local/src/hama/current/bin/../conf:/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home/lib/tools.jar:/opt/local/src/hama/current/bin/../build/classes:/opt/local/src/hama/current/bin/../build/test:/opt/local/src/hama/current/bin/../build:/opt/local/src/hama/current/bin/../hama-0.2.0-dev-examples.jar:/opt/local/src/hama/current/bin/../hama-0.2.0-dev.jar:/opt/local/src/hama/current/bin/../lib/AgileJSON-2009-03-30.jar:/opt/local/src/hama/current/bin/../lib/commons-cli-2.0-SNAPSHOT.jar:/opt/local/src/hama/current/bin/../lib/commons-el-from-jetty-5.1.4.jar:/opt/local/src/hama/current/bin/../lib/commons-httpclient-3.0.1.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-api-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-math-1.1.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-core.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3.jar:/opt/local/src/hama/current/bin/../lib/jasper-compiler-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/jasper-runtime-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/javacc.jar:/opt/local/src/hama/current/bin/../lib/jetty-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jetty-util-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jruby-complete-1.2.0.jar:/opt/local/src/hama/current/bin/../lib/json.jar:/opt/local/src/hama/current/bin/../lib/junit-3.8.1.jar:/opt/local/src/hama/current/bin/../lib/libthrift-r771587.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.13.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.15.jar:/opt/local/src/hama/current/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/xmlenc-0.52.jar:/opt/local/src/hama/current/bin/../lib/zookeeper-3.2.1.jar:/opt/local/src/hama/current/bin/../lib/jetty-ext/*.jar:/opt/local/src/hama/current/bin/../lib/findbugs/annotations.jar:/opt/local/src/hama/current/bin/../lib/findbugs/ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-analysis-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-commons-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-tree-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-util-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-xml-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/bcel.jar:/opt/local/src/hama/current/bin/../lib/findbugs/dom4j-full.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs-ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugsGUI.jar:/opt/local/src/hama/current/bin/../lib/findbugs/jsr305.jar:/opt/local/src/hama/current/bin/../lib/findbugs/plugin/coreplugin.jar:/opt/local/src/hadoop/current/conf:/opt/local/src/hadoop/current/conf
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.library.path=.:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.io.tmpdir=/var/folders/pM/pMTPzO70Hc4Gd2FB24T-dU+++TM/-Tmp-/
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:java.compiler=<NA>
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client environment:os.name
> =Mac
> >> OS X
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:os.arch=x86_64
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:os.version=10.6.2
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client environment:
> user.name
> >> =hadoop
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:user.home=/Users/hadoop
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Client
> >> environment:user.dir=/opt/local/hadoop/jobs/hama
> >> 10/02/16 08:30:16 INFO zookeeper.ZooKeeper: Initiating client
> connection,
> >> connectString=localhost:2181 sessionTimeout=60000
> >>
> watcher=org.apache.hadoop.hbase.client.hconnectionmanager$clientzkwatc...@2efb56b1
> >> 10/02/16 08:30:16 INFO zookeeper.ClientCnxn:
> zookeeper.disableAutoWatchReset
> >> is false
> >> 10/02/16 08:30:16 INFO zookeeper.ClientCnxn: Attempting connection to
> server
> >> localhost/fe80:0:0:0:0:0:0:1%1:2181
> >> 10/02/16 08:30:16 INFO zookeeper.ClientCnxn: Priming connection to
> >> java.nio.channels.SocketChannel[connected
> local=/fe80:0:0:0:0:0:0:1%1:58457
> >> remote=localhost/fe80:0:0:0:0:0:0:1%1:2181]
> >> 10/02/16 08:30:16 INFO zookeeper.ClientCnxn: Server connection
> successful
> >> 10/02/16 08:30:17 INFO matrix.AbstractMatrix: Initializing the matrix
> >> storage.
> >> 10/02/16 08:30:19 INFO matrix.AbstractMatrix: Create Matrix
> >> SparseMatrix_randflbeh
> >> 10/02/16 08:30:19 INFO matrix.AbstractMatrix: Create the 100 * 100
> random
> >> matrix : SparseMatrix_randflbeh
> >> Wrote input for Map #0
> >> Wrote input for Map #1
> >> Wrote input for Map #2
> >> Wrote input for Map #3
> >> Wrote input for Map #4
> >> Wrote input for Map #5
> >> Wrote input for Map #6
> >> Wrote input for Map #7
> >> Wrote input for Map #8
> >> Wrote input for Map #9
> >> 10/02/16 08:30:20 WARN mapred.JobClient: Use GenericOptionsParser for
> >> parsing the arguments. Applications should implement Tool for the same.
> >> 10/02/16 08:30:20 WARN mapred.JobClient: No job jar file set.  User
> classes
> >> may not be found. See JobConf(Class) or JobConf#setJar(String).
> >> 10/02/16 08:30:20 INFO input.FileInputFormat: Total input paths to
> process :
> >> 10
> >> 10/02/16 08:30:21 INFO mapred.JobClient: Running job:
> job_201002160823_0002
> >> 10/02/16 08:30:22 INFO mapred.JobClient:  map 0% reduce 0%
> >> 10/02/16 08:30:33 INFO mapred.JobClient:  map 20% reduce 0%
> >> 10/02/16 08:30:42 INFO mapred.JobClient:  map 40% reduce 0%
> >> 10/02/16 08:30:45 INFO mapred.JobClient:  map 40% reduce 1%
> >> 10/02/16 08:30:51 INFO mapred.JobClient:  map 60% reduce 2%
> >> 10/02/16 08:30:57 INFO mapred.JobClient:  map 80% reduce 2%
> >> 10/02/16 08:31:00 INFO mapred.JobClient:  map 80% reduce 4%
> >> 10/02/16 08:31:03 INFO mapred.JobClient:  map 100% reduce 4%
> >> 10/02/16 08:31:06 INFO mapred.JobClient:  map 100% reduce 12%
> >> 10/02/16 08:31:12 INFO mapred.JobClient:  map 100% reduce 30%
> >> 10/02/16 08:31:22 INFO mapred.JobClient:  map 100% reduce 43%
> >> 10/02/16 08:31:25 INFO mapred.JobClient:  map 100% reduce 50%
> >> 10/02/16 08:31:31 INFO mapred.JobClient:  map 100% reduce 53%
> >> 10/02/16 08:31:34 INFO mapred.JobClient:  map 100% reduce 63%
> >> 10/02/16 08:31:37 INFO mapred.JobClient:  map 100% reduce 70%
> >> 10/02/16 08:31:43 INFO mapred.JobClient:  map 100% reduce 73%
> >> 10/02/16 08:31:46 INFO mapred.JobClient:  map 100% reduce 83%
> >> 10/02/16 08:31:49 INFO mapred.JobClient:  map 100% reduce 90%
> >> 10/02/16 08:31:55 INFO mapred.JobClient:  map 100% reduce 93%
> >> 10/02/16 08:31:58 INFO mapred.JobClient:  map 100% reduce 100%
> >> 10/02/16 08:32:00 INFO mapred.JobClient: Job complete:
> job_201002160823_0002
> >> 10/02/16 08:32:00 INFO mapred.JobClient: Counters: 16
> >> 10/02/16 08:32:00 INFO mapred.JobClient:   Job Counters
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Launched reduce tasks=10
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Launched map tasks=10
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Data-local map tasks=10
> >> 10/02/16 08:32:00 INFO mapred.JobClient:   FileSystemCounters
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     FILE_BYTES_READ=42512
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     HDFS_BYTES_READ=1080
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=88044
> >> 10/02/16 08:32:00 INFO mapred.JobClient:   Map-Reduce Framework
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Reduce input groups=100
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Combine output records=0
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Map input records=10
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Reduce shuffle bytes=42603
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Reduce output records=100
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Spilled Records=200
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Map output bytes=42052
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Combine input records=0
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Map output records=100
> >> 10/02/16 08:32:00 INFO mapred.JobClient:     Reduce input records=100
> >> 10/02/16 08:32:00 INFO zookeeper.ZooKeeper: Closing session:
> >> 0x126d6f5b8a9000e
> >> 10/02/16 08:32:00 INFO zookeeper.ClientCnxn: Closing ClientCnxn for
> session:
> >> 0x126d6f5b8a9000e
> >> 10/02/16 08:32:00 INFO zookeeper.ClientCnxn: Exception while closing
> send
> >> thread for session 0x126d6f5b8a9000e : Read error rc = -1
> >> java.nio.DirectByteBuffer[pos=0 lim=4 cap=4]
> >> 10/02/16 08:32:00 WARN zookeeper.ClientCnxn: Ignoring exception during
> >> shutdown input
> >> java.net.SocketException: Socket is not connected
> >> at sun.nio.ch.SocketChannelImpl.shutdown(Native Method)
> >>  at
> sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:640)
> >> at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
> >>  at
> org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
> >> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:984)
> >> 10/02/16 08:32:00 INFO zookeeper.ClientCnxn: Disconnecting ClientCnxn
> for
> >> session: 0x126d6f5b8a9000e
> >> 10/02/16 08:32:00 INFO zookeeper.ZooKeeper: Session: 0x126d6f5b8a9000e
> >> closed
> >> 10/02/16 08:32:00 INFO zookeeper.ClientCnxn: EventThread shut down
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:zookeeper.version=3.2.1-808558, built on 08/27/2009 18:48
> GMT
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client environment:
> host.name
> >> =192.168.0.11
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:java.version=1.6.0_17
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:java.vendor=Apple Inc.
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.home=/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.class.path=/opt/local/src/hama/current/bin/../conf:/System/Library/Frameworks/JavaVM.framework/Versions/1.6.0/Home/lib/tools.jar:/opt/local/src/hama/current/bin/../build/classes:/opt/local/src/hama/current/bin/../build/test:/opt/local/src/hama/current/bin/../build:/opt/local/src/hama/current/bin/../hama-0.2.0-dev-examples.jar:/opt/local/src/hama/current/bin/../hama-0.2.0-dev.jar:/opt/local/src/hama/current/bin/../lib/AgileJSON-2009-03-30.jar:/opt/local/src/hama/current/bin/../lib/commons-cli-2.0-SNAPSHOT.jar:/opt/local/src/hama/current/bin/../lib/commons-el-from-jetty-5.1.4.jar:/opt/local/src/hama/current/bin/../lib/commons-httpclient-3.0.1.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-logging-api-1.0.4.jar:/opt/local/src/hama/current/bin/../lib/commons-math-1.1.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-core.jar:/opt/local/src/hama/current/bin/../lib/hadoop-0.20.1-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3-test.jar:/opt/local/src/hama/current/bin/../lib/hbase-0.20.3.jar:/opt/local/src/hama/current/bin/../lib/jasper-compiler-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/jasper-runtime-5.5.12.jar:/opt/local/src/hama/current/bin/../lib/javacc.jar:/opt/local/src/hama/current/bin/../lib/jetty-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jetty-util-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/jruby-complete-1.2.0.jar:/opt/local/src/hama/current/bin/../lib/json.jar:/opt/local/src/hama/current/bin/../lib/junit-3.8.1.jar:/opt/local/src/hama/current/bin/../lib/libthrift-r771587.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.13.jar:/opt/local/src/hama/current/bin/../lib/log4j-1.2.15.jar:/opt/local/src/hama/current/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/local/src/hama/current/bin/../lib/xmlenc-0.52.jar:/opt/local/src/hama/current/bin/../lib/zookeeper-3.2.1.jar:/opt/local/src/hama/current/bin/../lib/jetty-ext/*.jar:/opt/local/src/hama/current/bin/../lib/findbugs/annotations.jar:/opt/local/src/hama/current/bin/../lib/findbugs/ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-analysis-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-commons-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-tree-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-util-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/asm-xml-3.0.jar:/opt/local/src/hama/current/bin/../lib/findbugs/bcel.jar:/opt/local/src/hama/current/bin/../lib/findbugs/dom4j-full.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs-ant.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugs.jar:/opt/local/src/hama/current/bin/../lib/findbugs/findbugsGUI.jar:/opt/local/src/hama/current/bin/../lib/findbugs/jsr305.jar:/opt/local/src/hama/current/bin/../lib/findbugs/plugin/coreplugin.jar:/opt/local/src/hadoop/current/conf:/opt/local/src/hadoop/current/conf
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.library.path=.:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >>
> environment:java.io.tmpdir=/var/folders/pM/pMTPzO70Hc4Gd2FB24T-dU+++TM/-Tmp-/
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:java.compiler=<NA>
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client environment:os.name
> =Mac
> >> OS X
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:os.arch=x86_64
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:os.version=10.6.2
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client environment:
> user.name
> >> =hadoop
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:user.home=/Users/hadoop
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Client
> >> environment:user.dir=/opt/local/hadoop/jobs/hama
> >> 10/02/16 08:32:02 INFO zookeeper.ZooKeeper: Initiating client
> connection,
> >> connectString=localhost:2181 sessionTimeout=60000
> >>
> watcher=org.apache.hadoop.hbase.client.hconnectionmanager$clientzkwatc...@76f8968f
> >> 10/02/16 08:32:02 INFO zookeeper.ClientCnxn:
> zookeeper.disableAutoWatchReset
> >> is false
> >> 10/02/16 08:32:02 INFO zookeeper.ClientCnxn: Attempting connection to
> server
> >> localhost/0:0:0:0:0:0:0:1:2181
> >> 10/02/16 08:32:02 INFO zookeeper.ClientCnxn: Priming connection to
> >> java.nio.channels.SocketChannel[connected local=/0:0:0:0:0:0:0:1%0:58679
> >> remote=localhost/0:0:0:0:0:0:0:1:2181]
> >> 10/02/16 08:32:02 INFO zookeeper.ClientCnxn: Server connection
> successful
> >> C = A + B
> >> 10/02/16 08:32:05 INFO matrix.AbstractMatrix: Initializing the matrix
> >> storage.
> >> 10/02/16 08:32:07 INFO matrix.AbstractMatrix: Create Matrix
> >> DenseMatrix_randwhqtl
> >> 10/02/16 08:32:07 WARN mapred.JobClient: Use GenericOptionsParser for
> >> parsing the arguments. Applications should implement Tool for the same.
> >> 10/02/16 08:32:07 WARN mapred.JobClient: No job jar file set.  User
> classes
> >> may not be found. See JobConf(Class) or JobConf#setJar(String).
> >> 10/02/16 08:32:08 INFO mapred.JobClient: Running job:
> job_201002160823_0003
> >> 10/02/16 08:32:09 INFO mapred.JobClient:  map 0% reduce 0%
> >> 10/02/16 08:32:23 INFO mapred.JobClient: Task Id :
> >> attempt_201002160823_0003_m_000000_0, Status : FAILED
> >> java.lang.IndexOutOfBoundsException: v1.size != v2.size (25 != 23)
> >> at
> >>
> org.apache.hama.matrix.AbstractVector.checkComformantSize(AbstractVector.java:92)
> >>  at org.apache.hama.matrix.DenseVector.add(DenseVector.java:150)
> >> at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:35)
> >>  at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:18)
> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:583)
> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> >>  at org.apache.hadoop.mapred.Child.main(Child.java:170)
> >>
> >> 10/02/16 08:32:33 INFO mapred.JobClient: Task Id :
> >> attempt_201002160823_0003_m_000000_1, Status : FAILED
> >> java.lang.IndexOutOfBoundsException: v1.size != v2.size (25 != 23)
> >> at
> >>
> org.apache.hama.matrix.AbstractVector.checkComformantSize(AbstractVector.java:92)
> >>  at org.apache.hama.matrix.DenseVector.add(DenseVector.java:150)
> >> at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:35)
> >>  at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:18)
> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:583)
> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> >>  at org.apache.hadoop.mapred.Child.main(Child.java:170)
> >>
> >> 10/02/16 08:32:42 INFO mapred.JobClient: Task Id :
> >> attempt_201002160823_0003_m_000000_2, Status : FAILED
> >> java.lang.IndexOutOfBoundsException: v1.size != v2.size (25 != 23)
> >> at
> >>
> org.apache.hama.matrix.AbstractVector.checkComformantSize(AbstractVector.java:92)
> >>  at org.apache.hama.matrix.DenseVector.add(DenseVector.java:150)
> >> at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:35)
> >>  at
> >>
> org.apache.hama.matrix.algebra.MatrixAdditionMap.map(MatrixAdditionMap.java:18)
> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:583)
> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> >>  at org.apache.hadoop.mapred.Child.main(Child.java:170)
> >>
> >> 10/02/16 08:32:54 INFO mapred.JobClient: Job complete:
> job_201002160823_0003
> >> 10/02/16 08:32:54 INFO mapred.JobClient: Counters: 3
> >> 10/02/16 08:32:54 INFO mapred.JobClient:   Job Counters
> >> 10/02/16 08:32:54 INFO mapred.JobClient:     Launched map tasks=4
> >> 10/02/16 08:32:54 INFO mapred.JobClient:     Data-local map tasks=4
> >> 10/02/16 08:32:54 INFO mapred.JobClient:     Failed map tasks=1
> >> 10/02/16 08:32:56 INFO client.HBaseAdmin: Disabled DenseMatrix_randwhqtl
> >> 10/02/16 08:32:56 INFO client.HBaseAdmin: Deleted DenseMatrix_randwhqtl
> >> 10/02/16 08:32:56 INFO zookeeper.ZooKeeper: Closing session:
> >> 0x126d6f5b8a90019
> >> 10/02/16 08:32:56 INFO zookeeper.ClientCnxn: Closing ClientCnxn for
> session:
> >> 0x126d6f5b8a90019
> >> 10/02/16 08:32:56 INFO zookeeper.ClientCnxn: Exception while closing
> send
> >> thread for session 0x126d6f5b8a90019 : Read error rc = -1
> >> java.nio.DirectByteBuffer[pos=0 lim=4 cap=4]
> >> 10/02/16 08:32:56 WARN zookeeper.ClientCnxn: Ignoring exception during
> >> shutdown input
> >> java.net.SocketException: Socket is not connected
> >> at sun.nio.ch.SocketChannelImpl.shutdown(Native Method)
> >>  at
> sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:640)
> >> at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
> >>  at
> org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
> >> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:984)
> >> 10/02/16 08:32:56 INFO zookeeper.ClientCnxn: Disconnecting ClientCnxn
> for
> >> session: 0x126d6f5b8a90019
> >> 10/02/16 08:32:56 INFO zookeeper.ZooKeeper: Session: 0x126d6f5b8a90019
> >> closed
> >> 10/02/16 08:32:56 INFO zookeeper.ClientCnxn: EventThread shut down
> >>  run (rand, rand, add)
> >>
> >
> >
> >
> > --
> > Best Regards, Edward J. Yoon @ NHN, corp.
> > [email protected]
> > http://blog.udanax.org
> >
>
>
>
> --
> Best Regards, Edward J. Yoon @ NHN, corp.
> [email protected]
> http://blog.udanax.org
>
