[ https://issues.apache.org/jira/browse/HAMA-443?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13148162#comment-13148162 ]

Edward J. Yoon commented on HAMA-443:
-------------------------------------

With these args, I received the errors below:

{code}
# core/bin/hama jar examples/target/hama-examples-0.4.0-incubating-SNAPSHOT.jar 
sssp Klewno result /user/root/data/adjacencylist.seq2
Single Source Shortest Path Example:
<Startvertex name> <optional: output path> <optional: path to own adjacency 
list textfile!> <optional: output enabled: true/false>
Using new output folder: result
Setting start vertex to Klewno!
11/11/11 09:23:23 INFO bsp.FileInputFormat: Total input paths to process : 1
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.32.152:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.95.222:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.141.17:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.101.200:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.70.34:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.158.254:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.152.146:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.27.99:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.14.150:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.86.139:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.55.193:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.160.225:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.173.74:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.13.215:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.42.79:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.173.104:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.214.25:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.85.3:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.202.200:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.85.124:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.239.220:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.127.214:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.61.161:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.83.107:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.227.16:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.232.99:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.100.220:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.84.54:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.106.192:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.48.35:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.16.27:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.48.195:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.88.244:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.115.175:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.244.64:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.180.215:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.148.185:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.127.105:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.26.30:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.68.55:50010
11/11/11 09:23:23 INFO net.NetworkTopology: Adding a new node: 
/default-rack/172.27.223.141:50010
11/11/11 09:23:23 INFO bsp.FileInputFormat: Total # of splits: 200
11/11/11 09:23:23 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
11/11/11 09:23:23 INFO compress.CodecPool: Got brand-new decompressor
11/11/11 09:23:24 INFO compress.CodecPool: Got brand-new decompressor
11/11/11 09:23:24 INFO compress.CodecPool: Got brand-new decompressor
11/11/11 09:23:25 INFO compress.CodecPool: Got brand-new decompressor
11/11/11 09:23:26 WARN hdfs.DFSClient: DFSOutputStream ResponseProcessor 
exception  for block blk_961295701580962452_2536java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:180)
        at java.io.DataInputStream.readLong(DataInputStream.java:399)
        at 
org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:119)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2424)

11/11/11 09:23:26 WARN hdfs.DFSClient: Error Recovery for block 
blk_961295701580962452_2536 bad datanode[0] 172.27.32.152:50010
11/11/11 09:23:26 WARN hdfs.DFSClient: Error Recovery for block 
blk_961295701580962452_2536 in pipeline 172.27.32.152:50010, 
172.27.88.244:50010, 172.27.52.13:50010: bad datanode 172.27.32.152:50010
11/11/11 09:23:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:26 INFO hdfs.DFSClient: Abandoning block 
blk_3183301347197780505_2536
11/11/11 09:23:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:26 INFO hdfs.DFSClient: Abandoning block 
blk_5552423712400540134_2536
11/11/11 09:23:27 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:27 INFO hdfs.DFSClient: Abandoning block 
blk_-3695961061153052510_2536
11/11/11 09:23:27 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:27 INFO hdfs.DFSClient: Abandoning block 
blk_-1959712604538439989_2536
11/11/11 09:23:27 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:27 INFO hdfs.DFSClient: Abandoning block 
blk_5723845720432146331_2536
11/11/11 09:23:32 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:32 INFO hdfs.DFSClient: Abandoning block 
blk_-9011360127839438138_2537
11/11/11 09:23:32 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:32 INFO hdfs.DFSClient: Abandoning block 
blk_5706156511394032513_2537
11/11/11 09:23:33 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:33 INFO hdfs.DFSClient: Abandoning block 
blk_6843621826844247162_2537
11/11/11 09:23:33 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:33 INFO hdfs.DFSClient: Abandoning block 
blk_-3250670865834931033_2537
11/11/11 09:23:33 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:33 INFO hdfs.DFSClient: Abandoning block 
blk_4089487198639501080_2537
11/11/11 09:23:38 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:38 INFO hdfs.DFSClient: Abandoning block 
blk_6648705143580536886_2537
11/11/11 09:23:38 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:38 INFO hdfs.DFSClient: Abandoning block 
blk_-3490582585706689275_2537
11/11/11 09:23:39 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:39 INFO hdfs.DFSClient: Abandoning block 
blk_-7662361325058636428_2537
11/11/11 09:23:39 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:39 INFO hdfs.DFSClient: Abandoning block 
blk_-2571084933259034438_2537
11/11/11 09:23:39 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:39 INFO hdfs.DFSClient: Abandoning block 
blk_-707188914098310327_2537
11/11/11 09:23:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:44 INFO hdfs.DFSClient: Abandoning block 
blk_-3472622299529134100_2537
11/11/11 09:23:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:44 INFO hdfs.DFSClient: Abandoning block 
blk_-4797515008608630273_2537
11/11/11 09:23:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:45 INFO hdfs.DFSClient: Abandoning block 
blk_5923160601439134956_2537
11/11/11 09:23:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:45 INFO hdfs.DFSClient: Abandoning block 
blk_-7571896600592593591_2537
11/11/11 09:23:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream 
java.io.EOFException
11/11/11 09:23:45 INFO hdfs.DFSClient: Abandoning block 
blk_3433459239518319183_2537
11/11/11 09:23:50 WARN hdfs.DFSClient: DataStreamer Exception: 
java.io.IOException: Unable to create new block.
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

11/11/11 09:23:50 WARN hdfs.DFSClient: Error Recovery for block 
blk_-3472622299529134100_2537 bad datanode[0] nodes == null
11/11/11 09:23:50 WARN hdfs.DFSClient: Could not get block locations. Source 
file "/user/root/data/hama-partitions/part-98" - Aborting...
11/11/11 09:23:50 WARN hdfs.DFSClient: DataStreamer Exception: 
java.io.IOException: Unable to create new block.
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

11/11/11 09:23:50 WARN hdfs.DFSClient: Error Recovery for block 
blk_-4797515008608630273_2537 bad datanode[0] nodes == null
11/11/11 09:23:50 WARN hdfs.DFSClient: Could not get block locations. Source 
file "/user/root/data/hama-partitions/part-144" - Aborting...
11/11/11 09:23:51 WARN hdfs.DFSClient: DataStreamer Exception: 
java.io.IOException: Unable to create new block.
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

11/11/11 09:23:51 WARN hdfs.DFSClient: Error Recovery for block 
blk_5923160601439134956_2537 bad datanode[0] nodes == null
11/11/11 09:23:51 WARN hdfs.DFSClient: Could not get block locations. Source 
file "/user/root/data/hama-partitions/part-126" - Aborting...
11/11/11 09:23:51 WARN hdfs.DFSClient: DataStreamer Exception: 
java.io.IOException: Unable to create new block.
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

11/11/11 09:23:51 WARN hdfs.DFSClient: Error Recovery for block 
blk_-7571896600592593591_2537 bad datanode[0] nodes == null
11/11/11 09:23:51 WARN hdfs.DFSClient: Could not get block locations. Source 
file "/user/root/data/hama-partitions/part-151" - Aborting...
11/11/11 09:23:51 WARN hdfs.DFSClient: DataStreamer Exception: 
java.io.IOException: Unable to create new block.
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

11/11/11 09:23:51 WARN hdfs.DFSClient: Error Recovery for block 
blk_3433459239518319183_2537 bad datanode[0] nodes == null
11/11/11 09:23:51 WARN hdfs.DFSClient: Could not get block locations. Source 
file "/user/root/data/hama-partitions/part-135" - Aborting...
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/11/11 09:23:52 ERROR hdfs.DFSClient: Exception closing file 
/user/root/data/hama-partitions/part-126 : java.io.EOFException
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/11/11 09:23:52 ERROR hdfs.DFSClient: Exception closing file 
/user/root/data/hama-partitions/part-135 : java.io.EOFException
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/11/11 09:23:52 ERROR hdfs.DFSClient: Exception closing file 
/user/root/data/hama-partitions/part-144 : java.io.EOFException
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/11/11 09:23:52 ERROR hdfs.DFSClient: Exception closing file 
/user/root/data/hama-partitions/part-151 : java.io.EOFException
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/11/11 09:23:53 ERROR hdfs.DFSClient: Exception closing file 
/user/root/data/hama-partitions/part-98 : java.io.EOFException
java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:250)
        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
        at org.apache.hadoop.io.Text.readString(Text.java:400)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2901)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at 
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
{code}
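
All of the failures above are against the partition files under /user/root/data/hama-partitions, apparently while the example is still partitioning the input. For reference, this is the invocation pattern the usage message above describes; just a sketch of my own command, where the start vertex, output folder, and input path are specific to my cluster:

{code}
# Argument order per the usage string printed by the example:
#   <start vertex> <output path> <adjacency list path> <output enabled: true/false>
# The last flag is optional according to the usage string.
core/bin/hama jar examples/target/hama-examples-0.4.0-incubating-SNAPSHOT.jar \
  sssp Klewno result /user/root/data/adjacencylist.seq2 true
{code}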
                
> SSSP doesn't work with multi-tasks
> ----------------------------------
>
>                 Key: HAMA-443
>                 URL: https://issues.apache.org/jira/browse/HAMA-443
>             Project: Hama
>          Issue Type: Bug
>          Components: examples
>    Affects Versions: 0.3.0
>            Reporter: Edward J. Yoon
>            Assignee: Thomas Jungblut
>             Fix For: 0.4.0
>
>         Attachments: HAMA-443.patch, HAMA-443_1.patch, HAMA-443_v2.patch, 
> HAMA-443_v3.patch, patch.txt
>
>
> {code}
> root@Cnode1:/usr/local/src/hama-trunk# core/bin/hama jar 
> examples/target/hama-examples-0.4.0-incubating-SNAPSHOT.jar sssp Klewno xx 
> /user/root/edward/sssp-adjacencylist.txt
> Single Source Shortest Path Example:
> <Startvertex name> <optional: output path> <optional: path to own adjacency 
> list textfile!>
> Setting default start vertex to "Frankfurt"!
> Setting start vertex to Klewno!
> Using new output folder: xx
> 11/09/26 09:46:24 INFO graph.ShortestPaths: Starting data partitioning...
> 11/09/26 09:47:12 INFO graph.ShortestPaths: Finished!
> 11/09/26 09:47:12 INFO bsp.BSPJobClient: Running job: job_201109260929_0004
> 11/09/26 09:47:15 INFO bsp.BSPJobClient: Current supersteps number: 0
> 11/09/26 09:47:21 INFO bsp.BSPJobClient: Current supersteps number: 1
> 11/09/26 09:47:33 INFO bsp.BSPJobClient: The total number of supersteps: 1
> Job Finished in 21.553 seconds
> -------------------- RESULTS --------------------
> java.lang.NullPointerException
>         at 
> org.apache.hama.examples.graph.ShortestPathsBase.printOutput(ShortestPathsBase.java:93)
>         at 
> org.apache.hama.examples.graph.ShortestPaths.main(ShortestPaths.java:239)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at 
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:37)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hama.util.RunJar.main(RunJar.java:145)
> {code}
