Hello, after changing the domain name, it still fails.
I just realised something: when I run:
camelot01% bin/hadoop jar hadoop-0.12.3-test.jar testrpc
it fails and returns:
07/05/23 20:56:25 INFO ipc.Server: IPC Server listener on 1234: starting
07/05/23 20:56:25 INFO ipc.Server: IPC Server handler 0 on 1234: starting
07/05/23 20:56:25 INFO ipc.Server: IPC Server handler 0 on 1234 call error: java.io.IOException: bobo
java.io.IOException: bobo
at org.apache.hadoop.ipc.TestRPC$TestImpl.error(TestRPC.java:90)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:336)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:559)
07/05/23 20:56:25 INFO ipc.Server: Stopping server on 1234
07/05/23 20:56:25 INFO ipc.Server: Stopping IPC Server listener on 1234
What is this "bobo" exception, and how do I fix it?
Thank you for replying!
yu-yang
Michael Bieniosek wrote:
Try using fully-qualified domain names in your xml file.
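For example, a sketch of what that might look like (camelot01.example.com is a hypothetical FQDN here; substitute your machine's actual fully-qualified name):

```xml
<!-- Sketch only: replace camelot01.example.com with the real FQDN. -->
<property>
  <name>fs.default.name</name>
  <value>camelot01.example.com:9000</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>camelot01.example.com:9001</value>
</property>
```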
On 5/23/07 11:22 AM, "yu-yang chen" <[EMAIL PROTECTED]> wrote:
I know this question has popped up many times, but I couldn't find an
answer that fits my case...
Hello everyone, I just started using Hadoop and have run into a problem
setting up a single-node system:
This is what's inside my hadoop-site.xml:
------------------------------------------------------------------------------------
camelot01% cat hadoop-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
------------------------------------------------------------------------------------
After:
bin/hadoop namenode -format
Re-format filesystem in /tmp/hadoop-yyc04/dfs/name ? (Y or N) Y
Formatted /tmp/hadoop-yyc04/dfs/name
camelot01% bin/start-all.sh   <-- works fine; passwordless ssh also works
And this is the error I get:
camelot01% bin/hadoop jar hadoop-0.12.3-examples.jar pi 10 20
Number of Maps = 10 Samples per Map = 20
07/05/23 19:14:11 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 1 time(s).
07/05/23 19:14:12 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 2 time(s).
07/05/23 19:14:13 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 3 time(s).
07/05/23 19:14:14 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 4 time(s).
07/05/23 19:14:15 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 5 time(s).
07/05/23 19:14:16 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 6 time(s).
07/05/23 19:14:17 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 7 time(s).
07/05/23 19:14:18 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 8 time(s).
07/05/23 19:14:19 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 9 time(s).
07/05/23 19:14:20 INFO ipc.Client: Retrying connect to server: localhost/146.169.2.131:9000. Already tried 10 time(s).
java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:364)
at java.net.Socket.connect(Socket.java:507)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:149)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:529)
at org.apache.hadoop.ipc.Client.call(Client.java:458)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:163)
at org.apache.hadoop.dfs.$Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:247)
at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:105)
at org.apache.hadoop.dfs.DistributedFileSystem$RawDistributedFileSystem.initialize(DistributedFileSystem.java:67)
at org.apache.hadoop.fs.FilterFileSystem.initialize(FilterFileSystem.java:57)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:160)
at org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:119)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:91)
at org.apache.hadoop.examples.PiEstimator.launch(PiEstimator.java:169)
at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:226)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:143)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
Can anyone please tell me why? I did not change any files other than
hadoop-env.sh (to set JAVA_HOME) and hadoop-site.xml.
Please help... thank you so much!
yu-yang