Hi, I tried it with Java 6 but with no success.
Here are the links to the JobTracker's log and out files with Java 6.

Log file link:
http://pastebin.com/bvWZRt0A

Out file link (this one is a bit different from the Java 7 one):
http://pastebin.com/4YCZhQGh

Also, please keep in mind that I can run Hadoop 0.20 with JAVA_HOME set to
Java 7.

waqas
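
The JRE version reported in the JobTracker's .out file makes the mismatch easy to spot mechanically. A minimal sketch; the version string is the one quoted later in this thread, and the check itself is illustrative, not anything Hadoop ships:

```shell
# Version string as reported in the JobTracker .out file in this thread.
# Hadoop 1.x expects a 1.6 JRE, so a 7.x string is a red flag.
jre_version="7.0_01-b08"
case "$jre_version" in
  1.6*|6.*) echo "supported: $jre_version" ;;
  *)        echo "unsupported: $jre_version" ;;   # -> unsupported: 7.0_01-b08
esac
```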

On Wed, May 16, 2012 at 4:44 PM, Harsh J <ha...@cloudera.com> wrote:

> Hi,
>
> JRE version: 7.0_01-b08
>
> This version (1.7 JDK/JRE) isn't supported by 1.x Hadoop. Please
> switch your JAVA_HOME to a 1.6-based JRE.
>
> What you're running into is a JVM SIGFPE, which I am fairly sure is
> caused by the bad Java version used.
>
> On Wed, May 16, 2012 at 7:50 PM, waqas latif <waqas...@gmail.com> wrote:
> > Oh, my mistake, sorry. I misunderstood and thought the pastes had to be
> > private on the mailing list. I have made them public now. And yes, I
> > configured the right JAVA_HOME. Please have a look at the pastebin links
> > again.
> >
> > On Wed, May 16, 2012 at 4:16 PM, Harsh J <ha...@cloudera.com> wrote:
> >
> >> These paste links are marked private. Since I am not their uploader, I
> >> can't view them. Your response hints at a Java issue? Have you
> >> configured the right JAVA_HOME in hadoop-env.sh?
> >>
> >> On Wed, May 16, 2012 at 7:30 PM, waqas latif <waqas...@gmail.com> wrote:
> >> > Harsh, I have 2 files for the jobtracker, with extensions .log and .out.
> >> > Here are both files. There is an error message in the .out file. Please
> >> > check both of them and see if you can guide me on this.
> >> > Here are the pastebin links for both files. I also want to mention that
> >> > with the same Java setup I can run Hadoop 0.20 successfully, but I can't
> >> > get Hadoop 1.0 configured.
> >> >
> >> > jobtracker log file
> >> > http://pastebin.com/61dqSiYJ
> >> >
> >> > jobtracker out file
> >> > http://pastebin.com/Mf7yrB8p
> >> >
> >> > waqas
> >> >
> >> > On Wed, May 16, 2012 at 3:47 PM, Harsh J <ha...@cloudera.com> wrote:
> >> >
> >> >> Can you paste logs that say "jobtracker" from $HADOOP_HOME/logs/ into
> >> >> pastebin.com and pass back the pasted link? Looks like your MR
> >> >> services aren't starting for some reason and logs will tell you why.
> >> >>
> >> >> On Wed, May 16, 2012 at 7:08 PM, waqas latif <waqas...@gmail.com> wrote:
> >> >> > Hi Harsh,
> >> >> > I ran this, but there is still no JobTracker or TaskTracker in the
> >> >> > jps output. I only have DataNode, NameNode and SecondaryNameNode in
> >> >> > the jps output.
> >> >> >
> >> >> > waqas
> >> >> >
> >> >> > On Wed, May 16, 2012 at 3:28 PM, Harsh J <ha...@cloudera.com> wrote:
> >> >> >
> >> >> >> You have configured a MR cluster, but it isn't up.
> >> >> >>
> >> >> >> Run:
> >> >> >>
> >> >> >> bin/start-mapred.sh
> >> >> >>
> >> >> >> Then check for "JobTracker" and "TaskTracker" in 'jps' output.
> >> >> >>
> >> >> >> Then re-run your example pi job, and it should go through.
> >> >> >>
> >> >> >> On Wed, May 16, 2012 at 6:51 PM, waqas latif <waqas...@gmail.com> wrote:
> >> >> >> > Hi, I am trying to configure Hadoop 1.0 in pseudo-distributed mode.
> >> >> >> >
> >> >> >> > But when I run the pi example given in the Hadoop distribution, I
> >> >> >> > get the error mentioned in the title. Can someone please help me
> >> >> >> > and guide me on how I can fix this problem? If possible, please
> >> >> >> > suggest a solution as well as pinpointing the problem.
> >> >> >> >
> >> >> >> > Here is what I get by running jps:
> >> >> >> >
> >> >> >> > 8322 Jps
> >> >> >> > 7611 SecondaryNameNode
> >> >> >> > 7474 DataNode
> >> >> >> > 7341 NameNode
> >> >> >> >
> >> >> >> > Here is the complete error message:
> >> >> >> >
> >> >> >> > Number of Maps  = 10
> >> >> >> > Samples per Map = 100
> >> >> >> > Wrote input for Map #0
> >> >> >> > Wrote input for Map #1
> >> >> >> > Wrote input for Map #2
> >> >> >> > Wrote input for Map #3
> >> >> >> > Wrote input for Map #4
> >> >> >> > Wrote input for Map #5
> >> >> >> > Wrote input for Map #6
> >> >> >> > Wrote input for Map #7
> >> >> >> > Wrote input for Map #8
> >> >> >> > Wrote input for Map #9
> >> >> >> > Starting Job
> >> >> >> > 12/05/16 13:11:56 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s).
> >> >> >> > 12/05/16 13:11:57 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s).
> >> >> >> > 12/05/16 13:11:58 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s).
> >> >> >> > 12/05/16 13:11:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 3 time(s).
> >> >> >> > 12/05/16 13:12:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 4 time(s).
> >> >> >> > 12/05/16 13:12:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 5 time(s).
> >> >> >> > 12/05/16 13:12:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 6 time(s).
> >> >> >> > 12/05/16 13:12:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 7 time(s).
> >> >> >> > 12/05/16 13:12:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 8 time(s).
> >> >> >> > 12/05/16 13:12:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 9 time(s).
> >> >> >> > java.net.ConnectException: Call to localhost/127.0.0.1:8021 failed on connection exception: java.net.ConnectException: Connection refused
> >> >> >> >        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
> >> >> >> >        at org.apache.hadoop.ipc.Client.call(Client.java:1071)
> >> >> >> >        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
> >> >> >> >        at org.apache.hadoop.mapred.$Proxy2.getProtocolVersion(Unknown Source)
> >> >> >> >        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
> >> >> >> >        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
> >> >> >> >        at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:480)
> >> >> >> >        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:474)
> >> >> >> >        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:457)
> >> >> >> >        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1260)
> >> >> >> >        at org.apache.hadoop.examples.PiEstimator.estimate(PiEstimator.java:297)
> >> >> >> >        at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:342)
> >> >> >> >        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >> >> >> >        at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:351)
> >> >> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >> >> >> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >> >> >        at java.lang.reflect.Method.invoke(Method.java:601)
> >> >> >> >        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >> >> >> >        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >> >> >> >        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
> >> >> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >> >> >> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >> >> >        at java.lang.reflect.Method.invoke(Method.java:601)
> >> >> >> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >> >> >> > Caused by: java.net.ConnectException: Connection refused
> >> >> >> >        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >> >> >> >        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> >> >> >> >        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >> >> >> >        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:656)
> >> >> >> >        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
> >> >> >> >        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
> >> >> >> >        at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
> >> >> >> >        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
> >> >> >> >        at org.apache.hadoop.ipc.Client.call(Client.java:1046)
> >> >> >> >        ... 24 more
> >> >> >>
> >> >> >>
> >> >> >>
> >> >> >> --
> >> >> >> Harsh J
> >> >> >>
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Harsh J
> >> >>
> >>
> >>
> >>
> >> --
> >> Harsh J
> >>
>
>
>
> --
> Harsh J
>
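
For readers hitting the same wall of "Retrying connect ... 8021" messages: they only mean nothing is listening on the JobTracker port, i.e. the MR daemons never came up. After bin/start-mapred.sh, a quick sanity check over the jps output might look like the sketch below; the sample output is the one quoted in this thread, and the loop itself is just an illustration, not a Hadoop tool:

```shell
# Sample `jps` output quoted earlier in this thread (no MR daemons).
jps_output="8322 Jps
7611 SecondaryNameNode
7474 DataNode
7341 NameNode"

# Both daemons below must appear in `jps` before the pi job can connect.
for daemon in JobTracker TaskTracker; do
  if printf '%s\n' "$jps_output" | grep -q "$daemon"; then
    echo "up: $daemon"
  else
    echo "missing: $daemon"   # printed for both daemons with this sample
  fi
done
```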
