Hello all
The exception was being thrown because the JDK version I had was an older one.
The minute I updated it, everything worked! So I guess having the right
version of Java matters a lot!
Thanks for your help!
Mithila
On Tue, Nov 25, 2008 at 9:57 AM, Steve Loughran wrote:
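As a postscript to the fix above: a JDK mismatch like this can be caught early with a small version check. The sketch below is not from the thread, and the 1.5 floor is an assumption; confirm the minimum against the release notes for the Hadoop version you run.

```shell
# Sketch (not from the thread): check that the minor part of a Java
# version string meets a required floor. The 1.5 minimum is an
# assumption; confirm it against your Hadoop release notes.
java_ok() {
  minor=$(echo "$1" | cut -d. -f2)   # "1.6.0_07" -> "6"
  [ "$minor" -ge 5 ]
}
java_ok "1.6.0_07" && echo "1.6.0_07 is recent enough"
java_ok "1.4.2"    || echo "1.4.2 is too old"
```

In practice you would feed it the version printed on the first line of `java -version` output.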
Hey Steve
The version is: Linux enpc3740.eas.asu.edu 2.6.9-67.0.20.EL #1 Wed Jun 18
12:23:46 EDT 2008 i686 i686 i386 GNU/Linux. This is what I got when I ran
the command uname -a.
On Tue, Nov 25, 2008 at 1:50 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
Let's start with: what kind of machine is this? Windows or Linux? If
Linux, which one?
Hey Steve
I deleted whatever I needed to... still no luck.
You said that the classpath might be messed up. Is there some way I can
reset it? For the root user? What path do I set it to?
Mithila
On Mon, Nov 24, 2008 at 8:54 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Mithila Nagendra wrote:
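On the classpath question above, Steve's actual reply is truncated in the archive, but a common approach is to clear any inherited CLASSPATH and let bin/hadoop assemble its own from $HADOOP_HOME/lib, passing extra jars through HADOOP_CLASSPATH (typically set in conf/hadoop-env.sh). This is a sketch, and the jar path is illustrative:

```shell
# Sketch: clear a stale CLASSPATH so bin/hadoop can build its own
# from $HADOOP_HOME/lib; extra jars go in HADOOP_CLASSPATH instead.
unset CLASSPATH
export HADOOP_CLASSPATH=/path/to/extra.jar   # illustrative path
echo "CLASSPATH is now: ${CLASSPATH:-<unset>}"
```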
Hadoop is currently built with commons-logging-1.0.4.jar.
Hey Steve
Out of the following, which one do I remove? Just making sure. I got rid
of commons-logging-1.0.4.jar.
commons-logging-api-1.0.4.jar
commons-logging-1.1.1-sources.jar
Thanks!
Mithila
On Mon, Nov 24, 2008 at 6:32 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
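The advice in the thread is that Hadoop ships with commons-logging-1.0.4.jar, so the conflict usually comes from a second version sitting alongside it. A quick way to spot that (a sketch; the temp directory and file names below are created only for illustration, mirroring the list in the message above):

```shell
# Sketch: look for clashing commons-logging versions in a lib
# directory. A temp dir stands in for $HADOOP_HOME/lib here, with
# file names taken from the thread.
lib=$(mktemp -d)
touch "$lib/commons-logging-1.0.4.jar" \
      "$lib/commons-logging-api-1.0.4.jar" \
      "$lib/commons-logging-1.1.1-sources.jar"
# Mixing 1.0.4 and 1.1.1 jars on one classpath is a classic cause of
# NoClassDefFoundError when the daemons start.
ls "$lib" | grep '^commons-logging'
```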
Thanks Steve! Will take a look at it.
Mithila
On Mon, Nov 24, 2008 at 6:32 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
I tried dropping the jar files into the lib. It still doesn't work. The
following is how it looks after the new files were put in:
[EMAIL PROTECTED] hadoop-0.17.2.1]$ cd bin
[EMAIL PROTECTED] bin]$ ls
hadoop  hadoop-daemon.sh  rcc  start-all.sh
start-dfs.sh  stop-all.sh
Download the 1.1.1.tar.gz binaries. This file will have a bunch of JAR
files; drop the JAR files into $HADOOP_HOME/lib and see what happens.
Alex
On Fri, Nov 21, 2008 at 9:19 AM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
Hey Alex
Which file do I download from the Apache Commons website?
Thanks
Mithila
On Fri, Nov 21, 2008 at 8:15 PM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
I tried 0.18.2 as well; it gave me the same exception, so I tried the
lower version. I should check if this works. Thanks!
On Fri, Nov 21, 2008 at 5:06 AM, Alex Loddengaard <[EMAIL PROTECTED]> wrote:
Maybe try downloading the Apache Commons Logging jars (<
http://commons.apache.org/downloads/download_logging.cgi>) and dropping them
into $HADOOP_HOME/lib.
Just curious: if you're starting a new cluster, why have you chosen to use
0.17.* and not 0.18.2? It would be a good idea to use 0.18.2 if possible.
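The download-and-drop steps suggested above could look roughly like the sketch below. The actual download from the Commons Logging page linked above is left as a comment; temp directories stand in for the unpacked tarball and $HADOOP_HOME/lib so the copy step can be tried safely.

```shell
# Sketch of the suggested fix. For real use, first fetch the binary
# tarball from the Commons Logging download page and unpack it:
#   tar xzf commons-logging-1.1.1-bin.tar.gz
# Here, temp dirs simulate the unpacked tarball and $HADOOP_HOME/lib.
unpacked=$(mktemp -d)
hadoop_lib=$(mktemp -d)
touch "$unpacked/commons-logging-1.1.1.jar" \
      "$unpacked/commons-logging-api-1.1.1.jar"
cp "$unpacked"/*.jar "$hadoop_lib/"
ls "$hadoop_lib"
```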
Hey
The version is: Linux enpc3740.eas.asu.edu 2.6.9-67.0.20.EL #1 Wed Jun 18
12:23:46 EDT 2008 i686 i686 i386 GNU/Linux. This is what I got when I ran
the command uname -a (thanks Tom!).
Yes, it is bin/start-all. The following is the exception I got when I tried
to start the daemons:
On Wed, Nov 19, 2008 at 5:31 PM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
If you can get a shell on the machine, try typing "uname -a" to see
what type of UNIX it is.
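For reference (not from the thread), uname's fields can also be pulled out individually, which is handy when pasting system details into a report:

```shell
# Each uname flag prints one field of the "uname -a" line:
uname -s   # kernel name, e.g. Linux
uname -r   # kernel release, e.g. 2.6.9-67.0.20.EL
uname -m   # hardware name, e.g. i686
```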
Oh, is that so? I'm not sure which UNIX it is, since I'm working with a
cluster that is remotely accessed. I will send the error out, but it's got
to do with the Log4JLogger class...
On Wed, Nov 19, 2008 at 10:48 PM, Alex Loddengaard <[EMAIL PROTECTED]>wrote:
None of your emails have had attachments. I think this list might strip
them. Can you copy-paste the error? Though I think the error won't be
useful. I'm pretty confident your issue is with Java. What UNIX are you
using?
Alex
On Wed, Nov 19, 2008 at 11:38 AM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
Hey
My Hadoop version is 0.17.0. Check out the screenshots I've put in.
Mithila
On Wed, Nov 19, 2008 at 9:29 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:
By "UNIX" do you mean FreeBSD? The Hadoop configuration is platform
agnostic, so your issue is probably related to your Java configuration
(classpath, etc).
Alex
On Wed, Nov 19, 2008 at 10:20 AM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
Mithila Nagendra wrote:
Hello
I'm currently a student at Arizona State University, Tempe, Arizona,
pursuing my master's in Computer Science. I'm currently involved in a
research project that makes use of Hadoop to run various MapReduce
functions. Hence I searched the web on what's the best way to
I've attached the screenshots of the exception and hadoop-site.xml.
Thanks!
On Wed, Nov 19, 2008 at 9:12 PM, Mithila Nagendra <[EMAIL PROTECTED]> wrote:
Oops, missed the part where you already tried that.
On Mon, Jun 2, 2008 at 3:23 PM, Michael Di Domenico <[EMAIL PROTECTED]> wrote:
Depending on your Windows version, there is a DOS command called "subst"
which you could use to virtualize a drive letter on your third machine.
On Fri, May 30, 2008 at 4:35 AM, Sridhar Raman <[EMAIL PROTECTED]> wrote:
> Should the installation paths be the same in all the nodes? Most
> documentation...