Thanks for the suggestion! That fixed it! My .bash_profile now looks like:

PATH=$PATH:$HOME/bin:/usr/java/jdk1.6.0_06
JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
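For what it's worth, a quick way to sanity-check that JAVA_HOME points at a JDK rather than a JRE (the distinction that caused the original error) is to look for lib/tools.jar, which only the JDK ships:

```shell
# Sanity check: a JDK ships lib/tools.jar (ant needs it); a JRE does not.
if [ -f "$JAVA_HOME/lib/tools.jar" ]; then
    echo "JAVA_HOME points at a JDK"
else
    echo "JAVA_HOME points at a JRE (or is unset): no lib/tools.jar"
fi
```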

Thanks so much for the help!


I do have a couple more questions about getting the word count example to
work... If I copy and paste the code into a file called 'example' or
'example.C', I don't see how the compile and execute commands are even
touching that code. Would someone be willing to explain?

I have a few other questions, but I'm going to read up on the
documentation some more first :-) Thanks!

-SM


On Thu, Jun 26, 2008 at 4:45 AM, Zheng Shao <[EMAIL PROTECTED]> wrote:

> The error message still mentions "/usr/java/jre1.6.0_06/lib/tools.jar".
> You can try pointing PATH at the JDK as well.
>
> Ant shouldn't be looking for files in the jre directory.
>
> By the way, are you on cygwin? Not sure if the hadoop native lib build
> is supported on cygwin or not.
>
>
> Zheng
> -----Original Message-----
> From: Sandy [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, June 25, 2008 2:31 PM
> To: [email protected]
> Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
>
> My apologies. I had thought I had made that change already.
>
> Regardless, I still get the same error:
> $ ant -Dcompile.c++=yes compile-c++-examples
> Unable to locate tools.jar. Expected to find it in
> /usr/java/jre1.6.0_06/lib/tools.jar
> Buildfile: build.xml
>
> init:
>    [touch] Creating /tmp/null265867151
>   [delete] Deleting: /tmp/null265867151
>     [exec] svn: '.' is not a working copy
>     [exec] svn: '.' is not a working copy
>
> check-c++-makefiles:
>
> create-c++-examples-pipes-makefile:
>
> create-c++-pipes-makefile:
>
> create-c++-utils-makefile:
>
> BUILD FAILED
> /home/sjm/Desktop/hadoop-0.16.4/build.xml:947: Execute failed:
> java.io.IOException: Cannot run program
> "/home/sjm/Desktop/hadoop-0.16.4/src/c++/utils/configure" (in directory
> "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/utils"):
> java.io.IOException: error=13, Permission denied
>
> Total time: 1 second
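As an aside, error=13 (Permission denied) on a configure script usually just means the file has lost its execute bit, which can happen when a tarball is unpacked with certain tools. A possible fix, using the paths taken from the error messages in this thread:

```shell
# Restore the execute bit on the configure scripts the build invokes
# (paths copied from the build logs above; adjust if your tree differs)
chmod +x /home/sjm/Desktop/hadoop-0.16.4/src/c++/utils/configure
chmod +x /home/sjm/Desktop/hadoop-0.16.4/src/examples/pipes/configure
```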
>
> My .bash_profile now contains the line
> JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
>
> I then did
> source .bash_profile
> conf/hadoop-env.sh
>
> Is there anything else I need to do to make the changes take effect?
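One general shell point that may matter here (not Hadoop-specific): running a script like conf/hadoop-env.sh directly executes it in a child shell, so any exports it makes vanish when it exits; only sourcing it (with `.` or `source`) affects the current shell. A throwaway demonstration:

```shell
# Stand-in for hadoop-env.sh: a scratch script that exports a variable
echo 'export DEMO_JAVA_HOME=/usr/java/jdk1.6.0_06' > /tmp/demo-env.sh

sh /tmp/demo-env.sh                        # child shell: export is lost
echo "after sh:     '$DEMO_JAVA_HOME'"     # empty in a fresh shell

. /tmp/demo-env.sh                         # sourced: export persists
echo "after source: '$DEMO_JAVA_HOME'"     # prints the path
```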
>
> Thanks again for the assistance.
>
> -SM
>
> On Wed, Jun 25, 2008 at 3:43 PM, lohit <[EMAIL PROTECTED]> wrote:
>
> > Maybe set it to your JDK home? I have set it to my JDK.
> >
> > ----- Original Message ----
> > From: Sandy <[EMAIL PROTECTED]>
> > To: [email protected]
> > Sent: Wednesday, June 25, 2008 12:31:18 PM
> > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> >
> > I am under the impression that it already is. As I posted in my
> > original e-mail, here are the declarations in hadoop-env.sh and my
> > .bash_profile
> >
> > My hadoop-env.sh file looks something like:
> > # Set Hadoop-specific environment variables here.
> >
> > # The only required environment variable is JAVA_HOME.  All others are
> > # optional.  When running a distributed configuration it is best to
> > # set JAVA_HOME in this file, so that it is correctly defined on
> > # remote nodes.
> >
> > # The java implementation to use.  Required.
> > # export JAVA_HOME=$JAVA_HOME
> >
> >
> > and my .bash_profile file has this line in it:
> > JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > export PATH
> >
> >
> > Is there a different way I'm supposed to set the JAVA_HOME environment
> > variable?
> >
> > Much thanks,
> >
> > -SM
> > On Wed, Jun 25, 2008 at 3:22 PM, Zheng Shao <[EMAIL PROTECTED]> wrote:
> >
> > > You need to set JAVA_HOME to your jdk directory (instead of jre).
> > > This is required by ant.
> > >
> > > Zheng
> > > -----Original Message-----
> > > From: Sandy [mailto:[EMAIL PROTECTED]
> > > Sent: Wednesday, June 25, 2008 11:22 AM
> > > To: [email protected]
> > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > >
> > > I'm not sure how this answers my question. Could you be more
> > > specific? I am still getting the above error when I type this
> > > command in. To summarize:
> > >
> > > With my current setup, this occurs:
> > > $ ant -Dcompile.c++=yes compile-c++-examples
> > > Unable to locate tools.jar. Expected to find it in
> > > /usr/java/jre1.6.0_06/lib/tools.jar
> > > Buildfile: build.xml
> > >
> > > init:
> > >    [touch] Creating /tmp/null2044923713
> > >   [delete] Deleting: /tmp/null2044923713
> > >     [exec] svn: '.' is not a working copy
> > >     [exec] svn: '.' is not a working copy
> > >
> > > check-c++-makefiles:
> > >
> > > create-c++-examples-pipes-makefile:
> > >    [mkdir] Created dir:
> > > /home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/pipes
> > >
> > > BUILD FAILED
> > > /home/sjm/Desktop/hadoop-0.16.4/build.xml:987: Execute failed:
> > > java.io.IOException: Cannot run program
> > > "/home/sjm/Desktop/hadoop-0.16.4/src/examples/pipes/configure" (in
> > > directory
> > >
> "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/
> > > pipes"):
> > > java.io.IOException: error=13, Permission denied
> > >
> > > Total time: 1 second
> > >
> > > -----
> > >
> > > If I copy the tools.jar file located in my JDK's lib folder, I get
> > > the error message I printed in the previous message.
> > >
> > > Could someone please tell me or suggest to me what I am doing wrong?
> > >
> > > Thanks,
> > >
> > > -SM
> > >
> > > On Wed, Jun 25, 2008 at 1:53 PM, lohit <[EMAIL PROTECTED]> wrote:
> > >
> > > > ant -Dcompile.c++=yes compile-c++-examples
> > > > I picked it up from build.xml
> > > >
> > > > Thanks,
> > > > Lohit
> > > >
> > > > ----- Original Message ----
> > > > From: Sandy <[EMAIL PROTECTED]>
> > > > To: [email protected]
> > > > Sent: Wednesday, June 25, 2008 10:44:20 AM
> > > > Subject: Compiling Word Count in C++ : Hadoop Pipes
> > > >
> > > > Hi,
> > > >
> > > > I am currently trying to get Hadoop Pipes working. I am following
> > > > the instructions at the hadoop wiki, where it provides code for a C++
> > > > implementation of Word Count (located here:
> > > > http://wiki.apache.org/hadoop/C++WordCount?highlight=%28C%2B%2B%29)
> > > >
> > > > I am having some trouble parsing the instructions. What should the
> > > > file containing the new word count program be called? "examples"?
> > > >
> > > > If I were to call the file "example" and type in the following:
> > > > $ ant -Dcompile.c++=yes example
> > > > Buildfile: build.xml
> > > >
> > > > BUILD FAILED
> > > > Target `example' does not exist in this project.
> > > >
> > > > Total time: 0 seconds
> > > >
> > > >
> > > > If I try and compile with "examples" as stated on the wiki, I get:
> > > > $ ant -Dcompile.c++=yes examples
> > > > Buildfile: build.xml
> > > >
> > > > clover.setup:
> > > >
> > > > clover.info:
> > > >     [echo]
> > > >     [echo]      Clover not found. Code coverage reports disabled.
> > > >     [echo]
> > > >
> > > > clover:
> > > >
> > > > init:
> > > >    [touch] Creating /tmp/null810513231
> > > >   [delete] Deleting: /tmp/null810513231
> > > >     [exec] svn: '.' is not a working copy
> > > >     [exec] svn: '.' is not a working copy
> > > >
> > > > record-parser:
> > > >
> > > > compile-rcc-compiler:
> > > >    [javac] Compiling 29 source files to
> > > > /home/sjm/Desktop/hadoop-0.16.4/build/classes
> > > >
> > > > BUILD FAILED
> > > > /home/sjm/Desktop/hadoop-0.16.4/build.xml:241: Unable to find a
> > > > javac compiler; com.sun.tools.javac.Main is not on the classpath.
> > > > Perhaps JAVA_HOME does not point to the JDK
> > > >
> > > > Total time: 1 second
> > > >
> > > >
> > > >
> > > > I am a bit puzzled by this. Originally I got the error that
> > > > tools.jar was not found, because it was looking for it under
> > > > /usr/java/jre1.6.0_06/lib/tools.jar. There is a tools.jar under
> > > > /usr/java/jdk1.6.0_06/lib/tools.jar. If I copy this file over to
> > > > the jre folder, that message goes away and it's replaced with the
> > > > above message.
> > > >
> > > > My hadoop-env.sh file looks something like:
> > > > # Set Hadoop-specific environment variables here.
> > > >
> > > > # The only required environment variable is JAVA_HOME.  All others are
> > > > # optional.  When running a distributed configuration it is best to
> > > > # set JAVA_HOME in this file, so that it is correctly defined on
> > > > # remote nodes.
> > > >
> > > > # The java implementation to use.  Required.
> > > > # export JAVA_HOME=$JAVA_HOME
> > > >
> > > >
> > > > and my .bash_profile file has this line in it:
> > > > JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > > > export PATH
> > > >
> > > >
> > > > Furthermore, if I go to the command line and type in javac -version,
> > > > I get:
> > > > $ javac -version
> > > > javac 1.6.0_06
> > > >
> > > >
> > > > I also had no problem getting through the hadoop word count map
> > > > reduce tutorial in Java. It was able to find my java compiler fine.
> > > > Could someone please point me in the right direction? Also, since it
> > > > is an sh file, should that export line in hadoop-env.sh really start
> > > > with a hash sign?
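On the hash-sign question: in an sh file, a leading # makes the line a comment, so the shipped `# export JAVA_HOME=...` line does nothing until it is uncommented. A minimal illustration (the variable names are just for the demo):

```shell
# Write two lines to a scratch file: one commented out, one active
printf '%s\n' '# export DEMO_COMMENTED=1' 'export DEMO_ACTIVE=2' > /tmp/demo.sh
. /tmp/demo.sh
echo "commented: '${DEMO_COMMENTED}'"   # empty: the # line was ignored
echo "active:    '${DEMO_ACTIVE}'"      # set to 2
```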
> > > >
> > > > Thank you in advance for your assistance.
> > > >
> > > > -SM
> > > >
> > > >
> > >
> >
> >
>
