I know at one time maven 3.x didn't work, so I've been using maven 2.x.

Well, I've never tried using java6 for java5.home, but I would think it
wouldn't work. I thought it was forrest that required java5. I would
suggest using java5.
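
For example, something like this in build.properties (path just
illustrative; any 1.5 JDK should work, since forrest is what needs it):

java5.home=/path/to/jdk1.5.0_22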

Tom


On 6/16/11 12:24 PM, "Praveen Sripati" <[email protected]> wrote:

> 
> Tom,
> 
>>> Note, it looks like your java5.home is pointing to java6?
> I have java6 on my laptop and pointed the java5.home variable to java6. The
> hadoop doc says "Java 1.6.x - preferable from Sun". Is this the problem?
> 
>>> What version of protobufs are you using?
> I have protobuf 2.4.1.
> 
>>> What about mvn version?
> Apache Maven 3.0.3 (r1075438; 2011-02-28 23:01:09+0530)
> 
>>> So you had both common and hdfs built before doing mapreduce and
> common built before building hdfs? Or was common failing with the error
> you mention below? If you haven't already, you might simply try
> veryclean on everything and go again in order.
> I tried common first and there were some errors related to fop, but the
> common jars were created, so I started with hdfs and it was successful.
> Then I started the yarn build, which led to the
> java_generate_equals_and_hash error.
> 
> Thanks,
> Praveen
> 
> 
> On Thursday 16 June 2011 09:54 PM, Thomas Graves wrote:
>> Note, it looks like your java5.home is pointing to java6?
>> 
>> I've never seen this particular error. The java_generate_equals_and_hash
>> option seems to have been added in protobuf 2.4.0. What version of protobufs
>> are you using? The instructions say to use at least 2.4.0a; I'm using 2.4.1
>> right now.
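>> 
>> A quick way to confirm which protoc the build is actually picking up
>> (just a sanity check):
>> 
>> protoc --version
>> 
>> Judging from the error message, yarn_protos.proto has a line near the
>> top like the following, which only a 2.4.0+ protoc understands:
>> 
>> option java_generate_equals_and_hash = true;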
>> 
>> You need to define the following (I use a build.properties file). These are
>> the versions I'm currently using. All of these are just downloaded from the
>> corresponding websites. Some links to those can be found here:
>> http://yahoo.github.com/hadoop-common/installing.html
>> 
>> java5.home=/home/tgraves/hadoop/jdk1.5.0_22/
>> forrest.home=/home/tgraves/hadoop/apache-forrest-0.8
>> ant.home=/home/tgraves/hadoop/apache-ant-1.8.2
>> xercescroot=/home/tgraves/hadoop/xerces-c-src_2_8_0
>> eclipse.home=/home/tgraves/hadoop/eclipse
>> findbugs.home=/home/tgraves/hadoop/findbugs-1.3.9
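>> 
>> Equivalently, the same values can be passed straight on the ant command
>> line instead of a build.properties file (standard ant -D properties), e.g.:
>> 
>> ant -Djava5.home=/home/tgraves/hadoop/jdk1.5.0_22 \
>>     -Dforrest.home=/home/tgraves/hadoop/apache-forrest-0.8 \
>>     veryclean mvn-install tar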
>> 
>> I thought this was the same as for trunk, but perhaps I'm mistaken.
>> 
>> What about mvn version?
>> /home/y/libexec/maven/bin/mvn --version
>> Apache Maven 2.2.1 (r801777; 2009-08-06 19:16:01+0000)
>> 
>> So you had both common and hdfs built before doing mapreduce and common
>> built before building hdfs? Or was common failing with the error you mention
>> below? If you haven't already, you might simply try veryclean on everything
>> and go again in order.
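>> 
>> In other words, roughly this order (directory names assumed from the
>> usual layout; reuse the -D properties or build.properties from above):
>> 
>> cd common       && ant veryclean mvn-install
>> cd ../hdfs      && ant veryclean mvn-install
>> cd ../mapreduce && mvn clean install assembly:assembly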
>> 
>> Tom
>> 
>> 
>> On 6/16/11 8:10 AM, "Praveen Sripati" <[email protected]> wrote:
>> 
>>> Hi,
>>> 
>>> The hdfs build was successful after including the -Dforrest.home
>>> property in the ant command.
>>> 
>>> ***********
>>> 
>>> When I started the mapreduce build, I got the below error.
>>> 
>>> mvn clean install assembly:assembly
>>> 
>>> Downloaded:
>>> http://repo1.maven.org/maven2/org/apache/commons/commons-exec/1.0.1/commons-exec-1.0.1.jar
>>> (49 KB at 24.4 KB/sec)
>>> yarn_protos.proto:4:8: Option "java_generate_equals_and_hash" unknown.
>>> [INFO]
>>> 
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Skipping hadoop-mapreduce
>>> [INFO] This project has been banned from the build due to previous failures.
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Reactor Summary:
>>> [INFO]
>>> [INFO] yarn-api .......................................... FAILURE
>>> [13:23.081s]
>>> [INFO] yarn-common ....................................... SKIPPED
>>> [INFO] yarn-server-common ................................ SKIPPED
>>> [INFO] yarn-server-nodemanager ........................... SKIPPED
>>> [INFO] yarn-server-resourcemanager ....................... SKIPPED
>>> [INFO] yarn-server-tests ................................. SKIPPED
>>> [INFO] yarn-server ....................................... SKIPPED
>>> [INFO] yarn .............................................. SKIPPED
>>> [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
>>> [INFO] hadoop-mapreduce-client-common .................... SKIPPED
>>> [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
>>> [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
>>> [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
>>> [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
>>> [INFO] hadoop-mapreduce-client ........................... SKIPPED
>>> [INFO] hadoop-mapreduce .................................. SKIPPED
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Total time: 13:45.437s
>>> [INFO] Finished at: Thu Jun 16 18:30:48 IST 2011
>>> [INFO] Final Memory: 6M/15M
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal
>>> org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on
>>> project yarn-api: Command execution failed. Process exited with an
>>> error: 1(Exit value: 1) ->  [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>> 
>>> ***********
>>> 
>>> I started building common and had to include the -Djava5.home and
>>> -Dforrest.home properties in the ant command.
>>> 
>>> ant -Djava5.home=/usr/lib/jvm/java-6-openjdk
>>> -Dforrest.home=/home/praveensripati/Installations/apache-forrest-0.9
>>> veryclean mvn-install tar
>>> 
>>> And then I get the below error and the build hangs, but I see 4 jars in
>>> the build folder, including hadoop-common-0.22.0-SNAPSHOT.jar.
>>> 
>>>        [exec] Cocoon will report the status of each document:
>>>        [exec]   - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
>>>        [exec]
>>>        [exec]
>>> ------------------------------------------------------------------------
>>>        [exec] cocoon 2.1.12-dev
>>>        [exec] Copyright (c) 1999-2007 Apache Software Foundation. All
>>> rights reserved.
>>>        [exec]
>>> ------------------------------------------------------------------------
>>>        [exec]
>>>        [exec]
>>>        [exec] * [1/29]    [29/29]   6.547s 9.4Kb   linkmap.html
>>>        [exec] * [2/29]    [1/28]    1.851s 22.3Kb  hdfs_shell.html
>>>        [exec] * [4/28]    [1/28]    1.156s 21.1Kb  distcp.html
>>>        [exec] * [5/27]    [0/0]     0.306s 0b      distcp.pdf
>>>        [exec] Exception in thread "main" java.lang.NoClassDefFoundError:
>>> org/apache/fop/messaging/MessageHandler
>>>        [exec]  at
>>> org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
>>> 
>>> 
>>> I included the following in the common/ivy.xml and the
>>> ./common/build/ivy/lib/Hadoop-Common/common/fop-0.93.jar file is there.
>>> 
>>>       <dependency org="org.apache.xmlgraphics"
>>>         name="fop"
>>>         rev="${fop.version}"
>>>         conf="common->default"/>
>>> 
>>> and the following in common/ivy/libraries.properties, and I still get
>>> the same error.
>>> 
>>> fop.version=0.93
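>>> 
>>> One way to check whether the missing class is actually inside that jar
>>> (a quick sanity check, assuming unzip is available):
>>> 
>>> unzip -l ./common/build/ivy/lib/Hadoop-Common/common/fop-0.93.jar | grep MessageHandler
>>> 
>>> My guess is that org.apache.fop.messaging.MessageHandler only existed
>>> in the older FOP 0.20.x API and Cocoon's FOPSerializer was built
>>> against that, so fop-0.93.jar may not provide it even when it is on
>>> the classpath.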
>>> 
>>> Thanks,
>>> Praveen
>>> 
>>> On Thursday 16 June 2011 07:55 AM, Luke Lu wrote:
>>>> On Wed, Jun 15, 2011 at 6:45 PM, Praveen Sripati
>>>> <[email protected]>   wrote:
>>>>> Do I need the avro-maven-plugin? When I ran the below command got the
>>>>> error that the pom file was not found. Where do I get the jar and the
>>>>> pom files for the avro-maven-plugin? I was able to get the source code
>>>>> for them, but not the binaries.
>>>>> 
>>>>> mvn install:install-file
>>>>> -Dfile=./avro-maven-plugin/avro-maven-plugin-1.4.0-SNAPSHOT.jar
>>>>> -DpomFile=./avro-maven-plugin/avro-maven-plugin-1.4.0-SNAPSHOT.pom
>>>> No, you no longer need to install avro-maven-plugin manually. It's
>>>> automatically installed via maven, as we switched to avro 1.5.1.
>>>> We'll fix the instructions.
>>>> 
>>>> __Luke
