[ 
https://issues.apache.org/jira/browse/HAMA-431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13098675#comment-13098675
 ] 

Thomas Jungblut edited comment on HAMA-431 at 9/7/11 9:08 AM:
--------------------------------------------------------------

Great, so let's start with some kind of tutorial:

{noformat}
svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23/
{noformat}

Follow the build instructions in "BUILDING.txt".
Most of the time you just need to run:

{noformat}
mvn compile -e -DskipTests
{noformat}

This will retrieve the dependencies. 


If you don't have protobuf on your PATH, the build fails while compiling 
yarn-api. This is caused by the exec plugin, which runs protoc over the 
protobuf files to generate sources.

{noformat}

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2:exec 
(generate-sources) on project hadoop-yarn-api: Command execution failed. 
Process exited with an error: 1(Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on project 
hadoop-yarn-api: Command execution failed.
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)

{noformat}

This pom.xml tries to run an executable called "protoc", so make sure you have 
installed protobuf correctly.
You can download it here: http://code.google.com/p/protobuf/
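
To check whether protoc is already on your PATH, and which version you have, 
you can run (assuming a Unix-like shell):

{noformat}
which protoc
protoc --version
{noformat}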

Follow the steps in INSTALL.txt:

{noformat}
./configure
make
make install
{noformat}
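
"make install" installs into /usr/local by default, so on most systems you 
will need root privileges for that last step:

{noformat}
sudo make install
{noformat}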

Maybe "configure" fails because you don't have g++ installed, so just install 
it via "apt-get install g++" and then start the whole process again.
Be careful what the output of install says. For me it told me that he layed the 
shared objects into "/usr/local/lib". You then have to edit your 
"/etc/ld.so.conf" and add the path of the protobuf shared objects to it. Reload 
with "ldconfig".
Now try running "protoc" in your shell; it should complain about a missing 
input file.
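
With a working installation the output looks roughly like this (the exact 
wording may differ between protobuf versions):

{noformat}
$ protoc
Missing input file.
{noformat}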

Back to YARN: you can just call

{noformat}
mvn clean install -e -rf :hadoop-yarn-api -DskipTests
{noformat}

to resume the build process from the hadoop-yarn-api module ("-rf" stands for 
"resume from").

> MapReduce NG integration
> ------------------------
>
>                 Key: HAMA-431
>                 URL: https://issues.apache.org/jira/browse/HAMA-431
>             Project: Hama
>          Issue Type: New Feature
>            Reporter: Thomas Jungblut
>            Assignee: Thomas Jungblut
>
> We should take a look at how to integrate Hama's BSP Engine to Hadoop's 
> nextGen application platform.
> Can be currently found in the 0.23 branch.
