[ https://issues.apache.org/jira/browse/HDDS-615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16661968#comment-16661968 ]

Elek, Marton commented on HDDS-615:
-----------------------------------

This patch doesn't fix the problem for a partial Yetus build, but it does fix 
it for a full build. For a full build we need to specify that ozonefs should 
be built before ozone-dist; without that dependency the build can fail.
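
As a sketch of the kind of change this implies (the actual change is in the 
attached HDDS-615.001.patch; the coordinates below are assumed from the jar 
name in the log, not taken from the patch), the dist pom would declare the 
ozonefs module as a provided-scope dependency:

{code:xml}
<!-- In hadoop-ozone/dist/pom.xml (illustrative): forces hadoop-ozone-filesystem
     to be built before the dist module in the Maven reactor. The provided scope
     keeps it out of the runtime dependency set, because the shaded ozonefs jar
     is copied to share/hadoop/ozonefs separately. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-ozone-filesystem</artifactId>
  <scope>provided</scope>
</dependency>
{code}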


> ozone-dist should depend on hadoop-ozone-file-system
> ----------------------------------------------------
>
>                 Key: HDDS-615
>                 URL: https://issues.apache.org/jira/browse/HDDS-615
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Elek, Marton
>            Assignee: Elek, Marton
>            Priority: Major
>         Attachments: HDDS-615.001.patch
>
>
> In the Yetus build of HDDS-523, the build of the dist project failed:
> {code:java}
> Mon Oct  8 14:16:06 UTC 2018
> cd /testptch/hadoop/hadoop-ozone/dist
> /usr/bin/mvn -Phdds -Dmaven.repo.local=/home/jenkins/yetus-m2/hadoop-trunk-patch-1 -Ptest-patch -DskipTests -fae clean install -DskipTests=true -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dfindbugs.skip=true
> [INFO] Scanning for projects...
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Building Apache Hadoop Ozone Distribution 0.3.0-SNAPSHOT
> [INFO] ------------------------------------------------------------------------
> [INFO] 
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-ozone-dist ---
> [INFO] Deleting /testptch/hadoop/hadoop-ozone/dist (includes = [dependency-reduced-pom.xml], excludes = [])
> [INFO] 
> [INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-ozone-dist ---
> [INFO] Executing tasks
> main:
>     [mkdir] Created dir: /testptch/hadoop/hadoop-ozone/dist/target/test-dir
> [INFO] Executed tasks
> [INFO] 
> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-ozone-dist ---
> [INFO] 
> [INFO] --- exec-maven-plugin:1.3.1:exec (dist) @ hadoop-ozone-dist ---
> cp: cannot stat '/testptch/hadoop/hadoop-ozone/ozonefs/target/hadoop-ozone-filesystem-0.3.0-SNAPSHOT.jar': No such file or directory
> Current directory /testptch/hadoop/hadoop-ozone/dist/target
> $ rm -rf ozone-0.3.0-SNAPSHOT
> $ mkdir ozone-0.3.0-SNAPSHOT
> $ cd ozone-0.3.0-SNAPSHOT
> $ cp -p /testptch/hadoop/LICENSE.txt .
> $ cp -p /testptch/hadoop/NOTICE.txt .
> $ cp -p /testptch/hadoop/README.txt .
> $ mkdir -p ./share/hadoop/mapreduce
> $ mkdir -p ./share/hadoop/ozone
> $ mkdir -p ./share/hadoop/hdds
> $ mkdir -p ./share/hadoop/yarn
> $ mkdir -p ./share/hadoop/hdfs
> $ mkdir -p ./share/hadoop/common
> $ mkdir -p ./share/ozone/web
> $ mkdir -p ./bin
> $ mkdir -p ./sbin
> $ mkdir -p ./etc
> $ mkdir -p ./libexec
> $ cp -r /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/conf etc/hadoop
> $ cp /testptch/hadoop/hadoop-ozone/common/src/main/conf/om-audit-log4j2.properties etc/hadoop
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop bin/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop.cmd bin/
> $ cp /testptch/hadoop/hadoop-ozone/common/src/main/bin/ozone bin/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh libexec/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.cmd libexec/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh libexec/
> $ cp /testptch/hadoop/hadoop-ozone/common/src/main/bin/ozone-config.sh libexec/
> $ cp -r /testptch/hadoop/hadoop-ozone/common/src/main/shellprofile.d libexec/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemons.sh sbin/
> $ cp /testptch/hadoop/hadoop-common-project/hadoop-common/src/main/bin/workers.sh sbin/
> $ cp /testptch/hadoop/hadoop-ozone/common/src/main/bin/start-ozone.sh sbin/
> $ cp /testptch/hadoop/hadoop-ozone/common/src/main/bin/stop-ozone.sh sbin/
> $ mkdir -p ./share/hadoop/ozonefs
> $ cp /testptch/hadoop/hadoop-ozone/ozonefs/target/hadoop-ozone-filesystem-0.3.0-SNAPSHOT.jar ./share/hadoop/ozonefs/hadoop-ozone-filesystem-0.3.0-SNAPSHOT.jar
> Failed!
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 7.832 s
> [INFO] Finished at: 2018-10-08T14:16:16+00:00
> [INFO] Final Memory: 33M/625M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (dist) on project hadoop-ozone-dist: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
> [ERROR] 
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> {code}
> The problem here is that the ozonefs jar file is not built before the dist 
> project. We can fix it by adding an additional dependency (dist should depend 
> on ozone-filesystem) to ensure the right reactor order.
> The dist project has a step which copies all of its dependencies to 
> share/ozone/lib. We don't need to copy ozonefs there, as the shaded ozonefs 
> jar file is copied to a different directory. For this reason we can use the 
> provided scope, since only the runtime dependencies are copied.
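
The copy step itself is not visible in the log above. Purely as an 
illustration of why the provided scope keeps the ozonefs jar out of 
share/ozone/lib (the exact mechanism used by hadoop-ozone-dist may differ, and 
the execution id and output path below are made up for the example), a 
maven-dependency-plugin execution restricted to the runtime scope copies only 
compile- and runtime-scope dependencies and skips provided ones:

{code:xml}
<!-- Hypothetical copy step: includeScope=runtime copies compile+runtime
     dependencies only, so a provided-scope hadoop-ozone-filesystem is skipped. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-ozone-libs</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/ozone-${project.version}/share/ozone/lib</outputDirectory>
        <includeScope>runtime</includeScope>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}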


