Apache9 commented on a change in pull request #1494: HADOOP-16558. 
[COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes
URL: https://github.com/apache/hadoop/pull/1494#discussion_r326881543
 
 

 ##########
 File path: hadoop-hdfs-project/hadoop-hdfs-client/pom.xml
 ##########
 @@ -131,36 +131,18 @@ https://maven.apache.org/xsd/maven-4.0.0.xsd";>
         </configuration>
       </plugin>
       <plugin>
-        <groupId>org.apache.hadoop</groupId>
-        <artifactId>hadoop-maven-plugins</artifactId>
+        <groupId>org.xolstice.maven.plugins</groupId>
+        <artifactId>protobuf-maven-plugin</artifactId>
         <executions>
           <execution>
-            <id>compile-protoc</id>
-            <goals>
-              <goal>protoc</goal>
-            </goals>
+            <id>src-compile-protoc</id>
             <configuration>
-              <protocVersion>${protobuf.version}</protocVersion>
-              <protocCommand>${protoc.path}</protocCommand>
-              <imports>
-                <param>${basedir}/../../hadoop-common-project/hadoop-common/src/main/proto</param>
-                <param>${basedir}/src/main/proto</param>
-              </imports>
-              <source>
-                <directory>${basedir}/src/main/proto</directory>
-                <includes>
-                  <include>ClientDatanodeProtocol.proto</include>
-                  <include>ClientNamenodeProtocol.proto</include>
-                  <include>acl.proto</include>
-                  <include>xattr.proto</include>
-                  <include>datatransfer.proto</include>
-                  <include>hdfs.proto</include>
-                  <include>encryption.proto</include>
-                  <include>inotify.proto</include>
-                  <include>erasurecoding.proto</include>
-                  <include>ReconfigurationProtocol.proto</include>
-                </includes>
-              </source>
+              <skip>false</skip>
+              <additionalProtoPathElements>
+                <additionalProtoPathElement>
+                  ${basedir}/../../hadoop-common-project/hadoop-common/src/main/proto
 
 Review comment:
  If we also attach the proto files to the jar, we do not need this 
configuration. That may change the behavior slightly, but I think it is fine. 
Of course, keeping the old way is also fine; it is not a big problem, I just 
want to make the pom simpler. Note that with the old way you must make sure we 
also depend on hadoop-common in the dependencies section, otherwise the protoc 
generation succeeds but compilation will fail...
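
  For illustration, the suggestion could be sketched roughly like this (a 
hypothetical, untested fragment; the exact element placement and whether the 
protobuf-maven-plugin picks up .proto files from dependency jars out of the 
box are assumptions, not verified against this PR):

```xml
<!-- Hypothetical sketch only: not taken from the actual PR. -->

<!-- In hadoop-common/pom.xml: bundle the .proto files into the jar so that
     downstream modules no longer need filesystem-relative import paths. -->
<build>
  <resources>
    <resource>
      <!-- Assumed source layout for the proto files. -->
      <directory>${basedir}/src/main/proto</directory>
    </resource>
  </resources>
</build>

<!-- In hadoop-hdfs-client/pom.xml: with the protos shipped in the jar, a
     plain dependency on hadoop-common would replace the
     additionalProtoPathElements configuration shown in the diff above. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
  </dependency>
</dependencies>
```

  Either way, the hadoop-common dependency must be declared, which is the 
compile-error pitfall mentioned above.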

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
