vinayakumarb commented on a change in pull request #1494: HADOOP-16558. [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes
URL: https://github.com/apache/hadoop/pull/1494#discussion_r326912670
 
 

 ##########
 File path: hadoop-hdfs-project/hadoop-hdfs-client/pom.xml
 ##########
 @@ -131,36 +131,18 @@ https://maven.apache.org/xsd/maven-4.0.0.xsd";>
         </configuration>
       </plugin>
       <plugin>
-        <groupId>org.apache.hadoop</groupId>
-        <artifactId>hadoop-maven-plugins</artifactId>
+        <groupId>org.xolstice.maven.plugins</groupId>
+        <artifactId>protobuf-maven-plugin</artifactId>
         <executions>
           <execution>
-            <id>compile-protoc</id>
-            <goals>
-              <goal>protoc</goal>
-            </goals>
+            <id>src-compile-protoc</id>
             <configuration>
-              <protocVersion>${protobuf.version}</protocVersion>
-              <protocCommand>${protoc.path}</protocCommand>
-              <imports>
-                <param>${basedir}/../../hadoop-common-project/hadoop-common/src/main/proto</param>
-                <param>${basedir}/src/main/proto</param>
-              </imports>
-              <source>
-                <directory>${basedir}/src/main/proto</directory>
-                <includes>
-                  <include>ClientDatanodeProtocol.proto</include>
-                  <include>ClientNamenodeProtocol.proto</include>
-                  <include>acl.proto</include>
-                  <include>xattr.proto</include>
-                  <include>datatransfer.proto</include>
-                  <include>hdfs.proto</include>
-                  <include>encryption.proto</include>
-                  <include>inotify.proto</include>
-                  <include>erasurecoding.proto</include>
-                  <include>ReconfigurationProtocol.proto</include>
-                </includes>
-              </source>
+              <skip>false</skip>
+              <additionalProtoPathElements>
+                <additionalProtoPathElement>
+                  ${basedir}/../../hadoop-common-project/hadoop-common/src/main/proto
 
 Review comment:
   Yes. I agree that keeping the proto files in the jar may be fine for now.
   It would be better to specify explicitly which proto files are required for successful proto generation, so I preferred NOT to include them in the jars.
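   
   For illustration, a minimal sketch of what such an explicit list could look like using protobuf-maven-plugin's `includes` parameter. The proto file names are copied from the removed hadoop-maven-plugins configuration; the `<goals>` binding and surrounding layout are assumptions, since in the PR they may be inherited from pluginManagement:
   
   ```xml
   <!-- Sketch only: restricts generation to an explicit list of proto files
        via protobuf-maven-plugin's <includes> parameter. File names come from
        the removed hadoop-maven-plugins configuration; the <goals> binding
        shown here is an assumption (it may be inherited from pluginManagement). -->
   <plugin>
     <groupId>org.xolstice.maven.plugins</groupId>
     <artifactId>protobuf-maven-plugin</artifactId>
     <executions>
       <execution>
         <id>src-compile-protoc</id>
         <goals>
           <goal>compile</goal>
         </goals>
         <configuration>
           <skip>false</skip>
           <includes>
             <include>ClientDatanodeProtocol.proto</include>
             <include>ClientNamenodeProtocol.proto</include>
             <include>acl.proto</include>
             <include>xattr.proto</include>
             <include>datatransfer.proto</include>
             <include>hdfs.proto</include>
             <include>encryption.proto</include>
             <include>inotify.proto</include>
             <include>erasurecoding.proto</include>
             <include>ReconfigurationProtocol.proto</include>
           </includes>
           <additionalProtoPathElements>
             <additionalProtoPathElement>
               ${basedir}/../../hadoop-common-project/hadoop-common/src/main/proto
             </additionalProtoPathElement>
           </additionalProtoPathElements>
         </configuration>
       </execution>
     </executions>
   </plugin>
   ```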
   
   > As in the old way, you need make sure that we also depend on hadoop-common in the dependencies section, otherwise the protoc generating is fine but there will be compile error...
   
   hadoop-common in the dependencies section is required anyway, irrespective of whether the proto files are part of the jar or not.
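   
   For completeness, that dependency is the usual declaration in the module's `<dependencies>` section. A hedged sketch (the `provided` scope is an assumption, not copied from the PR; check hadoop-hdfs-client's actual pom):
   
   ```xml
   <!-- Assumption: hadoop-common must be on the compile classpath so that
        common types referenced by the generated protobuf classes resolve.
        The scope shown is illustrative only. -->
   <dependency>
     <groupId>org.apache.hadoop</groupId>
     <artifactId>hadoop-common</artifactId>
     <scope>provided</scope>
   </dependency>
   ```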

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
