sunchao commented on pull request #29843:
URL: https://github.com/apache/spark/pull/29843#issuecomment-699068276


   Thanks @srowen , I'm not sure how that will be useful though. The root issue 
here is not CI but making Hadoop 3.2 the default Maven profile. Previously, 
this was achieved simply by setting `hadoop.version` to 3.2.0, which works fine 
since both the Hadoop 2.7 and 3.2 profiles share the same set of dependencies.
   
   Now with this PR, we'd have to use different sets of dependencies for 2.7 
and 3.2. With a property-based approach like the following:
   
   ```xml
       <profile>
         <id>hadoop-2.7</id>
         <dependencies>
           ..
         </dependencies>
       </profile>
       <profile>
         <id>hadoop-3.2</id>
         <!-- Default hadoop profile. Uses global properties. -->
         <activation>
           <property><name>!hadoop-2.7</name></property>
         </activation>
         <dependencies>
            ..
         </dependencies>
       </profile>
   ```
   we'll be able to make 3.2 the default profile (i.e., it will be activated 
even when people don't pass the `-Phadoop-3.2` flag). However, to compile against 2.7, 
people would have to change the Maven command to:
   ```shell
   build/mvn -Phadoop-2.7 -Dhadoop-2.7 ...
   ```
   instead of today's 
   ```shell
   build/mvn -Phadoop-2.7 ...
   ```
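
   As a sanity check on the activation logic (a generic sketch, not tied to this PR's exact pom), Maven's `help:active-profiles` goal shows which profiles each invocation ends up with: `-P` only enables a profile by id, while the `!hadoop-2.7` condition is driven by whether the `-D` property is defined.
   ```shell
   # No flags: the property `hadoop-2.7` is undefined, so the `!hadoop-2.7`
   # activation fires and hadoop-3.2 should be listed as active.
   build/mvn help:active-profiles

   # -Phadoop-2.7 alone enables that profile but does not define the property,
   # so hadoop-3.2 would still be active; the extra -Dhadoop-2.7 is what
   # turns it off.
   build/mvn -Phadoop-2.7 -Dhadoop-2.7 help:active-profiles
   ```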

