sunchao commented on pull request #33276:
URL: https://github.com/apache/spark/pull/33276#issuecomment-877512333


   Yeah, IMHO it looks OK since `hadoop-3.2` is currently configured as the 
implicit default profile unless `hadoop-2.7` is explicitly specified, so it's 
probably good to make this consistent in the sub-modules too (we're already 
doing this in `kubernetes/integration-tests`). Otherwise we might run into 
something similar in the future, e.g., adding a dependency only for Hadoop 3.x 
which then breaks the tests.
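   
   For illustration only, a sub-module profile could look roughly like the 
sketch below; the `activeByDefault` activation and the hard-coded version are 
assumptions for the sake of the example, not necessarily how Spark's root pom 
wires up the default:

```xml
<!-- Rough sketch for a sub-module pom.xml: mirror the root-level Hadoop
     profiles so that test-only deps (like BC) stay consistent. The
     activeByDefault activation and the hard-coded version are assumptions
     for illustration, not Spark's actual configuration. -->
<profiles>
  <profile>
    <id>hadoop-3.2</id>
    <activation>
      <!-- behaves as the default unless another profile (e.g. hadoop-2.7)
           is selected explicitly -->
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>org.bouncycastle</groupId>
        <artifactId>bcprov-jdk15on</artifactId>
        <version>1.60</version>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>hadoop-2.7</id>
    <!-- hadoop-2.7 already pulls in its own (older) BC test jar, so nothing
         extra is added here -->
  </profile>
</profiles>
```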
   
   > it might be simplest just to put bouncy castle on the test CP always.
   
   Since `hadoop-2.7` already has a test dependency on BC, we might end up 
with two BC jars on the test classpath (`bcprov-jdk15on-1.60.jar` and 
`bcprov-jdk15-140.jar`). They seem to contain overlapping classes, so I'm not 
sure it's a good thing to do.
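   
   (As a side note, one way to see where each BC artifact gets pulled in is 
something like `./build/mvn dependency:tree -Dincludes=org.bouncycastle -pl <module>` 
run under each Hadoop profile; the `<module>` part here is just a placeholder, 
not an actual module name.)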

