Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/5380#issuecomment-90858684
  
    My guess is that this is probably OK -- I have actually been using Commons 
Math 3.4.1 in a project on Hadoop 2.6 without issues, though I recall a subtle 
and nasty incompatibility between 3.2 and 3.3 (which is very unlikely to be 
relevant here). Clearly the intent has been backwards 
compatibility: http://commons.apache.org/proper/commons-math/changes-report.html
    
    Although it's always worth trying to harmonize dependencies, it's not 100% 
possible. If there's a compelling reason to update, and we see that tests pass 
across a few permutations of Hadoop builds, then I'd support it. For example, 
can you try `mvn [opts] -DskipTests clean package; mvn [opts] test` with 
options like `-Phadoop-2.2` and `-Phadoop-2.4 -Dhadoop.version=2.6.0`? If those 
pass, that gives more confidence that there's no issue.
    



