Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/3938#issuecomment-69106242
  
    Well, either choice (forcing it down to 1.8.8 or up to 1.9.13) is dangerous 
if the library doesn't have strict compatibility guarantees. If it's reasonably 
compatible (that is, if newer versions only add new APIs and don't change or 
remove any old ones), forcing the version up would be best.
    
    So I guess forcing the version up is fine, as long as the hadoop-2.4 libraries 
work with it. I haven't tested that case, though, so that's my only concern here.
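
    For illustration only, here is a minimal sketch of what pinning Jackson to the 
higher version could look like in an sbt build (sbt is an assumption for the 
example; the actual change in this PR may live in the Maven poms instead). The 
group and artifact IDs are the standard Jackson 1.x coordinates:

        // build.sbt sketch: force the Jackson 1.x artifacts to a single version so
        // transitive dependencies (e.g. those pulled in by the hadoop-2.4 profile)
        // resolve to 1.9.13 instead of a mix of 1.8.8 and 1.9.13.
        dependencyOverrides ++= Set(
          "org.codehaus.jackson" % "jackson-core-asl"   % "1.9.13",
          "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
        )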

