Github user FavioVazquez commented on the pull request:

    https://github.com/apache/spark/pull/5786#issuecomment-101126703
  
    I think @srowen saw this PR as a cleanup of old dependencies and an 
update of Spark's defaults to a currently used Hadoop version. This started 
as a minor fix for inconsistencies in the Hadoop defaults when using the latest 
CDH5 distribution, and grew into an upgrade of the default Hadoop version, 
updated docs, and a cleanup of the YARN POM and main POM. I still face problems 
when building Spark for CDH5 without these changes, and I think it would be 
helpful to update the versions, since Hadoop 1 is really old, and I really 
believe this brings Spark up to date with newer technologies.
    
    I'm no expert in this field, but I think this PR could be interesting and 
useful for a lot of people who are starting out with these technologies and 
would like to build Spark with the newest Hadoop version. I have to remark that 
if you use the current build process and main POM, you'll get errors when 
trying to connect to Cloudera's newest HDFS; you can see that at the beginning 
of the PR. It's really awkward to build Spark with lots of ad hoc and in situ 
dependencies just to keep old versions, but maybe it's just me. 
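    For reference, the kind of build I mean looks something like this (the 
`hadoop.version` value here is just an illustrative CDH5 artifact version; 
substitute the release you actually run, and the profile names may differ 
by Spark version):

    ```shell
    # Build Spark against a CDH5 Hadoop artifact instead of the default
    # Hadoop version. The exact version string below is an example only.
    build/mvn -Pyarn -Phadoop-2.4 \
      -Dhadoop.version=2.5.0-cdh5.3.3 \
      -DskipTests clean package
    ```

    Without the POM cleanup in this PR, a build like that is where the 
dependency inconsistencies show up.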
    
    I really appreciate @srowen's and @vanzin's help with this, and would like 
to know if you think this is the right track for Spark 1.4.0 @pwendell. I'm up 
for making any more changes and updates you think are necessary. I repeat, I 
think this could be a good refresh of Spark's dependencies; I know this is 
really a minor change, but it could grow into an even better update.
    
    Thanks for your comments, I'll wait for your replies. 

