[ 
https://issues.apache.org/jira/browse/SPARK-7481?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15174438#comment-15174438
 ] 

Nicholas Chammas commented on SPARK-7481:
-----------------------------------------

Many people seem to be downgrading to the Spark build for Hadoop 2.4 
because the Spark / Hadoop 2.6 package doesn't work with S3 out of the box.

* [Example 1|https://issues.apache.org/jira/browse/SPARK-7442?focusedCommentId=14582965&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14582965]
* [Example 2|https://issues.apache.org/jira/browse/SPARK-7442?focusedCommentId=14903750&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14903750]
* [Example 3|https://github.com/nchammas/flintrock/issues/88#issuecomment-190905262]
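
The friction in those reports generally looks like the following sketch. The exact version and Maven coordinates here are illustrative assumptions, not taken from the issue:

```shell
# Sketch only; version numbers are assumptions for illustration.
#
# With the Hadoop 2.6 package, reading s3n:// or s3a:// paths fails with a
# missing-filesystem-class error unless the connector jar is added manually:
spark-shell --packages org.apache.hadoop:hadoop-aws:2.6.0

# The Hadoop 2.4 package still bundles the s3n support inside hadoop-common,
# so downgrading avoids the extra step -- hence the behavior described above.
```
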

If this proposal eliminates that bit of friction for users without being too 
burdensome on the team, then I'm for it.

Ideally, we want people using Spark built against the latest version of Hadoop 
anyway, right? This proposal would nudge people in that direction.

> Add Hadoop 2.6+ profile to pull in object store FS accessors
> ------------------------------------------------------------
>
>                 Key: SPARK-7481
>                 URL: https://issues.apache.org/jira/browse/SPARK-7481
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 1.3.1
>            Reporter: Steve Loughran
>
> To keep the s3n classpath right, and to add s3a, swift & azure support, the 
> dependencies of Spark in a 2.6+ profile need to include the relevant object 
> store packages (hadoop-aws, hadoop-openstack, hadoop-azure).
> This adds more stuff to the client bundle, but will mean a single Spark 
> package can talk to all of the stores.
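
The quoted proposal amounts to a Maven profile roughly along these lines. This is a sketch only: the profile id, property name, and module placement are assumptions, and hadoop-azure may require a later Hadoop release than the others:

```xml
<!-- Illustrative sketch of the proposed profile, not the actual Spark POM. -->
<profile>
  <id>hadoop-2.6</id>
  <properties>
    <hadoop.version>2.6.0</hadoop.version>
  </properties>
  <dependencies>
    <!-- Object store connectors were split out of hadoop-common in 2.6+ -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-aws</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-openstack</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-azure</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>
</profile>
```
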



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
