[ https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15955956#comment-15955956 ]
Steve Loughran commented on SPARK-20153:
----------------------------------------
I'm glad we are in agreement about not using secrets in URLs.
I'm afraid, then, that there's not much that can be done without upgrading to
Hadoop 2.8.x JARs. You'll get a lot of other S3A speedups too, so the upgrade
is worth it for S3 IO performance as well as security.
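For reference, the Hadoop 2.8.x feature in question is per-bucket S3A
configuration: any fs.s3a.* option can be overridden for a single bucket via
fs.s3a.bucket.<bucketname>.*, including the credentials. A minimal sketch in
Scala (bucket names and key values below are placeholders, not from this
issue):
{code}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("multi-bucket-s3a")
  .getOrCreate()

val hadoopConf = spark.sparkContext.hadoopConfiguration

// Credentials for the first owner's bucket (Hadoop 2.8+ only).
hadoopConf.set("fs.s3a.bucket.owner-a-bucket.access.key", "ACCESS_KEY_A")
hadoopConf.set("fs.s3a.bucket.owner-a-bucket.secret.key", "SECRET_KEY_A")

// Credentials for the second owner's bucket.
hadoopConf.set("fs.s3a.bucket.owner-b-bucket.access.key", "ACCESS_KEY_B")
hadoopConf.set("fs.s3a.bucket.owner-b-bucket.secret.key", "SECRET_KEY_B")

// Each s3a:// URI now picks up the credentials for its own bucket;
// anything not overridden falls back to the global fs.s3a.* settings.
val dfA = spark.read.parquet("s3a://owner-a-bucket/warehouse/table_a")
val dfB = spark.read.parquet("s3a://owner-b-bucket/warehouse/table_b")
{code}
The same properties can equally be set in core-site.xml or via spark.hadoop.*
entries in spark-defaults.conf; setting them on the shared Hadoop
Configuration is just the most direct equivalent of what the reporter
describes below.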
> Support multiple AWS credentials in order to access multiple Hive on S3 tables
> in a Spark application
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-20153
> URL: https://issues.apache.org/jira/browse/SPARK-20153
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.0.1, 2.1.0
> Reporter: Franck Tago
> Priority: Minor
>
> I need to access multiple Hive tables in my Spark application, where each
> Hive table is
> 1- an external table with its data sitting on S3
> 2- owned by a different AWS user, so I need to provide different AWS
> credentials for each.
> I am familiar with setting the AWS credentials in the Hadoop configuration
> object, but that does not really help because I can only set one global pair
> (fs.s3a.access.key, fs.s3a.secret.key).
> From my research, there is no easy or elegant way to do this in Spark.
> Why is that? How do I address this use case?
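To illustrate the limitation the reporter describes: before per-bucket
configuration existed, S3A read exactly one global credential pair from the
Hadoop Configuration, so two buckets owned by different AWS users could not
both be given working keys. A sketch under that assumption (bucket names are
placeholders):
{code}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("single-pair-s3a").getOrCreate()
val hadoopConf = spark.sparkContext.hadoopConfiguration

// Pre-2.8 S3A: one global pair applies to every s3a:// URI in the job.
hadoopConf.set("fs.s3a.access.key", "ACCESS_KEY_A")
hadoopConf.set("fs.s3a.secret.key", "SECRET_KEY_A")

// Both reads see the same credentials -- there is no per-bucket override,
// so the second read fails unless owner A's keys also happen to grant
// access to owner B's bucket.
val dfA = spark.read.parquet("s3a://owner-a-bucket/table")
val dfB = spark.read.parquet("s3a://owner-b-bucket/table")
{code}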