[
https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon resolved SPARK-20153.
----------------------------------
Resolution: Incomplete
> Support multiple AWS credentials in order to access multiple Hive on S3 tables
> in a Spark application
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-20153
> URL: https://issues.apache.org/jira/browse/SPARK-20153
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.0.1, 2.1.0
> Reporter: Franck Tago
> Priority: Minor
> Labels: bulk-closed
>
> I need to access multiple Hive tables in my Spark application, where each Hive
> table is
> 1- an external table with data sitting on S3
> 2- owned by a different AWS user, so I need to provide different AWS
> credentials.
> I am familiar with setting the AWS credentials in the Hadoop configuration
> object, but that does not really help because I can only set one global pair of
> (fs.s3a.access.key, fs.s3a.secret.key).
> From my research, there is no easy or elegant way to do this in Spark.
> Why is that?
> How do I address this use case?
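For reference, Hadoop's S3A connector supports per-bucket configuration from
release 2.8 onward (HADOOP-13336), which covers this use case without any Spark
change. A minimal sketch, assuming a Spark build with Hive support and a
hadoop-aws 2.8+ on the classpath; the bucket names, table names, and
environment variables below are hypothetical placeholders:

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("multi-credential-hive-on-s3")
  .enableHiveSupport()
  .getOrCreate()

val hadoopConf = spark.sparkContext.hadoopConfiguration

// Per-bucket credentials override the global fs.s3a.access.key /
// fs.s3a.secret.key pair, so each bucket can be read as a different AWS user.
// bucket-a / bucket-b and the env var names are hypothetical.
hadoopConf.set("fs.s3a.bucket.bucket-a.access.key", sys.env("BUCKET_A_ACCESS_KEY"))
hadoopConf.set("fs.s3a.bucket.bucket-a.secret.key", sys.env("BUCKET_A_SECRET_KEY"))
hadoopConf.set("fs.s3a.bucket.bucket-b.access.key", sys.env("BUCKET_B_ACCESS_KEY"))
hadoopConf.set("fs.s3a.bucket.bucket-b.secret.key", sys.env("BUCKET_B_SECRET_KEY"))

// External Hive tables whose LOCATIONs point at s3a://bucket-a/... and
// s3a://bucket-b/... can then be queried in the same application.
spark.sql("SELECT * FROM table_on_bucket_a LIMIT 10").show()
spark.sql("SELECT * FROM table_on_bucket_b LIMIT 10").show()
{code}

Because the per-bucket keys are resolved by the S3A filesystem itself, no
credential switching is needed at query time even when one SQL statement joins
tables backed by different buckets.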