[ https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15955695#comment-15955695 ]
Franck Tago commented on SPARK-20153:
-------------------------------------
Oh, I would definitely not consider including the key in the URL; it is a
gigantic security hole in my opinion. Moreover, consider that I am dealing with
Hive on S3, where the URI is part of the table metadata. How would that work in
this case?
Is there a way to encode the accessId and secret key before calling?
Does Spark provide any way of masking or hiding the accessId and secretKey?
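On the masking question, one hedged sketch: Hadoop's credential provider
framework lets S3A resolve fs.s3a.access.key / fs.s3a.secret.key from a
keystore instead of plain-text configuration or the URI. This assumes a JCEKS
keystore was created beforehand with the hadoop credential CLI; the keystore
path and table name below are placeholders, not real deployment values:
{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("s3a-credential-provider-sketch")
  .enableHiveSupport()
  .getOrCreate()

val hadoopConf = spark.sparkContext.hadoopConfiguration

// Point S3A at a keystore created earlier with:
//   hadoop credential create fs.s3a.access.key -provider jceks://...
//   hadoop credential create fs.s3a.secret.key -provider jceks://...
// The keys are then resolved from the keystore, so no secret appears in the
// table URI or in the configuration files.
hadoopConf.set("hadoop.security.credential.provider.path",
  "jceks://hdfs@namenode:9001/user/etl/s3.jceks")

// The Hive-on-S3 table can now be read without embedding any secret.
spark.sql("SELECT * FROM my_external_s3_table LIMIT 10").show()
{code}
Note this only hides the single pair of keys; it does not by itself give you
different credentials per table.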
> Support multiple AWS credentials in order to access multiple Hive on S3 tables in a Spark application
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-20153
> URL: https://issues.apache.org/jira/browse/SPARK-20153
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.0.1, 2.1.0
> Reporter: Franck Tago
> Priority: Minor
>
> I need to access multiple Hive tables in my Spark application, where each
> Hive table is
> 1- an external table with data sitting on S3
> 2- owned by a different AWS user, so I need to provide different AWS
> credentials.
> I am familiar with setting the AWS credentials in the Hadoop configuration
> object, but that does not really help me because I can only set one pair of
> (fs.s3a.access.key, fs.s3a.secret.key).
> From my research, there is no easy or elegant way to do this in Spark.
> Why is that?
> How do I address this use case?
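A minimal sketch of the single-pair limitation described above, together with
S3A's per-bucket configuration overrides, which are one way to attach a
different key pair to each bucket. This assumes a Hadoop 2.8+ hadoop-aws on
the classpath (the Hadoop bundled with the affected Spark versions may be
older); bucket names and key values are placeholders:
{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("per-bucket-s3a-credentials-sketch")
  .enableHiveSupport()
  .getOrCreate()

val hadoopConf = spark.sparkContext.hadoopConfiguration

// Global pair -- applies to every s3a:// URI, which is exactly the
// one-pair-only limitation the description complains about.
hadoopConf.set("fs.s3a.access.key", "GLOBAL_ACCESS_KEY")
hadoopConf.set("fs.s3a.secret.key", "GLOBAL_SECRET_KEY")

// Per-bucket overrides -- each bucket gets its own pair, so two external
// Hive tables backed by different buckets can belong to different AWS users.
hadoopConf.set("fs.s3a.bucket.team-a-data.access.key", "TEAM_A_ACCESS_KEY")
hadoopConf.set("fs.s3a.bucket.team-a-data.secret.key", "TEAM_A_SECRET_KEY")
hadoopConf.set("fs.s3a.bucket.team-b-data.access.key", "TEAM_B_ACCESS_KEY")
hadoopConf.set("fs.s3a.bucket.team-b-data.secret.key", "TEAM_B_SECRET_KEY")

// Reads of s3a://team-a-data/... and s3a://team-b-data/... now authenticate
// with their respective key pairs.
{code}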