[ https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15956346#comment-15956346 ]
Franck Tago commented on SPARK-20153:
-------------------------------------

OK, thanks for the tips. It appears that EMR 5.4.0 also supports the use of s3a within a Spark application:
http://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-whatsnew.html
This was a painful restriction prior to this resolution.

> Support multiple AWS credentials in order to access multiple Hive-on-S3 tables in a Spark application
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20153
>                 URL: https://issues.apache.org/jira/browse/SPARK-20153
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.0.1, 2.1.0
>            Reporter: Franck Tago
>            Priority: Minor
>
> I need to access multiple Hive tables in my Spark application, where each Hive table is:
> 1. an external table with its data sitting on S3, and
> 2. owned by a different AWS user, so I need to provide different AWS credentials per table.
>
> I am familiar with setting the AWS credentials in the Hadoop configuration object, but that does not really help me, because I can only set one pair of (fs.s3a.access.key, fs.s3a.secret.key).
> From my research, there is no easy or elegant way to do this in Spark. Why is that?
> How do I address this use case?
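One approach to the quoted use case (not discussed in this thread, so treat it as a sketch): Hadoop 2.8.0+ supports per-bucket S3A configuration, where each bucket can carry its own credentials under fs.s3a.bucket.<bucket>.* keys that override the global fs.s3a.access.key / fs.s3a.secret.key. This requires Hadoop 2.8.0+ on the classpath, which is newer than the Hadoop 2.7 builds that shipped with the affected Spark versions. A minimal Scala sketch follows; the bucket names, table names, and environment variables are hypothetical placeholders.

{code:scala}
import org.apache.spark.sql.SparkSession

// Sketch only, assuming Hadoop 2.8.0+ (per-bucket S3A configuration).
// Bucket names ("team-a-bucket", "team-b-bucket"), table names, and the
// TEAM_* environment variables are hypothetical placeholders.
object MultiCredentialS3aExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-credential-s3a")
      .enableHiveSupport()
      .getOrCreate()

    val hadoopConf = spark.sparkContext.hadoopConfiguration

    // Credentials for the bucket backing the first external Hive table.
    hadoopConf.set("fs.s3a.bucket.team-a-bucket.access.key", sys.env("TEAM_A_ACCESS_KEY"))
    hadoopConf.set("fs.s3a.bucket.team-a-bucket.secret.key", sys.env("TEAM_A_SECRET_KEY"))

    // Credentials for the bucket backing the second external Hive table.
    hadoopConf.set("fs.s3a.bucket.team-b-bucket.access.key", sys.env("TEAM_B_ACCESS_KEY"))
    hadoopConf.set("fs.s3a.bucket.team-b-bucket.secret.key", sys.env("TEAM_B_SECRET_KEY"))

    // Each query resolves credentials from the matching per-bucket keys,
    // so both external tables are readable in the same application.
    spark.sql("SELECT COUNT(*) FROM table_on_team_a_bucket").show()
    spark.sql("SELECT COUNT(*) FROM table_on_team_b_bucket").show()

    spark.stop()
  }
}
{code}

The same properties can also be supplied outside the application via spark-defaults.conf using the spark.hadoop. prefix (e.g. spark.hadoop.fs.s3a.bucket.team-a-bucket.access.key), which keeps credentials out of the code entirely.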