[ https://issues.apache.org/jira/browse/SPARK-3640?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14196534#comment-14196534 ]
Aniket Bhatnagar commented on SPARK-3640:
-----------------------------------------
Thanks, Chris, for looking into this. That documentation would certainly be
useful. However, the problem I am facing with DefaultCredentialsProvider is
that each node in the cluster needs to be set up with those credentials in the
user's home directory, which is a bit tedious. I would like the driver to pass
credentials to all the nodes in the cluster to avoid that operational
overhead. I have submitted a pull request with the changes I had to make to
let the driver pass user-defined credentials. Please have a look and let me
know if there is a better way.
https://github.com/apache/spark/pull/3092
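
For reference, here is a minimal sketch of the kind of credential-resolution
logic such a change could use (the actual PR may differ; the names here are
illustrative). Plain key/secret strings are used because AWSCredentialsProvider
implementations are generally not Serializable, so they cannot be shipped from
the driver to receivers running on worker nodes:

{code:scala}
import com.amazonaws.auth.{AWSCredentials, BasicAWSCredentials, DefaultAWSCredentialsProviderChain}

object CredentialsSketch {
  // Resolve the credentials a Kinesis receiver should use. Explicit
  // key/secret strings supplied by the driver take precedence; otherwise
  // fall back to the default chain (environment variables, system
  // properties, ~/.aws credentials, or an EC2 instance profile).
  def resolveCredentials(
      awsAccessKeyId: Option[String],
      awsSecretKey: Option[String]): AWSCredentials =
    (awsAccessKeyId, awsSecretKey) match {
      case (Some(id), Some(key)) => new BasicAWSCredentials(id, key)
      case _ => new DefaultAWSCredentialsProviderChain().getCredentials
    }
}
{code}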
> KinesisUtils should accept a credentials object instead of forcing
> DefaultCredentialsProvider
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-3640
> URL: https://issues.apache.org/jira/browse/SPARK-3640
> Project: Spark
> Issue Type: Improvement
> Components: Streaming
> Affects Versions: 1.1.0
> Reporter: Aniket Bhatnagar
> Labels: kinesis
>
> KinesisUtils should accept AWS credentials as a parameter and should default
> to DefaultCredentialsProvider if no credentials are provided. Currently, the
> implementation forces the use of DefaultCredentialsProvider, which can be a
> pain, especially when jobs are run by multiple unix users.
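
A call site under a hypothetical overload along these lines (the
awsAccessKeyId/awsSecretKey parameters are illustrative, not necessarily the
PR's actual signature) would let the driver supply credentials directly, with
no per-user ~/.aws setup on the worker nodes:

{code:scala}
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.Seconds

// Hypothetical call: the credential parameters are assumed to be Options
// defaulting to None, in which case the stream would fall back to
// DefaultCredentialsProvider exactly as it does today.
val stream = KinesisUtils.createStream(
  ssc,                                           // the StreamingContext
  "my-stream",                                   // Kinesis stream name
  "https://kinesis.us-east-1.amazonaws.com",     // endpoint URL
  Seconds(10),                                   // checkpoint interval
  InitialPositionInStream.LATEST,
  StorageLevel.MEMORY_AND_DISK_2,
  awsAccessKeyId = Some(sys.env("AWS_ACCESS_KEY_ID")),
  awsSecretKey = Some(sys.env("AWS_SECRET_ACCESS_KEY")))
{code}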