Github user budde commented on a diff in the pull request:
https://github.com/apache/spark/pull/17467#discussion_r112565900
--- Diff: external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisInputDStream.scala ---
@@ -249,6 +252,17 @@ object KinesisInputDStream {
}
/**
+ * Sets the [[SparkAWSCredentials]] to use for authenticating to the AWS CloudWatch
+ * endpoint. Will use the same credentials used for AWS Kinesis if no custom value is set.
+ *
+ * @param conf: Map[String, String] to use for CloudWatch authentication
+ */
+ def kinesisConf(conf: Map[String, String]): Builder = {
--- End diff ---
If you want the extensibility of a key/value map for configs, I would pursue a
solution that uses ```SparkConf``` so you can rely on the existing facilities
Spark provides. It doesn't make sense to me to introduce a key/value map just
for Kinesis, especially since the naming of your keys (e.g.
```spark.streaming.kinesis.retry.waitTime```) would suggest to most users that
these are ```SparkConf``` params, not a Kinesis-specific mapping that must be
manually set up and passed to the Kinesis stream builder.
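To illustrate the distinction being drawn here, the following is a minimal
sketch of config lookup in the style of ```SparkConf``` (a flat
```spark.*``` key space with defaults) versus a Kinesis-specific map the user
must assemble by hand. The key name is taken from the diff under review; the
default value and the helper ```retryWaitTimeMs``` are hypothetical, not
Spark's actual config surface:

```scala
// Sketch only: a SparkConf-style lookup modeled with a plain Map so the
// example is self-contained. In Spark itself this would be
// SparkConf.get(key, default), and the key would be picked up automatically
// from spark-defaults.conf / --conf, with no Kinesis-specific plumbing.
object KinesisConfSketch {
  type Conf = Map[String, String]

  // Key from the PR diff; the 100ms default is illustrative.
  def retryWaitTimeMs(conf: Conf): Long =
    conf.get("spark.streaming.kinesis.retry.waitTime")
      .map(_.toLong)
      .getOrElse(100L)

  def main(args: Array[String]): Unit = {
    // User sets the key once in the shared Spark config space...
    val conf: Conf = Map("spark.streaming.kinesis.retry.waitTime" -> "250")
    println(retryWaitTimeMs(conf))      // honors the configured value
    println(retryWaitTimeMs(Map.empty)) // falls back to the default
  }
}
```

The contrast with the ```kinesisConf(conf: Map[String, String])``` builder
method in the diff is that there the user must construct and pass the map
explicitly, even though the key naming reads like standard Spark configuration.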