[jira] [Commented] (SPARK-31961) Add a class in spark with all Kafka configuration key available as string
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17133214#comment-17133214 ] Gabor Somogyi commented on SPARK-31961:
--
+1 on not adding this. Spark should stay loosely coupled with Kafka, and the constants are already available in Kafka. The string concatenation mentioned below should work.
{quote}If you feel bad to add "kafka." as prefix then please submit a PR to propose the fix.{quote}
This can be defined in app code.

> Add a class in spark with all Kafka configuration key available as string
> -
> Key: SPARK-31961
> URL: https://issues.apache.org/jira/browse/SPARK-31961
> Project: Spark
> Issue Type: New Feature
> Components: SQL, Structured Streaming
> Affects Versions: 2.4.6
> Reporter: Gunjan Kumar
> Priority: Minor
> Labels: kafka, sql, structured-streaming
>
> Add a class in Spark that exposes all Kafka configuration keys as string constants.
> See the highlighted class below for what is wanted.
> e.g.:
> Current code:
> val df_cluster1 = spark
>   .read
>   .format("kafka")
>   .option("kafka.bootstrap.servers", "cluster1_host:cluster1_port")
>   .option("subscribe", "topic1")
> Expected code:
> val df_cluster1 = spark
>   .read
>   .format("kafka")
>   .option(*KafkaConstantClass*.KAFKA_BOOTSTRAP_SERVERS, "cluster1_host:cluster1_port")
>   .option(*KafkaConstantClass*.SUBSCRIBE, "topic1")

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
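The "defined in app code" suggestion can be sketched as a small constants object owned by the application itself, so no Spark API change is needed. The `KafkaOpts` object and its field names below are hypothetical, not part of Spark or Kafka:

```scala
// Hypothetical application-level constants object (not part of Spark's API).
// The values are the option keys Spark's Kafka source already accepts.
object KafkaOpts {
  val BootstrapServers = "kafka.bootstrap.servers" // "kafka."-prefixed keys are forwarded to the client
  val Subscribe        = "subscribe"               // source-level option, no prefix
}
```

An application would then write `.option(KafkaOpts.BootstrapServers, "host:port")`, keeping the raw string literals in one place.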
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17132742#comment-17132742 ] Jungtaek Lim commented on SPARK-31961:
--
Great. Then just construct the config key you want, like "kafka." + ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG. If you feel bad to add "kafka." as a prefix, then please submit a PR to propose the fix.
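The concatenation above can be sketched as follows. To keep the snippet self-contained, `ConsumerConfig` is stubbed here with the value its constant has in kafka-clients; in a real application you would instead import org.apache.kafka.clients.consumer.ConsumerConfig:

```scala
// Stand-in for org.apache.kafka.clients.consumer.ConsumerConfig, whose
// BOOTSTRAP_SERVERS_CONFIG constant is the string "bootstrap.servers".
object ConsumerConfig {
  val BOOTSTRAP_SERVERS_CONFIG = "bootstrap.servers"
}

object PrefixDemo {
  // Spark's Kafka source forwards any option prefixed with "kafka." to the
  // underlying client, so the Spark option key is Kafka's constant plus the prefix.
  val sparkOptionKey = "kafka." + ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG
}
```

This keeps the authoritative constant in Kafka's own jar, with only the one-character-cheap prefix living in app code.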
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17132739#comment-17132739 ] Gunjan Kumar commented on SPARK-31961:
--
No, Kafka users don't write down config keys like that; there are dedicated classes, ProducerConfig and ConsumerConfig. [https://kafka.apache.org/22/javadoc/index.html?org/apache/kafka/clients/consumer/ConsumerConfig.html]
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17132736#comment-17132736 ] Jungtaek Lim commented on SPARK-31961:
--
My point is that, given these are Kafka configs, why are you requiring them from Spark? If Kafka provides such constants then you can just use those. If not, how could you tolerate the fact that Kafka users also have to write down config keys as strings? Either way, Kafka should provide them under that rationale, not Spark.
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17132725#comment-17132725 ] Gunjan Kumar commented on SPARK-31961:
--
Spark would not be tied to a Kafka version, as these property keys never change across Kafka versions. In the current scenario, suppose I want to set the poll timeout but don't know the property name; then I have to go to the Kafka docs, search for that property, and paste it into option(). This is not how a programmer should code :)
[ https://issues.apache.org/jira/browse/SPARK-31961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17132707#comment-17132707 ] Jungtaek Lim commented on SPARK-31961:
--
Don't you think the constant should be available in Kafka instead of Spark? Spark should try to avoid being tied to a specific version of Kafka in its codebase. If Kafka provides it, all you need to do is add "kafka." as a prefix.