Hello,

I've never tried that, but doesn't this work?

// One reader per cluster: each gets its own bootstrap.servers,
// so topic names resolve against the correct cluster.
val df_cluster1 = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster1_host:cluster1_port")
  .option("subscribe", "topic1")
  .load()

val df_cluster2 = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster2_host:cluster2_port")
  .option("subscribe", "topic2")
  .load()


On Tue, 9 Jun 2020 at 18:10, Srinivas V <srini....@gmail.com> wrote:

> Hello,
>  In Structured Streaming, is it possible to have one Spark application
> with one query that consumes topics from multiple Kafka clusters?
>
> I am trying to consume two topics, each from a different Kafka cluster,
> but one of the topics is reported as unknown and the job keeps running
> without completing in the Spark UI.
>
> Is it not allowed in Spark 2.4.5?
>
> Regards
> Srini