I am facing an issue with SparkConf while reading the Cassandra host property
from the default Spark configuration file (spark-defaults.conf).
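For reference, the relevant line in spark-defaults.conf looks like this (the
host value here is just a placeholder, not my actual setting):

    spark.cassandra.connection.host    cassandra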
I use Kafka 126.96.36.199, Spark 2.2.1, and Cassandra 3.11. I have a Docker
container where the Spark master, a worker, and my app run in standalone
cluster mode. The Spark app uses a Structured Streaming pipeline that reads
from Kafka and stores the data in Cassandra, using the foreach sink to write
the streaming data. Inside the ForeachWriter, my application forgets the
Cassandra host specified in the Spark defaults. I print out the Spark config
when the application starts, and I can see that it reads
spark.cassandra.connection.host and prints it out properly.

However, when I start publishing messages to Kafka and the ForeachWriter is
triggered, I see that it tries to connect to localhost by default and does
not read the host from the Spark defaults.
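For context, the query is wired up roughly like this (a simplified sketch;
the topic name, bootstrap servers, checkpoint path, and column names are
placeholders, and CassandraSinkWriter stands for my ForeachWriter, shown
further below):

    import org.apache.spark.sql.SparkSession

    object KafkaToCassandraApp {
      def main(args: Array[String]): Unit = {
        // spark.cassandra.connection.host is expected to come from spark-defaults.conf
        val spark = SparkSession.builder()
          .appName("kafka-to-cassandra")
          .getOrCreate()

        // Read the stream from Kafka and keep key/value as strings
        val messages = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(key AS STRING) AS id", "CAST(value AS STRING) AS value")

        // Write every row through the ForeachWriter sketched further below
        val query = messages.writeStream
          .foreach(new CassandraSinkWriter(spark.sparkContext.getConf))
          .option("checkpointLocation", "/tmp/checkpoints")
          .start()

        query.awaitTermination()
      }
    }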
Any idea what I am dealing with here? I am out of options.
The source code is available here:
ForeachWriter (as a workaround, I set the host manually again):
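Roughly, the workaround looks like this (a simplified sketch of what I do;
the keyspace, table, and column names are placeholders, not my actual
schema):

    import com.datastax.spark.connector.cql.CassandraConnector
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.{ForeachWriter, Row}

    class CassandraSinkWriter(driverConf: SparkConf) extends ForeachWriter[Row] {

      // Capture the host on the driver and set it on the conf again explicitly,
      // so the connector built from it does not fall back to localhost when
      // open()/process() run on the executors. CassandraConnector is
      // serializable, so it ships with the writer.
      private val cassandraHost = driverConf.get("spark.cassandra.connection.host")
      private val connector = CassandraConnector(
        driverConf.clone().set("spark.cassandra.connection.host", cassandraHost))

      override def open(partitionId: Long, version: Long): Boolean = true

      override def process(row: Row): Unit =
        connector.withSessionDo { session =>
          session.execute(
            "INSERT INTO my_keyspace.my_table (id, value) VALUES (?, ?)",
            row.getAs[String]("id"), row.getAs[String]("value"))
        }

      override def close(errorOrNull: Throwable): Unit = ()
    }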