Hi,

I have a Spark Structured Streaming application that reads data from a
Kafka topic (16 partitions). I am running Spark in standalone mode with
two worker nodes: one is on the same machine as the master, and the
other is on a different machine. Each worker node has 8 cores and 16 GB
of RAM and runs one executor.
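
For context, the streaming job is essentially the standard Kafka source,
roughly like the sketch below (built against the spark-sql-kafka
connector; the broker address, topic name, console sink, and checkpoint
path are placeholders, not my real values):

    import org.apache.spark.sql.SparkSession

    object KafkaStreamApp {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-structured-streaming")
          .getOrCreate()

        // Kafka source; broker address and topic name are placeholders
        val df = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker-host:9092")
          .option("subscribe", "input-topic")
          .load()

        // Simple console sink just for illustration; checkpoint path is a placeholder
        val query = df
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
          .writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/kafka-app")
          .start()

        query.awaitTermination()
      }
    }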

When I run the streaming application with only the worker node that is
on the same machine as the master, everything works fine. But when I run
it with both worker nodes, the 8 tasks scheduled on worker node 1 (the
one co-located with the master) complete successfully, while the other 8
tasks, scheduled on the second worker node, get stuck in the RUNNING
state and the whole application hangs.

A normal (non-streaming) Spark application runs fine with this setup.

Can anyone help me with this?



