Anybody out there running Beam on Spark?

I am pulling data from a Kafka topic with KafkaIO, but the job keeps
restarting. There's no error; it just:

   1. creates the driver
   2. creates the executors
   3. runs for a few seconds
   4. terminates the executors
   5. terminates the driver
   6. goto #1

I'm new to Beam and completely new to Spark, so I'm not sure how it's
supposed to work. Is this expected behavior? I expected the Beam job to run
continuously, so either I'm missing a setting or I'm misunderstanding how
things are supposed to work.
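For reference, this is the shape of the submit command I'm using (the jar,
class, and master values here are placeholders, not my real ones):

```shell
# Placeholder jar/class/master values -- the real ones differ.
spark-submit \
  --class com.example.KafkaBeamJob \
  --master yarn \
  --deploy-mode cluster \
  beam-job-bundled.jar \
  --runner=SparkRunner \
  --streaming=true
```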

Thanks for your consideration!

-- 

Joseph Zack
Software Engineer   | Information Security Group   | Symantec Enterprise
Division
Broadcom

mobile: 407.920.4930

[email protected]   | broadcom.com
