I am using Spark Streaming and have the following two questions:
1. If more than one output operation is put in the same StreamingContext
(basically, I put all the output operations in the same class), are
they processed one by one in the order they appear in the class? Or are
they
1. Multiple output operations are processed in the order they are defined.
That is because, by default, only one output operation is processed at a
time. This *can* be parallelized using an undocumented config parameter,
spark.streaming.concurrentJobs, which is set to 1 by default.
2. Yes, the output
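For context, a minimal sketch of what the above looks like in code, assuming the DStream API and a local deployment; the host, port, and the two operation bodies are placeholders, not anything from this thread:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object TwoOutputsExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("TwoOutputsExample")
      .setMaster("local[2]")
      // Undocumented parameter mentioned above: lets up to 2 output
      // operations (jobs) run concurrently instead of sequentially.
      .set("spark.streaming.concurrentJobs", "2")

    val ssc = new StreamingContext(conf, Seconds(1))
    val lines = ssc.socketTextStream("localhost", 9999)

    // Output operation 1: by default runs first within each batch.
    lines.count().print()
    // Output operation 2: by default runs only after operation 1
    // finishes, unless spark.streaming.concurrentJobs > 1.
    lines.flatMap(_.split(" ")).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that with concurrentJobs > 1, the two output operations on the same batch may run at the same time, so any ordering assumptions between them no longer hold.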
Great. Thank you!
Fang, Yan
yanfang...@gmail.com
+1 (206) 849-4108
On Wed, Jul 9, 2014 at 11:45 AM, Tathagata Das tathagata.das1...@gmail.com
wrote: