Hi, I am running a benchmark of Flink, Storm, and Spark for an iterative streaming application. The goal is to window a stream and run an iterative computation on each window.
Both Flink and Storm provide a window function that exposes the window's elements as a list or iterator. In Spark, however, I am not sure how to do this. Is it possible to get all the elements of a window as a list, an iterator, or something similar? After this per-window computation, the goal is to perform a reduce operation to obtain a globally synchronized value. Is this possible with Spark Streaming?