I may be wrong here... It seems to me that the use of such functionality is
contrary to the paradigm that Spark enforces.

Here is why I say that: Spark doesn't execute transformations, such as
'map', until an action is requested, such as 'collect' or 'count' (note
that 'persist' is itself lazy, not an action). Therefore, explicitly
performing computations on partially completed chunks of a 'map' call
seems counter to the Spark MO.
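As a loose analogy (plain Python, not Spark code), the lazy-evaluation point above can be sketched with a generator: building the "transformation" does no work, and only demanding the results (the "action") triggers computation.

```python
# Plain-Python analogy for Spark's lazy transformations (not Spark itself).
log = []

def slow_square(x):
    log.append(x)  # record that the per-element work actually ran
    return x * x

data = [1, 2, 3]

# Like rdd.map(...): defining the pipeline runs nothing yet.
pipeline = (slow_square(x) for x in data)
assert log == []  # no computation has happened so far

# Like an action (e.g. collect): forces the whole computation.
results = list(pipeline)
assert results == [1, 4, 9]
assert log == [1, 2, 3]
```

The analogy also shows why "partial results" are awkward: nothing exists to observe until the action drives the evaluation.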

-- Horia
 On Nov 4, 2013 12:22 PM, "Markus Losoi" <[email protected]> wrote:

> Hi
>
> Is it possible for a driver program to receive intermediary results of a
> Spark operation? If, e.g., a long map() operation is in progress, can the
> driver become aware of some of the (key, value) pairs before all of them
> are
> computed?
>
> There seems to be a SparkListener interface that has an onTaskEnd() event
> [1]. However, the documentation is somewhat sparse on what kind of
> information is included in a SparkListenerTaskEnd object [2].
>
> [1]
> http://spark.incubator.apache.org/docs/0.8.0/api/core/org/apache/spark/scheduler/SparkListener.html
> [2]
> http://spark.incubator.apache.org/docs/0.8.0/api/core/org/apache/spark/scheduler/SparkListenerTaskEnd.html
>
> Best regards,
> Markus Losoi ([email protected])
>
>
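For what it's worth, the interface linked above suggests a listener along these lines. This is only a sketch against the 0.8-era API; the registration call and the exact fields on the event object are assumptions, untested here.

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sketch only: fires as each task finishes. The task-end event carries
// task and metric information, not the computed (key, value) pairs
// themselves, so it does not directly expose intermediary results to
// the driver program.
class TaskEndLogger extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd) {
    println("Task finished: " + taskEnd)
  }
}

// Assumed registration point (the method name may differ across
// Spark versions):
// sc.addSparkListener(new TaskEndLogger())
```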
