The main change here was refactoring the SparkListener interface, which
is where we expose internal state about a Spark job to other
applications. We've cleaned up these APIs a bunch and also added a
way to log all events as JSON for post-hoc analysis:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/SparkListener.scala
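For illustration, here is a minimal sketch of a custom listener (not
from the release itself; it assumes Spark 1.0 on the classpath, and the
event and method names come from the SparkListener trait linked above):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

  // Prints a line whenever a stage finishes.
  class StageLogger extends SparkListener {
    override def onStageCompleted(stageCompleted: SparkListenerStageCompleted) {
      val info = stageCompleted.stageInfo
      println("Stage " + info.stageId + " (" + info.name + ") finished")
    }
  }

  val sc = new SparkContext(
    new SparkConf().setAppName("listener-demo").setMaster("local[*]"))
  sc.addSparkListener(new StageLogger)  // register before running jobs
  sc.parallelize(1 to 100).map(_ * 2).count()
  sc.stop()

The JSON logging mentioned above is separate from writing your own
listener: if I recall correctly, setting spark.eventLog.enabled=true
(and optionally spark.eventLog.dir) persists the same events to disk
for later analysis.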

- Patrick

On Fri, May 30, 2014 at 7:09 AM, Daniel Siegmann
<daniel.siegm...@velos.io> wrote:
> The Spark 1.0.0 release notes state "Internal instrumentation has been added
> to allow applications to monitor and instrument Spark jobs." Can anyone
> point me to the docs for this?
>
> --
> Daniel Siegmann, Software Developer
> Velos
> Accelerating Machine Learning
>
> 440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
> E: daniel.siegm...@velos.io W: www.velos.io
