Take a look at the org.apache.spark.scheduler.SparkListener class. You can 
register your own SparkListener with the SparkContext to listen for 
job-start and job-end events.
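
A minimal sketch of that approach (untested, and the exact fields available on the job events vary across Spark versions): subclass SparkListener, record the time in onJobStart, and compute the elapsed time in onJobEnd, then register the listener on the SparkContext before running the job.

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart, SparkListenerJobEnd}

// Sketch: times each job by recording wall-clock time at job start
// and printing the elapsed time at job end.
class JobTimingListener extends SparkListener {
  private var startTime = 0L

  override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    startTime = System.currentTimeMillis()  // record when the job starts
  }

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    val elapsedMs = System.currentTimeMillis() - startTime
    println(s"Job took $elapsedMs ms")  // or store it for use in your code
  }
}

// Register before triggering the job (e.g. before the collect call):
// sc.addSparkListener(new JobTimingListener)
```

Note this measures wall-clock time as seen by the driver; if you only need a quick timing of a single action, wrapping the call in System.nanoTime before and after works too.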

Matei

On Oct 10, 2013, at 9:04 PM, prabeesh k <[email protected]> wrote:

> Is there any way to get the execution time in the program? 
> Actually, I got this from the log: 
>  INFO spark.SparkContext: Job finished: collect at Kmeans.scala:109, took 
> 0.242050892 s
> But I want to use the execution time in my code. Please help me.
