Thanks for your reply.
On Fri, Oct 11, 2013 at 10:03 AM, Matei Zaharia wrote:
Take a look at the org.apache.spark.scheduler.SparkListener class. You can
register your own SparkListener with SparkContext that listens for job-start
and job-end events.
Matei
On Oct 10, 2013, at 9:04 PM, prabeesh k wrote:
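Matei's suggestion boils down to the classic register-a-listener-and-record-timestamps pattern. The real API is org.apache.spark.scheduler.SparkListener (override onJobStart/onJobEnd and register it via the SparkContext); since Spark isn't on the classpath here, the sketch below uses hypothetical stand-in names (Event, Scheduler, JobTimingListener) purely to show the same bookkeeping:

```scala
// Standalone sketch of the listener pattern Matei describes. In real Spark
// you would extend org.apache.spark.scheduler.SparkListener, override the
// job-start/job-end callbacks, and register the listener with SparkContext.
// All names below are hypothetical stand-ins, not Spark APIs.
sealed trait Event
case class JobStart(jobId: Int, time: Long) extends Event
case class JobEnd(jobId: Int, time: Long) extends Event

trait Listener {
  def onEvent(event: Event): Unit
}

// Records per-job durations by pairing each JobStart with its JobEnd --
// the same bookkeeping a real job-timing SparkListener would do.
class JobTimingListener extends Listener {
  private val starts = scala.collection.mutable.Map[Int, Long]()
  val durationsMs = scala.collection.mutable.Map[Int, Long]()

  def onEvent(event: Event): Unit = event match {
    case JobStart(id, t) => starts(id) = t
    case JobEnd(id, t)   => starts.get(id).foreach(s => durationsMs(id) = t - s)
  }
}

// Minimal "scheduler" that fires events at registered listeners,
// analogous to registering a SparkListener with the context.
class Scheduler {
  private var listeners = List.empty[Listener]
  def addListener(l: Listener): Unit = listeners ::= l
  def post(e: Event): Unit = listeners.foreach(_.onEvent(e))
}
```

With a listener registered, each job's duration is available in your own code as soon as the job-end event fires, rather than only in the driver log.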
Is there any way to get the execution time from within the program? Currently I only get it from the log:
INFO spark.SparkContext: Job finished: collect at Kmeans.scala:109, took
0.242050892 s
but I want to use the execution time in my code. Please help.
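If all that's needed is the wall-clock time of a specific action (such as the collect at Kmeans.scala:109) rather than listener-driven timings for every job, a plain-Scala timer around the call is the simplest option. The helper name `timed` below is just an illustrative choice, not a Spark API:

```scala
// Minimal timing helper: measures the wall-clock duration of any block of
// code using System.nanoTime, returning both the block's result and the
// elapsed time in seconds.
object Timing {
  def timed[T](body: => T): (T, Double) = {
    val start = System.nanoTime()
    val result = body                                  // run the block
    val elapsedSeconds = (System.nanoTime() - start) / 1e9
    (result, elapsedSeconds)
  }
}
```

Usage would look like `val (centers, seconds) = Timing.timed { model.collect() }`, after which `seconds` holds the same duration Spark prints as "Job finished: ..., took ... s", available as a value in your program.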