Can’t you find this in the Spark UI or timeline server?
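One caveat with a `time { }` wrapper: Spark transformations are lazy, so wrapping `rdd.map(...)` on its own mostly times DAG construction on the driver, not the work on the executors. To time the real computation you have to force an action (e.g. `count`) inside the block. A minimal sketch in plain Scala, using a lazy view to stand in for an RDD lineage (no cluster assumed; the `label` parameter is my addition):

```scala
// Sketch: timing a lazy pipeline. The same caveat applies to Spark RDDs,
// where transformations (map, joinWithCassandraTable, ...) only build a
// plan, and an action (count, collect, ...) actually runs it on executors.
def time[R](label: String)(block: => R): R = {
  val t0 = System.nanoTime()
  val result = block
  val t1 = System.nanoTime()
  println(s"$label: ${(t1 - t0) / 1e6} ms")
  result
}

val data = (1 to 1000).view // lazy, like an RDD lineage

// Times almost nothing: this only builds the lazy pipeline.
val mapped = time("define map") { data.map(_ * 2) }

// Times the real work: .sum forces evaluation, like an RDD action.
val total = time("run map + sum") { mapped.sum }
```

For per-stage and per-task timings without instrumenting your own code, the Spark UI's stage view already breaks down scheduler delay, task deserialization, and executor compute time.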

> On 13. May 2018, at 00:31, Guillermo Ortiz Fernández 
> <guillermo.ortiz.f...@gmail.com> wrote:
> 
> I want to measure how long different transformations in Spark take, such as 
> map, joinWithCassandraTable and so on. What is the best way to approximate 
> this? 
> 
> def time[R](block: => R): R = {
>     val t0 = System.nanoTime()
>     val result = block   
>     val t1 = System.nanoTime()
>     println("Elapsed time: " + (t1 - t0) + "ns")
>     result
> }
> 
> Could I use something like this? I guess that System.nanoTime will be 
> executed on the driver before and after the workers execute the maps/joins 
> and so on. Is that right? Any other ideas?
