Oh, this is awesome! Exactly what I needed! Thank you, Otis!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376p17839.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---
Are there any tools like Ganglia that I can use to measure Spark performance, or do I need to do it myself?
Thanks!
Thanks Akhil,
So there is no tool that I can use, right? My program overloads some operators for operations on images, and I need accurate results. I will try the approach you suggested.
Thanks.
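[Akhil's suggested approach is not quoted in this thread; one common manual technique is to time individual Spark actions with a wall-clock timer. A minimal sketch in Python, where `timed` is a hypothetical helper name — note that Spark transformations are lazy, so only wrapping actions (e.g. `count()`, `collect()`) yields meaningful timings:]

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print the wall-clock time spent inside the block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        print(f"{label}: {elapsed:.3f} s")

# Usage (image_rdd is hypothetical): wrap each Spark action to profile.
# with timed("count images"):
#     image_rdd.count()
```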
Hi,
I want to compare the performance of MapReduce and Spark on a program, find the bottlenecks, and measure the cost of each part of the program. I was wondering if there is a measurement tool like Ganglia that could help me in this regard.
Thanks!
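[For cluster-wide monitoring, Spark's configurable metrics system can report directly to Ganglia through its GangliaSink, which is distributed separately (the spark-ganglia-lgpl build profile) for licensing reasons. A minimal conf/metrics.properties sketch, assuming a gmond listening on the default port; the host name is a placeholder:]

```properties
# conf/metrics.properties
# Report metrics from all Spark instances (master, worker, executor, driver)
# to a Ganglia gmond. The host below is a placeholder for your monitoring node.
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=gmond.example.com
*.sink.ganglia.port=8649
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
```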