One approach would be to write pure MapReduce and Spark jobs (e.g. word count,
filter, join, groupBy, etc.) and benchmark them. Another would be to pick
something that runs on top of MapReduce/Spark and benchmark against that
(e.g. benchmark Hive against Spark SQL).
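As a rough illustration of the first approach, the timing harness below is a
minimal sketch in plain Python: the `wordcount` function stands in for a
MapReduce/Spark job, and `benchmark` wraps it the way you would wrap a Spark
action with a wall-clock timer. The function names and data are hypothetical,
not from any Spark API.

```python
import time
from collections import Counter

def wordcount(lines):
    # Stand-in for a MapReduce/Spark word-count job: count
    # whitespace-separated tokens across all input lines.
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def benchmark(job, data, runs=3):
    # Run the job several times and keep the best wall-clock time,
    # which smooths out warm-up effects (JVM/JIT warm-up matters a
    # lot when you do this for real Spark jobs).
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        job(data)
        best = min(best, time.perf_counter() - start)
    return best

data = ["spark spark hadoop", "hadoop mapreduce"] * 1000
elapsed = benchmark(wordcount, data)
print(f"wordcount best of {3}: {elapsed:.4f}s")
```

In a real comparison you would replace the body of `wordcount` with the
equivalent MapReduce job and Spark job, run both on the same input, and
compare the timings.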

Thanks
Best Regards

On Mon, Oct 27, 2014 at 10:52 PM, mahsa <[email protected]> wrote:

> Hi,
> I want to test the performance of MapReduce and Spark on a program, find
> the bottlenecks, calculate the performance of each part of the program,
> etc. I was wondering if there is a measurement tool, like Ganglia, that
> could help me in this regard.
> Thanks!
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
>
