Hi, I'll be showing our Spark monitoring <http://blog.sematext.com/2014/10/07/apache-spark-monitoring/> at the upcoming Spark Summit in NYC. I'd like to run a Spark job that really exercises Spark and makes it emit all its various metrics, so the metrics charts are full of data rather than blank, flat, and boring.
Since we don't use Spark at Sematext yet, I was wondering if anyone could recommend a Spark app/job that's easy to run and will get Spark emitting its various metrics?

Thanks,
Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/