Unfortunately the Spark project isn't involved with the Spark-on-GCE deployment, so I'd suggest
asking in the GCE support forum. You could also launch a Spark cluster by hand
on GCE nodes; Sigmoid Analytics published a package for that here:
http://spark-packages.org/package/9
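For the UI question: a common workaround on GCE is an SSH tunnel to the master node, since the UI ports aren't usually open to the outside by default. A minimal sketch (the instance name and zone below are placeholders, not taken from the actual deployment; the standalone master UI normally listens on 8080 and the driver UI on 4040):

```shell
# Sketch only: INSTANCE, ZONE, and PORT are placeholders -- adjust them
# to match your deployment (master UI is usually 8080, driver UI 4040).
INSTANCE="hadoop-m"
ZONE="us-central1-a"
PORT=4040

# Build and print the tunnel command so it can be inspected before running:
CMD="gcloud compute ssh $INSTANCE --zone $ZONE -- -L $PORT:localhost:$PORT -N"
echo "$CMD"
# While the tunnel is open, browse to http://localhost:4040 on your machine.
```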
Matei
On Jan 17, 2015, at 4:47 PM, Soumya Simanta soumya.sima...@gmail.com wrote:
I'm deploying Spark using the "Click to Deploy Hadoop - Install Apache
Spark" option on Google Compute Engine.
I can run Spark jobs on the REPL and read data from Google storage. However,
I'm not sure how to access the Spark UI in this deployment. Can anyone help?
Also, it deploys Spark 1.1. Is there an easy way to bump it to Spark 1.2?
Thanks
-Soumya