I'm also using SPARK_EXECUTOR_URI right now, though I would prefer
distributing Spark as a binary package.
Running the examples with `./bin/run-example ...` works fine; however,
tasks launched from spark-shell are getting lost with the error:
Error: Could not find or load main class
I guess it's due to missing documentation and the rather complicated setup.
Continuous integration would be nice!
By the way, is it possible to use Spark as a shared library instead of
fetching the Spark tarball for each task?
Do you point SPARK_EXECUTOR_URI to an HDFS URL?
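For reference, this is typically set in conf/spark-env.sh so that every Mesos agent fetches the same Spark build the driver was launched from. The hostname, port, and tarball path below are placeholders, not taken from this thread; adjust them for your cluster:

```shell
# conf/spark-env.sh -- sketch with hypothetical paths; adapt to your setup

# Location of the Mesos native library on the driver machine.
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

# Point executors at a Spark binary package on HDFS; each Mesos agent
# downloads and unpacks this tarball before launching executor tasks.
export SPARK_EXECUTOR_URI=hdfs://namenode:8020/spark/spark-x.y.z-bin-hadoop2.tgz
```

If the URI is wrong or unreachable from the agents, the executor never starts, which can surface as lost tasks or class-loading errors like the one above.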
I'm trying to run Spark on Mesos, and I'm getting this error:
java.lang.ClassNotFoundException: org/apache/spark/serializer/JavaSerializer
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at