Spark: setting maximum available memory
In my situation, each slave has 8 GB of memory. I want to use as much memory as I can: .set("spark.executor.memory", "?g") How can I determine the amount of memory I should set? It fails when I set it to 8 GB.
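Setting the executor memory to the full 8 GB fails because the OS and the Spark worker daemon on each slave need memory of their own, so the executor's heap must be strictly smaller than the machine's total RAM. A common rule of thumb (an assumption here, not an official Spark formula) is to reserve roughly 1 GB per slave and give the executor the rest. A minimal sketch of that calculation, with the helper name and the 1 GB reserve being my own choices:

```java
public class ExecutorMemory {
    // Rule of thumb (assumption): reserve some RAM for the OS and the
    // Spark worker daemon, and give the executor whatever remains.
    static String suggestedExecutorMemory(int totalGb, int reservedGb) {
        return (totalGb - reservedGb) + "g";
    }

    public static void main(String[] args) {
        // With 8 GB per slave and 1 GB reserved:
        String mem = suggestedExecutorMemory(8, 1);
        System.out.println(mem); // prints "7g"
        // The resulting value would then be passed to SparkConf, e.g.:
        //   new SparkConf().set("spark.executor.memory", mem);
    }
}
```

With this setting, each executor gets a 7 GB heap while the slave keeps 1 GB of headroom; if the executors still hit memory pressure, the reserve can be increased rather than the heap.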
java.lang.ClassNotFoundException
Hello. I followed the "A Standalone App in Java" part of the tutorial https://spark.apache.org/docs/0.8.1/quick-start.html The Spark standalone cluster looks like it's running without a problem: http://i.stack.imgur.com/7bFv8.png I have built a fat jar for running this Java app on the cluster. Before maven
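A ClassNotFoundException on the workers often means the application classes (or their dependencies) were not packaged into the jar that is shipped to the cluster. One common way to build such a fat jar with Maven is the maven-shade-plugin; a minimal sketch of the relevant pom.xml fragment (the exact plugin version is an assumption and not taken from the question):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <!-- bind the shade goal to "mvn package" so the fat jar
           (app classes + dependencies) is produced automatically -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, the shaded jar in `target/` contains the application classes together with their dependencies, so the workers can resolve them at runtime.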