Hi, I have written code that works fine in the Spark shell on EC2; the ec2 script helped me configure my master and worker nodes. Now I want to run the Scala Spark code outside the interactive shell. How do I go about doing that?
I was referring to the instructions here: https://spark.apache.org/docs/0.9.1/quick-start.html but the part about a simple project jar file is confusing, since I am not sure how to generate that jar. I only have the file that I run directly in my Spark shell. Are there any easy instructions for getting this quickly running as a job?
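For reference, here is my rough understanding of what that guide expects, pieced together from its standalone-app section. The master URL, the Spark home path, and the jar path below are my guesses for an EC2 setup, so please correct me if I have it wrong:

    /*** SimpleApp.scala -- adapted from the 0.9.1 quick-start guide ***/
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._

    object SimpleApp {
      def main(args: Array[String]) {
        // spark://<ec2-master-hostname>:7077 is a guess at the EC2 master
        // URL; the ec2 script prints the real one when the cluster comes up.
        val sc = new SparkContext(
          "spark://<ec2-master-hostname>:7077",
          "Simple App",
          "/root/spark",  // assumed Spark home on the EC2 AMI
          List("target/scala-2.10/simple-project_2.10-1.0.jar"))
        // Trivial job: count lines containing "a" in a file on the master.
        val logData = sc.textFile("/root/spark/README.md", 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        println("Lines with a: " + numAs)
        sc.stop()
      }
    }

and a simple.sbt at the project root (blank lines between settings, as older sbt versions require):

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

    resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

My understanding is that running "sbt package" from the project root produces target/scala-2.10/simple-project_2.10-1.0.jar, and "sbt run" then launches the app against the master named in the SparkContext. Is that all there is to it, or am I missing a step?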
Thanks,
AJ