Hi,

I'm quite new to Spark and recently started trying it out. I've set up a
single-node Spark "cluster" and followed the tutorials in the Quick Start
guide, but I've come across some issues.

What I was trying to do is use the Java API and run an application on the
single-node "cluster". I followed Quick Start / A Standalone App in Java and
successfully ran it with Maven. But when I tried to submit the jar with
./bin/spark-class org.apache.spark.deploy.Client launch, I saw both a driver
and an app running on the cluster. When running directly through Maven, I
only saw the app.
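
Concretely, the launch command looked roughly like this (the master URL, jar
path, and arguments are placeholders for my actual values):

    ./bin/spark-class org.apache.spark.deploy.Client launch \
      spark://<master-host>:7077 \
      file:///path/to/my.jar \
      MainClass <app arguments>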

So I was thinking I could build a jar with all the dependencies, so that I
could distribute it and run it with just java -cp my.jar MainClass
<arguments>. But that ran into this exception:

    Exception in thread "main" com.typesafe.config.ConfigException$Missing:
    No configuration setting found for key 'akka.version'
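
In case the build setup matters, the fat jar is built along these lines (a
sketch of my pom.xml; I'm using the assembly plugin's jar-with-dependencies
descriptor, and the main class name here is a placeholder):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <configuration>
        <descriptorRefs>
          <!-- bundle all dependencies into one jar -->
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
          <manifest>
            <mainClass>MainClass</mainClass>
          </manifest>
        </archive>
      </configuration>
    </plugin>

and then built with mvn clean package assembly:single.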
I also tried marking the org.apache.spark dependency as provided in the
pom.xml. With that, the jar builds, but running it with java -cp my.jar just
reports that it cannot find the Spark classes. And submitting it via
./bin/spark-class org.apache.spark.deploy.Client launch goes back to showing
a driver and an app at the same time.
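
For reference, the provided-scope change was just this (the artifact suffix
and version shown here are illustrative; mine match my installed Spark):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <!-- Scala suffix and version should match the cluster's Spark build -->
      <artifactId>spark-core_2.10</artifactId>
      <version>0.9.0-incubating</version>
      <scope>provided</scope>
    </dependency>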

So I'm wondering: what's the best way to generate a jar with dependencies
and submit it to the Spark cluster so that it runs as a single app? Could
somebody give me some advice on this? Thank you!

Best Regards,
Min Li
