bin/spark-submit sets some environment variables, such as SPARK_HOME, that
Spark later uses to locate spark-defaults.conf, from which default settings
for Spark are loaded.
I would guess that a configuration option like spark.eventLog.enabled in
spark-defaults.conf is skipped when SparkSubmit.main is invoked directly.
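For reference, spark-defaults.conf is a plain whitespace-separated key/value file; the property values below are illustrative examples, not taken from this thread:

```
spark.master             yarn
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs:///spark-logs
```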
Have you tried https://github.com/spark-jobserver/spark-jobserver
On Tue, Aug 2, 2016 at 2:23 PM, Rychnovsky, Dusan wrote:
> Hi,
>
>
> I am trying to launch my Spark application from within my Java application
> via the SparkSubmit class, like this:
>
>
>
> List<String> args = new ArrayList<>();
>
> args
Hi,
I am trying to launch my Spark application from within my Java application via
the SparkSubmit class, like this:
List<String> args = new ArrayList<>();
args.add("--verbose");
args.add("--deploy-mode=cluster");
args.add("--master=yarn");
...
SparkSubmit.main(args.toArray(new String[args.size()]));
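Since calling SparkSubmit.main directly bypasses the environment setup that bin/spark-submit performs, one workaround is to run the script itself as a child process so it can do its normal setup. This is a sketch, not code from the thread; the Spark home path and jar name are placeholder assumptions:

```java
import java.util.ArrayList;
import java.util.List;

public class SparkSubmitLauncher {

    // Build the spark-submit command line. The Spark home path and jar
    // name passed in are placeholders for illustration.
    static List<String> buildCommand(String sparkHome, String appJar) {
        List<String> cmd = new ArrayList<>();
        cmd.add(sparkHome + "/bin/spark-submit"); // script does its own env setup
        cmd.add("--verbose");
        cmd.add("--deploy-mode=cluster");
        cmd.add("--master=yarn");
        cmd.add(appJar);
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("/opt/spark", "my-app.jar");
        // Running the script as a child process lets it perform its normal
        // environment handling, including reading conf/spark-defaults.conf.
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        System.exit(p.waitFor());
    }
}
```

Alternatively, Spark ships a supported programmatic API, org.apache.spark.launcher.SparkLauncher, intended for launching applications from another JVM process.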