lbustelo 08:51
@HS88 By default, Toree will create a SparkContext using a SparkConf that is
populated through command-line parameters passed via SPARK_CONF. Other
forms of startup-time configuration are covered by
http://spark.apache.org/docs/latest/submitting-applications.html.
If you wish to configure the SparkContext at runtime, you can tell Toree
not to create a context for you by passing in --nosparkcontext. At that
point, creating, configuring and managing the SparkContext is up to you
and is done with code in a notebook cell.
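
A rough sketch of that startup-time path, assuming SPARK_CONF is an
environment variable read by run.sh and that it accepts spark-submit-style
options as described above (the exact variable name and format may differ
between Toree versions). Under those assumptions, the parameters from the
question below could be passed roughly like this:

    # Hypothetical sketch: assumes run.sh reads spark-submit-style options
    # from SPARK_CONF, per the description above.
    export SPARK_CONF="--num-executors 25 --executor-cores 4 \
      --executor-memory 10g --driver-memory 10g \
      --conf spark.yarn.executor.memoryOverhead=2048 \
      --conf spark.driver.maxResultSize=10g \
      --conf spark.serializer=org.apache.spark.serializer.KryoSerializer"

    dist/toree/bin/run.sh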

chipsenkbeil 09:18
@HS88 note that using --nosparkcontext means that you will need to create
the SparkContext using kernel.createSparkContext(sparkConf) or
kernel.createSparkContext(master, appName).
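
A minimal sketch of that runtime path, assuming the kernel was started with
--nosparkcontext and that the kernel object is in scope in the cell. The
createSparkContext signatures are the two mentioned above; the application
name and master URL here are placeholders. Note that driver-side settings
such as --driver-memory generally have to be supplied at launch time, since
the driver JVM is already running by the time the cell executes.

    // Sketch only: assumes --nosparkcontext was passed at startup and
    // that `kernel` is available in the notebook cell.
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("toree-notebook")            // hypothetical application name
      .set("spark.executor.instances", "25")   // equivalent of --num-executors
      .set("spark.executor.cores", "4")
      .set("spark.executor.memory", "10g")
      .set("spark.yarn.executor.memoryOverhead", "2048")
      .set("spark.driver.maxResultSize", "10g")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

    val sc = kernel.createSparkContext(conf)
    // or, with the alternative signature (hypothetical master/app name):
    // val sc = kernel.createSparkContext("yarn-client", "toree-notebook")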

On Mon, Apr 11, 2016 at 6:04 AM, Harmeet Singh <[email protected]>
wrote:

> Hi,
>
> I am running toree using the command "dist/toree/bin/run.sh". This starts
> the spark-kernel in default configuration mode. However, I want to specify
> the parameters for spark-kernel. I want to run spark-kernel using the
> following parameters:
>
> --num-executors 25 --executor-cores 4 --executor-memory 10g
> --driver-memory 10g --conf spark.yarn.executor.memoryOverhead=2048 --conf
> spark.driver.maxResultSize=10g --conf
> spark.serializer=org.apache.spark.serializer.KryoSerializer
>
> If possible, please help me.
>
> Regards,
> Harmeet
>
