No, you can't instantiate a SparkContext to start apps in cluster mode.

For YARN, for example, you'd have to call directly into
org.apache.spark.deploy.yarn.Client; that class asks the YARN cluster to
launch the driver for you, and the SparkContext is then instantiated by
that driver on the cluster, not in your local process.
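
For reference, here is a rough sketch of what a direct call into that
class looked like around the Spark 1.x releases this thread concerns.
Client and ClientArguments are internal API, so treat the constructor
signatures and argument flags below as assumptions that vary between
versions; the jar path and driver class are placeholders:

    // Sketch only: org.apache.spark.deploy.yarn.Client is internal API and
    // its signatures differ between Spark releases.
    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkConf
    import org.apache.spark.deploy.yarn.{Client, ClientArguments}

    object LaunchOnYarnCluster {
      def main(argv: Array[String]): Unit = {
        // Flags parsed by the YARN client; --jar and --class identify the
        // application that will run as the driver inside the cluster.
        val args = Array(
          "--jar", "/path/to/my-app.jar",   // placeholder application jar
          "--class", "com.example.MyApp",   // placeholder driver class
          "--num-executors", "2",
          "--executor-cores", "4",
          "--executor-memory", "1g",
          "--driver-memory", "1g")

        val sparkConf = new SparkConf().setAppName("test")

        // The Client asks the YARN ResourceManager to start an
        // ApplicationMaster, which launches the driver; the SparkContext is
        // then created on the cluster, not in this JVM.
        new Client(new ClientArguments(args, sparkConf),
                   new Configuration(), sparkConf).run()
      }
    }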

On Wed, Oct 1, 2014 at 10:17 AM, Tamas Jambor <jambo...@gmail.com> wrote:
> when you say "respective backend code to launch it", I thought this was
> the way to do it.

>>>>>
>>>>>     from pyspark import SparkConf, SparkContext
>>>>>
>>>>>     conf = (SparkConf()
>>>>>                 .setMaster("yarn-client")
>>>>>                 .setAppName("test")
>>>>>                 .set("spark.driver.memory", "1G")
>>>>>                 .set("spark.executor.memory", "1G")
>>>>>                 .set("spark.executor.instances", "2")
>>>>>                 .set("spark.executor.cores", "4"))
>>>>>     sc = SparkContext(conf=conf)

-- 
Marcelo
