When you say "respective backend code to launch it", I thought this was
the way to do that.

thanks,
Tamas

On Wed, Oct 1, 2014 at 6:13 PM, Marcelo Vanzin wrote:
> Because that's not how you launch apps in cluster mode; you have to do
> it through the command line, or by directly calling the respective
> backend code that launches it.
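>
> For example, a typical cluster-mode launch from the command line looks
> something like this (the class and jar names are placeholders):
>
>     ./bin/spark-submit --master yarn-cluster --driver-memory 1G \
>         --class com.example.YourApp your-app.jar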
>
> (That being said, it would be nice to have a programmatic way of
> launching apps that handled all this - this has been brought up in a
> few different contexts, but I don't think there's an "official"
> solution yet.)
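>
> In the meantime, one unofficial workaround is to shell out to
> spark-submit from Python. A minimal sketch, assuming spark-submit is on
> your PATH and "your_app.py" stands in for your application:
>
>     import subprocess
>
>     # spark-submit sets the driver JVM's options before the driver
>     # starts, so --driver-memory actually takes effect this way.
>     subprocess.check_call([
>         "spark-submit",
>         "--master", "yarn-client",
>         "--driver-memory", "1G",
>         "your_app.py",
>     ])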
>
> On Wed, Oct 1, 2014 at 9:59 AM, Tamas Jambor <[hidden email]> wrote:
>
>> Thanks, Marcelo.
>>
>> What's the reason it's not possible in cluster mode, either?
>>
>> On Wed, Oct 1, 2014 at 5:42 PM, Marcelo Vanzin <[hidden email]> wrote:
>>> You can't set the driver memory programmatically in client mode. In
>>> that mode the driver runs in the same JVM that creates the
>>> SparkContext, so its heap size was fixed when the JVM started and can
>>> no longer be changed by the time the SparkContext is initialized.
>>>
>>> (And you can't really start cluster-mode apps that way, so the only
>>> way to set this is through the command line or config files.)
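>>>
>>> For example, on the command line (your_app.py as a placeholder):
>>>
>>>     ./bin/spark-submit --driver-memory 1G your_app.py
>>>
>>> or persistently in conf/spark-defaults.conf:
>>>
>>>     spark.driver.memory  1G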
>>>
>>> On Wed, Oct 1, 2014 at 9:26 AM, jamborta <[hidden email]> wrote:
>>>> Hi all,
>>>>
>>>> I cannot figure out why this code is not setting the driver memory
>>>> (it does set the executor memory):
>>>>
>>>>     from pyspark import SparkConf, SparkContext
>>>>
>>>>     conf = (SparkConf()
>>>>                 .setMaster("yarn-client")
>>>>                 .setAppName("test")
>>>>                 .set("spark.driver.memory", "1G")
>>>>                 .set("spark.executor.memory", "1G")
>>>>                 .set("spark.executor.instances", "2")
>>>>                 .set("spark.executor.cores", "4"))
>>>>     sc = SparkContext(conf=conf)
>>>>
>>>> whereas if I start the PySpark shell with:
>>>>
>>>>     ./bin/pyspark --driver-memory 1G
>>>>
>>>> it is set correctly. Both seem to generate the same commands in the
>>>> logs.
>>>>
>>>> thanks a lot,
>>>>
>>>
>>> --
>>> Marcelo
>
> --
> Marcelo