This typically works OK in standalone mode with moderate resources:

    ${SPARK_HOME}/bin/spark-submit \
                --driver-memory 6G \
                --executor-memory 2G \
                --num-executors 2 \
                --executor-cores 2 \
                --master spark://50.140.197.217:7077 \
                --conf "spark.scheduler.mode=FAIR" \
                --conf "spark.ui.port=55555" \
                <application-jar> [application-arguments]
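One thing worth checking before submitting: if the driver host cannot actually grant the requested driver memory, the app can sit in SUBMITTED. A minimal pre-flight sketch (my own illustration, assuming a Linux driver node with /proc/meminfo; the 6144 MB figure mirrors --driver-memory 6G above):

```shell
# Sketch: check the driver node has headroom for the requested driver
# memory before running spark-submit. Assumes Linux /proc/meminfo.

REQUESTED_MB=6144  # mirrors --driver-memory 6G in the command above

# MemAvailable is the kernel's estimate of memory usable by new processes
AVAILABLE_MB=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)

if [ "$AVAILABLE_MB" -lt "$REQUESTED_MB" ]; then
  echo "WARNING: only ${AVAILABLE_MB} MB available; lower --driver-memory"
else
  echo "OK: ${AVAILABLE_MB} MB available for a ${REQUESTED_MB} MB driver"
fi
```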


HTH



Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 12 June 2017 at 17:30, Rastogi, Pankaj <pankaj.rast...@verizon.com>
wrote:

> Please make sure that you have enough memory available on the driver node.
> If there is not enough free memory on the driver node, then your
> application won’t start.
>
> Pankaj
>
> From: vaquar khan <vaquar.k...@gmail.com>
> Date: Saturday, June 10, 2017 at 5:02 PM
> To: Abdulfattah Safa <fattah.s...@gmail.com>
> Cc: User <user@spark.apache.org>
> Subject: [E] Re: Spark Job is stuck at SUBMITTED when set Driver Memory >
> Executor Memory
>
> You can add more memory via your submit command; make sure the requested
> memory is actually available on your executor nodes:
>
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar \
>   1000
>
>
> https://spark.apache.org/docs/1.1.0/submitting-applications.html
>
> Also try to avoid memory-heavy actions like collect(), which pull the
> whole result set back to the driver.
>
>
> Regards,
> Vaquar khan
>
>
> On Jun 4, 2017 5:46 AM, "Abdulfattah Safa" <fattah.s...@gmail.com> wrote:
>
> I'm working on Spark in Standalone Cluster mode. I need to increase the
> Driver Memory as I got an OOM in the driver thread. I found that when
> setting the Driver Memory > Executor Memory, the submitted job is stuck
> at SUBMITTED and the application never starts.
>
>
>
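For what it's worth, the memory settings discussed in this thread can also be pinned in conf/spark-defaults.conf so every spark-submit picks them up without long command lines. A sketch reusing values from this thread (keep spark.driver.memory at or below what the driver node can actually grant, per the advice above):

```properties
# conf/spark-defaults.conf -- sketch using values from this thread
spark.master           spark://207.184.161.138:7077
# keep driver memory within what the driver node can grant,
# otherwise the app may stay stuck in SUBMITTED
spark.driver.memory    2g
spark.executor.memory  2g
```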
