Hi Egor,

Does it succeed without dynamic allocation? From your log, it looks like
the job is unable to acquire resources from YARN, which could be because
other jobs are using up all of the cluster's resources.
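
A quick way to test that is to launch the same notebook with dynamic
allocation turned off and a fixed number of executors. A minimal sketch
reusing your launch script; the executor count and memory below are
placeholder values, not recommendations:

pyspark \
        --verbose \
        --master yarn-client \
        --conf spark.dynamicAllocation.enabled=false \
        --num-executors 2 \
        --executor-memory 1g \
        --conf spark.ui.port=$SPARK_UI_PORT

If that also hangs waiting for containers, you can see what is occupying
the queue with "yarn application -list" or in the ResourceManager web UI.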

-Sandy

On Fri, Nov 14, 2014 at 11:32 AM, Egor Pahomov <pahomov.e...@gmail.com>
wrote:

> Hi.
> I'm running an IPython notebook with PySpark and
> spark.dynamicAllocation.enabled = true. The task never finishes.
> Code:
>
> from random import random
> from operator import add
>
> partitions = 10
> n = 100000 * partitions
>
> # Sample a point uniformly from the [-1, 1] x [-1, 1] square; return 1
> # if it lands inside the unit circle, 0 otherwise.
> def f(_):
>     x = random() * 2 - 1
>     y = random() * 2 - 1
>     return 1 if x ** 2 + y ** 2 < 1 else 0
>
> # The fraction of sampled points inside the circle approximates pi / 4.
> count = sc.parallelize(xrange(1, n + 1), partitions).map(f).reduce(add)
> print "Pi is roughly %f" % (4.0 * count / n)
>
>
>
> I start the notebook with:
>
> IPYTHON_ARGS="notebook --profile=ydf --port $IPYTHON_PORT \
>         --port-retries=0 --ip='*' --no-browser"
> pyspark \
>         --verbose \
>         --master yarn-client \
>         --conf spark.driver.port=$((RANDOM_PORT + 2)) \
>         --conf spark.broadcast.port=$((RANDOM_PORT + 3)) \
>         --conf spark.replClassServer.port=$((RANDOM_PORT + 4)) \
>         --conf spark.blockManager.port=$((RANDOM_PORT + 5)) \
>         --conf spark.executor.port=$((RANDOM_PORT + 6)) \
>         --conf spark.fileserver.port=$((RANDOM_PORT + 7)) \
>         --conf spark.shuffle.service.enabled=true \
>         --conf spark.dynamicAllocation.enabled=true \
>         --conf spark.dynamicAllocation.minExecutors=1 \
>         --conf spark.dynamicAllocation.maxExecutors=10 \
>         --conf spark.ui.port=$SPARK_UI_PORT
>
>
> The Spark/IPython log is attached.
>
> --
>
> Sincerely yours,
> Egor Pakhomov
> Scala Developer, Yandex
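
P.S. One more thing the quoted launch command suggests checking: on YARN,
spark.dynamicAllocation.enabled=true requires the external shuffle service
to be running inside every NodeManager, not just
spark.shuffle.service.enabled=true on the client side. If that auxiliary
service isn't registered, executors fail to start and the job sits waiting
for resources. The NodeManager side looks roughly like this in
yarn-site.xml (a sketch of the standard setup; the
spark-<version>-yarn-shuffle jar must also be on the NodeManager
classpath):

<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>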
