Solved this. It appears that 'spark.driver.bindAddress' should point to the
docker container IP, while 'spark.driver.host' should point to the outer
host IP.
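
For reference, a minimal sketch of the resulting interpreter properties (the
container IP 172.18.0.2 and driver port 40099 come from the setup quoted
below; the outer host IP is a placeholder to substitute):

  master                    spark://$MASTER_IP:7077
  spark.driver.bindAddress  172.18.0.2       # local address the driver binds to inside the container
  spark.driver.host         <outer host IP>  # address advertised to the cluster's executors
  spark.driver.port         40099            # fixed port within the forwarded range 40000-40100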

Thanks anyway

2017-06-22 19:16 GMT+03:00 Иван Шаповалов <shapovalov.iva...@gmail.com>:

> Help needed.
>
> 1. I have zeppelin running in a docker container
> 2. I have a remote spark standalone cluster I want to run paragraphs against
>
>
> I have:
> - created a setting with
>  -- master - spark://$MASTER_IP:7077
>  -- 'spark.driver.host' - the IP of the docker container, 172.18.0.2
>  -- 'spark.driver.port' - a free port number, 40099 (I scanned the range
> of forwarded ports 40000-40100 for a free one; the forwarding is sketched
> below)
>  -- 'spark.driver.bindAddress' - the host IP address
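>
> (As a sketch, the forwarded range above might have been published when
> starting the container roughly like this; the image name and flags are an
> assumption, not the exact command:
>
>   docker run -p 8080:8080 -p 40000-40100:40000-40100 apache/zeppelin
>
> each host port in 40000-40100 then maps to the same port inside the
> container.)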
>
> I can see the following in the logs when trying to run a paragraph:
>
> ...WARN [2017-06-22 15:10:21,827] ({pool-2-thread-2}
> Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on
> port 40114. Attempting port 40115.
> ERROR [2017-06-22 15:10:21,835] ({pool-2-thread-2}
> Logging.scala[logError]:91) - Error initializing SparkContext.
> java.net.BindException: Cannot assign requested address: Service
> 'sparkDriver' failed after 16 retries (starting from 40099)! Consider
> explicitly setting the appropriate port for the service 'sparkDriver' (for
> example spark.ui.port for SparkUI) to an available port or increasing
> spark.port.maxRetries.
>
> But when I remove 'spark.driver.bindAddress', the paragraph job is
> submitted successfully, but apparently the cluster cannot see the driver:
>
> Caused by: java.io.IOException: Failed to connect to /172.18.0.2:40099
>
>
> Please help; any ideas are more than appreciated.
> Thanks in advance
>
> --
> Ivan Shapovalov
> Kharkov, Ukraine
>
>


-- 
Ivan Shapovalov
Kharkov, Ukraine
