Arseniy Tashoyan commented on SPARK-21668:

I don't think this is a duplicate of [SPARK-6680]. The referenced issue covers 
a very specific environment: all Docker containers run on the same machine and 
hence in the same bridged network.
This issue covers a more general setup: a container with a driver program 
connecting to a real Spark cluster.
The solution proposed in [SPARK-6680], specifying 
--conf spark.driver.host=${SPARK_LOCAL_IP}, does not work in this case: the 
process inside the container cannot bind to the IP address of the host machine.
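The bind failure described above can be reproduced in a few lines of Python. This is just a sketch: the TEST-NET-3 address below is a placeholder for any IP that is not assigned to an interface of the local machine (such as the host machine's IP, as seen from inside a bridged container).

```python
import socket

# A process can only bind to addresses assigned to one of its own network
# interfaces. Inside a bridged Docker container, the host machine's IP is
# not such an address, so spark.driver.host=<host IP> makes the driver's
# bind call fail. 203.0.113.1 (TEST-NET-3, reserved for documentation)
# stands in for the host IP here.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    sock.bind(("203.0.113.1", 0))  # an address this machine does not own
    print("bound")
except OSError as err:
    # On Linux this is EADDRNOTAVAIL: "Cannot assign requested address"
    print("bind failed:", err.errno)
finally:
    sock.close()
```

This is the same error the driver JVM reports when told to bind to the host's address from inside the container.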

> Ability to run driver programs within a container
> -------------------------------------------------
>                 Key: SPARK-21668
>                 URL: https://issues.apache.org/jira/browse/SPARK-21668
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.1.1, 2.2.0
>            Reporter: Arseniy Tashoyan
>            Priority: Minor
>              Labels: containers, docker, driver, spark-submit, standalone
>   Original Estimate: 96h
>  Remaining Estimate: 96h
> When a driver program in Client mode runs in a Docker container, it binds to 
> the IP address of the container, not the host machine. This container IP 
> address is accessible only from within the host machine; it is inaccessible 
> to master and worker nodes.
> For example, suppose the host machine has an IP address on the cluster 
> network. When Docker starts a container, it places it in a special bridged 
> network and assigns it an IP address from that bridged network. The Spark 
> nodes on the cluster network cannot reach the container's bridged network.
> Therefore, the driver program is not able to communicate with the Spark 
> cluster.
> Spark already provides the SPARK_PUBLIC_DNS environment variable for this 
> purpose. However, in this scenario setting SPARK_PUBLIC_DNS to the host 
> machine's IP address does not work.
> Topic on StackOverflow: 
> [https://stackoverflow.com/questions/45489248/running-spark-driver-program-in-docker-container-no-connection-back-from-execu]
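One workaround, sketched below with placeholder names (image, master URL, and application are hypothetical), is to run the driver container on the host network, so the driver binds directly to a host interface that the cluster can reach:

```shell
# Sketch only: run the driver container with host networking so the driver
# binds to (and advertises) an address of the host machine itself.
# "my-spark-driver-image", "spark-master", and "my_app.py" are placeholders.
docker run --network host my-spark-driver-image \
  spark-submit \
    --master spark://spark-master:7077 \
    --deploy-mode client \
    --conf spark.driver.host=$(hostname -i) \
    my_app.py
```

Spark 2.1+ also provides spark.driver.bindAddress, which decouples the address the driver binds to (the container's own IP) from the address it advertises via spark.driver.host; combined with published container ports for spark.driver.port and spark.blockManager.port, that can avoid host networking entirely.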

This message was sent by Atlassian JIRA
