[jira] [Commented] (SPARK-6680) Be able to specify IP for spark-shell (spark driver) - blocker for Docker integration

2016-09-09 Thread YSMAL Vincent (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15476947#comment-15476947 ]

YSMAL Vincent commented on SPARK-6680:
--

Hi, using Docker you can get rid of this hostname alias by using the 
{code}--hostname spark-master{code} option when starting the container.
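
For illustration, a minimal sketch of that approach; the network and image names here are hypothetical:

{code}
# Create a user-defined network so containers can resolve each other by name
docker network create spark-net

# Start the master with a stable, resolvable hostname (image name is an assumption)
docker run -d --network spark-net --hostname spark-master --name spark-master my-spark-image

# Other containers on the same network can now reach it as "spark-master"
docker run -d --network spark-net --name spark-worker my-spark-image
{code}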


> Be able to specify IP for spark-shell (spark driver) - blocker for Docker 
> integration
> ---
>
> Key: SPARK-6680
> URL: https://issues.apache.org/jira/browse/SPARK-6680
> Project: Spark
>  Issue Type: New Feature
>  Components: Deploy
>Affects Versions: 1.3.0
> Environment: Docker.
>Reporter: Egor Pakhomov
>Priority: Minor
>  Labels: core, deploy, docker
>
> Suppose I have 3 Docker containers - spark_master, spark_worker and 
> spark_shell. In Docker, the container's public IP gets an alias like 
> "fgsdfg454534" that is only visible inside that container. When Spark uses 
> it for communication, other containers receive this alias and don't know 
> what to do with it. That's why I used SPARK_LOCAL_IP for the master and 
> worker. But it doesn't work for the Spark driver (tested with spark-shell; 
> I haven't tried other types of drivers). The Spark driver tells everyone 
> the "fgsdfg454534" alias for itself, and then nobody can address it. I've 
> worked around it in https://github.com/epahomov/docker-spark, but it would 
> be better if it were solved at the Spark code level.
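
A hedged sketch of the SPARK_LOCAL_IP workaround for the master and worker described above; the addresses and the SPARK_HOME path are assumptions:

{code}
# Inside the spark_master container: bind and advertise the container's routable IP
export SPARK_LOCAL_IP=172.17.0.2        # hypothetical container IP
$SPARK_HOME/sbin/start-master.sh

# Inside the spark_worker container: same idea, pointing at the master's IP
export SPARK_LOCAL_IP=172.17.0.3        # hypothetical container IP
$SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker spark://172.17.0.2:7077
{code}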





[jira] [Commented] (SPARK-6680) Be able to specify IP for spark-shell (spark driver) - blocker for Docker integration

2015-08-06 Thread Cyril Lakech (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14660123#comment-14660123 ]

Cyril Lakech commented on SPARK-6680:
-

Hi,

After a while, I think I found a solution to this problem: pass 
{code}--conf spark.driver.host=${SPARK_LOCAL_IP}{code} 
together with {code}export SPARK_LOCAL_IP=`awk 'NR==1 {print $1}' /etc/hosts`{code}.

cf: https://github.com/epahomov/docker-spark/pull/2
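
Putting the two lines together, a minimal sketch of launching the shell this way; the master URL and the SPARK_HOME path are assumptions:

{code}
# Derive the container's own IP from /etc/hosts (first field of the first line)
export SPARK_LOCAL_IP=`awk 'NR==1 {print $1}' /etc/hosts`

# Make executors call the driver back on that IP instead of the container hostname
$SPARK_HOME/bin/spark-shell \
  --master spark://spark-master:7077 \
  --conf spark.driver.host=${SPARK_LOCAL_IP}
{code}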







[jira] [Commented] (SPARK-6680) Be able to specify IP for spark-shell (spark driver) - blocker for Docker integration

2015-08-06 Thread Cyril Lakech (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14660044#comment-14660044 ]

Cyril Lakech commented on SPARK-6680:
-

Hi,

I face this exact same issue in a dockerized Spark environment.

How do I configure the shell/submit to use the IP address instead of the internal name?



