[ https://issues.apache.org/jira/browse/SPARK-5246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vladimir Grigor updated SPARK-5246:
-----------------------------------
    Description: 
How to reproduce: 

1) Following http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_Scenario2.html should be sufficient to set up a VPC for this bug. After you have followed that guide, start a new instance in the VPC and ssh to it (through the NAT server).

2) The user starts a cluster in the VPC:
{code}
./spark-ec2 -k key20141114 -i ~/aws/key.pem -s 1 --region=eu-west-1 \
  --spark-version=1.2.0 --instance-type=m1.large --vpc-id=vpc-2e71dd46 \
  --subnet-id=subnet-2571dd4d --zone=eu-west-1a launch SparkByScript
Setting up security groups...
....
(omitted for brevity)
10.1.1.62
10.1.1.62: no org.apache.spark.deploy.worker.Worker to stop
no org.apache.spark.deploy.master.Master to stop
starting org.apache.spark.deploy.master.Master, logging to /root/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-.out
failed to launch org.apache.spark.deploy.master.Master:
        at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
        ... 12 more
full log in /root/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-.out
10.1.1.62: starting org.apache.spark.deploy.worker.Worker, logging to /root/spark/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-ip-10-1-1-62.out
10.1.1.62: failed to launch org.apache.spark.deploy.worker.Worker:
10.1.1.62:      at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
10.1.1.62:      ... 12 more
10.1.1.62: full log in /root/spark/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-ip-10-1-1-62.out
[timing] spark-standalone setup:  00h 00m 28s
....
(omitted for brevity)
{code}

Contents of /root/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-.out:
{code}
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /usr/lib/jvm/java-1.7.0/bin/java -cp :::/root/ephemeral-hdfs/conf:/root/spark/sbin/../conf:/root/spark/lib/spark-assembly-1.2.0-hadoop1.0.4.jar:/root/spark/lib/datanucleus-api-jdo-3.2.6.jar:/root/spark/lib/datanucleus-rdbms-3.2.9.jar:/root/spark/lib/datanucleus-core-3.2.10.jar -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 10.1.1.151 --port 7077 --webui-port 8080
========================================

15/01/14 07:34:47 INFO master.Master: Registered signal handlers for [TERM, HUP, INT]
Exception in thread "main" java.net.UnknownHostException: ip-10-1-1-151: ip-10-1-1-151: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
        at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:620)
        at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:612)
        at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:612)
        at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:613)
        at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:613)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:665)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:665)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.util.Utils$.localHostName(Utils.scala:665)
        at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:27)
        at org.apache.spark.deploy.master.Master$.main(Master.scala:819)
        at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.net.UnknownHostException: ip-10-1-1-151: Name or service not known
        at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
        ... 12 more
{code}
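The stack trace shows the Master dying in Utils.findLocalIpAddress when it calls java.net.InetAddress.getLocalHost. The failing lookup can be reproduced on the instance itself, independently of Spark, from a Scala REPL (a minimal check; the hostname in the output will be your instance's):

{code}
scala> java.net.InetAddress.getLocalHost
java.net.UnknownHostException: ip-10-1-1-151: ip-10-1-1-151: Name or service not known
        ... (same stack trace as in the master log above)
{code}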

The problem is that an instance launched in a VPC may not be able to resolve its own local hostname. See https://forums.aws.amazon.com/thread.jspa?threadID=92092.
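One possible fix (an illustrative sketch, not the final patch) is to catch the UnknownHostException in the local-IP lookup and fall back to scanning the network interfaces for a non-loopback address; SPARK_LOCAL_IP, an existing Spark environment variable, already provides a manual override and is honored first:

{code}
import java.net.{InetAddress, NetworkInterface, UnknownHostException}
import scala.collection.JavaConverters._

// Sketch only: resolve a usable local IP even when the instance's
// hostname has no DNS entry (the VPC case described above).
def findLocalIpAddress(): String =
  sys.env.get("SPARK_LOCAL_IP").getOrElse {   // explicit override wins
    try {
      // This is the call that currently throws on a VPC instance.
      InetAddress.getLocalHost.getHostAddress
    } catch {
      case _: UnknownHostException =>
        // Fall back to the first non-loopback IPv4 address on any interface.
        NetworkInterface.getNetworkInterfaces.asScala
          .flatMap(_.getInetAddresses.asScala)
          .find(a => !a.isLoopbackAddress && a.getAddress.length == 4)
          .map(_.getHostAddress)
          .getOrElse("127.0.0.1")             // last resort
    }
  }
{code}

As a stopgap, setting SPARK_LOCAL_IP to the instance's private IP, or mapping the hostname to that IP in /etc/hosts on each node, avoids the failing lookup.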

I am going to submit a fix for this problem since I need this functionality ASAP.


> spark/spark-ec2.py cannot start Spark master in VPC if local DNS name does not resolve
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-5246
>                 URL: https://issues.apache.org/jira/browse/SPARK-5246
>             Project: Spark
>          Issue Type: Bug
>          Components: EC2
>            Reporter: Vladimir Grigor
>