Nicholas Chammas wrote
> The funny thing is that Spark seems to accept this only if the value of
> SPARK_MASTER_IP is a DNS name and not an IP address.
> 
> When I provide an IP address, I get errors in the log when starting the
> master:
> 
> 15/10/15 01:47:31 ERROR NettyTransport: failed to bind to
> /54.210.XX.XX:7077, shutting down Netty transport

A couple of things. (1) That log message appears to originate at line 434 of
NettyTransport.scala.
(https://github.com/akka/akka/blob/master/akka-remote/src/main/scala/akka/remote/transport/netty/NettyTransport.scala)
It appears the exception is rethrown; is it caught somewhere else so we can
see what the actual error was that triggered the log message? I don't see
anything obvious in the code.

(2) sbin/start-master.sh executes something.Master with --ip
SPARK_MASTER_IP, which calls something.MasterArguments to handle its
arguments, which says:

      case ("--ip" | "-i") :: value :: tail =>
        Utils.checkHost(value, "ip no longer supported, please use hostname " + value)
        host = value
        parse(tail)

      case ("--host" | "-h") :: value :: tail =>
        Utils.checkHost(value, "Please use hostname " + value)
        host = value
        parse(tail)

So it would appear that the intent is that numerical IP addresses are
disallowed. However, Utils.checkHost says:

    def checkHost(host: String, message: String = "") {
      assert(host.indexOf(':') == -1, message)
    }

which accepts numerical IP addresses just fine, since a dotted-decimal
address contains no ':'. Is there some other test that should be applied
in MasterArguments, or should checkHost be looking for some other
pattern? Is it possible that MasterArguments was changed to disallow
--ip without that change being propagated back into the scripts that
call it?
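To make the point concrete, here is a standalone sketch. The checkHost
body is copied from above; looksLikeIpv4 is my own illustration of the
kind of stricter check MasterArguments might need, not anything in the
Spark codebase.

```scala
// Copy of Utils.checkHost as quoted above: it only asserts that the
// string contains no ':' (i.e. no host:port form).
def checkHost(host: String, message: String = ""): Unit = {
  assert(host.indexOf(':') == -1, message)
}

// A dotted-decimal IPv4 address contains no ':', so the assert passes
// silently even though the message says IPs are no longer supported:
checkHost("54.210.1.2", "ip no longer supported, please use hostname 54.210.1.2")

// Hypothetical stricter test (my assumption, not Spark's actual code):
// flag strings that are purely dotted-decimal.
def looksLikeIpv4(s: String): Boolean =
  s.matches("""\d{1,3}(\.\d{1,3}){3}""")

assert(looksLikeIpv4("54.210.1.2"))
assert(!looksLikeIpv4("master.example.com"))
```

If something like looksLikeIpv4 were applied in the "--ip" branch, the
numeric-address case would fail fast with a clear message instead of
proceeding to the Netty bind error quoted at the top.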

Hope this helps in some way.

Robert Dodier



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/SPARK-MASTER-IP-actually-expects-a-DNS-name-not-IP-address-tp14613p14665.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
