JB,

I am using spark-env.sh to define the master address instead of using
spark-defaults.conf.

I understand that this should work, and indeed it does, but only when
SPARK_MASTER_IP is set to a DNS name rather than an IP address.
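
For concreteness, here's roughly what I mean in conf/spark-env.sh (a
minimal sketch; the addresses below are placeholders, not my real
hosts):

    # Binds fine with a resolvable DNS name:
    export SPARK_MASTER_IP=ec2-54-210-xx-xx.compute-1.amazonaws.com

    # Fails to bind with the corresponding raw IP:
    # export SPARK_MASTER_IP=54.210.xx.xx

One theory I haven't verified: on EC2 the public IP is NAT'd and never
bound to a local interface, while the public DNS name resolves to the
private (bindable) IP from inside the network, which would explain why
only the name works.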

Perhaps I'm misunderstanding these configuration methods...
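
In case it helps with the tracing question in my earlier mail quoted
below: a plain grep over a local checkout may catch places that the
GitHub search misses (the paths here are just my first guesses):

    grep -rn "SPARK_MASTER_IP" conf/ sbin/ bin/ core/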

Nick


On Fri, Oct 16, 2015 at 12:05 PM Jean-Baptiste Onofré <j...@nanthrax.net>
wrote:

> Hi Nick,
>
> There's the Spark master defined in conf/spark-defaults.conf, and the -h
> option that you can provide to the sbin/start-master.sh script.
>
> Did you try:
>
> sbin/start-master.sh -h xxx.xxx.xxx.xxx
>
> and then use the IP when you start the slaves:
>
> sbin/start-slave.sh spark://xxx.xxx.xxx.xxx:7077
>
> ?
>
> Regards
> JB
>
> On 10/16/2015 06:01 PM, Nicholas Chammas wrote:
> > I'd look into tracing a possible bug here, but I'm not sure where to
> > look. Searching the codebase for `SPARK_MASTER_IP`, amazingly, does not
> > show it being used in any place directly by Spark:
> > https://github.com/apache/spark/search?utf8=%E2%9C%93&q=SPARK_MASTER_IP
> >
> > Clearly, Spark is using this environment variable (otherwise I wouldn't
> > see the behavior described in my first email), but I can't see where.
> >
> > Can someone give me a pointer?
> >
> > Nick
> >
> > On Thu, Oct 15, 2015 at 12:37 AM Ted Yu <yuzhih...@gmail.com> wrote:
> >
> >     Some old bits:
> >
> >     http://stackoverflow.com/questions/28162991/cant-run-spark-1-2-in-standalone-mode-on-mac
> >
> >     http://stackoverflow.com/questions/29412157/passing-hostname-to-netty
> >
> >     FYI
> >
> >     On Wed, Oct 14, 2015 at 7:10 PM, Nicholas Chammas
> >     <nicholas.cham...@gmail.com> wrote:
> >
> >         I’m setting the Spark master address via the SPARK_MASTER_IP
> >         environment variable in spark-env.sh, like spark-ec2 does:
> >         https://github.com/amplab/spark-ec2/blob/a990752575cd8b0ab25731d7820a55c714798ec3/templates/root/spark/conf/spark-env.sh#L13
> >
> >         The funny thing is that Spark seems to accept this only if the
> >         value of SPARK_MASTER_IP is a DNS name and not an IP address.
> >
> >         When I provide an IP address, I get errors in the log when
> >         starting the master:
> >
> >         15/10/15 01:47:31 ERROR NettyTransport: failed to bind to
> >         /54.210.XX.XX:7077, shutting down Netty transport
> >
> >         (XX is my redaction of the full IP address.)
> >
> >         Am I misunderstanding something about how to use this
> >         environment variable?
> >
> >         The spark-env.sh template indicates that either an IP address or
> >         a hostname should work:
> >         https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49
> >         but my testing shows that only hostnames work.
> >
> >         Nick
> >
> >
> >
>
> --
> Jean-Baptiste Onofré
> jbono...@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
>
