Ah, my bad, I missed it:
<https://github.com/apache/spark/blob/08698ee1d6f29b2c999416f18a074d5193cdacd5/sbin/start-master.sh#L58-L60>

The GitHub search results preview
<https://github.com/apache/spark/search?utf8=%E2%9C%93&q=SPARK_MASTER_IP>
only showed the first hit from start-master.sh and not this part:

"$sbin"/spark-daemon.sh start org.apache.spark.deploy.master.Master 1 \
  --ip $SPARK_MASTER_IP --port $SPARK_MASTER_PORT --webui-port
$SPARK_MASTER_WEBUI_PORT \
  $ORIGINAL_ARGS

Same goes for some of the other sbin scripts.
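
For the record, a plain grep over a local checkout turns up all of the
uses at once (a quick sketch, run from the repo root):

grep -rn SPARK_MASTER_IP sbin/ conf/

That covers the sbin scripts as well as the conf/spark-env.sh template.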

Anyway, let’s take a closer look…

Nick

On Fri, Oct 16, 2015 at 12:05 PM Sean Owen <so...@cloudera.com> wrote:

> It's used in scripts like sbin/start-master.sh
>
> On Fri, Oct 16, 2015 at 5:01 PM, Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
>> I'd look into tracing a possible bug here, but I'm not sure where to
>> look. Searching the codebase for `SPARK_MASTER_IP`, amazingly, does not
>> show it being used anywhere by Spark directly
>> <https://github.com/apache/spark/search?utf8=%E2%9C%93&q=SPARK_MASTER_IP>.
>>
>> Clearly, Spark is using this environment variable (otherwise I wouldn't
>> see the behavior described in my first email), but I can't see where.
>>
>> Can someone give me a pointer?
>>
>> Nick
>>
>> On Thu, Oct 15, 2015 at 12:37 AM Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Some old bits:
>>>
>>>
>>> http://stackoverflow.com/questions/28162991/cant-run-spark-1-2-in-standalone-mode-on-mac
>>> http://stackoverflow.com/questions/29412157/passing-hostname-to-netty
>>>
>>> FYI
>>>
>>> On Wed, Oct 14, 2015 at 7:10 PM, Nicholas Chammas <
>>> nicholas.cham...@gmail.com> wrote:
>>>
>>>> I’m setting the Spark master address via the SPARK_MASTER_IP
>>>> environment variable in spark-env.sh, like spark-ec2 does
>>>> <https://github.com/amplab/spark-ec2/blob/a990752575cd8b0ab25731d7820a55c714798ec3/templates/root/spark/conf/spark-env.sh#L13>.
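>>>>
>>>> For reference, the line in question is just an export in
>>>> conf/spark-env.sh, along these lines (a sketch with a placeholder
>>>> value):
>>>>
>>>> export SPARK_MASTER_IP=<hostname-or-ip-of-master>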
>>>>
>>>> The funny thing is that Spark seems to accept this only if the value of
>>>> SPARK_MASTER_IP is a DNS name and not an IP address.
>>>>
>>>> When I provide an IP address, I get errors in the log when starting the
>>>> master:
>>>>
>>>> 15/10/15 01:47:31 ERROR NettyTransport: failed to bind to /54.210.XX.XX:7077, shutting down Netty transport
>>>>
>>>> (XX is my redaction of the full IP address.)
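>>>>
>>>> To reproduce (a sketch, using the same redacted address):
>>>>
>>>> # conf/spark-env.sh
>>>> export SPARK_MASTER_IP=54.210.XX.XX
>>>>
>>>> # starting the master then produces the bind error above in its log
>>>> sbin/start-master.sh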
>>>>
>>>> Am I misunderstanding something about how to use this environment
>>>> variable?
>>>>
>>>> The spark-env.sh template indicates that either an IP address or a
>>>> hostname should work
>>>> <https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49>,
>>>> but my testing shows that only hostnames work.
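>>>>
>>>> As a workaround sketch (assuming the machine's FQDN resolves from the
>>>> rest of the cluster), pointing the variable at a name rather than an
>>>> address does work for me:
>>>>
>>>> export SPARK_MASTER_IP=$(hostname -f)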
>>>>
>>>> Nick
>>>>
>>>
>>>
>
