if [ "$SPARK_MASTER_IP" = "" ]; then
  SPARK_MASTER_IP=`hostname`
  --ip $SPARK_MASTER_IP --port $SPARK_MASTER_PORT --webui-port
$SPARK_MASTER_WEBUI_PORT \
  "$sbin"/../tachyon/bin/tachyon bootstrap-conf $SPARK_MASTER_IP
./sbin/start-master.sh

if [ "$SPARK_MASTER_IP" = "" ]; then
  SPARK_MASTER_IP="`hostname`"
  "$sbin/slaves.sh" cd "$SPARK_HOME" \; "$sbin"/../tachyon/bin/tachyon
bootstrap-conf "$SPARK_MASTER_IP"
"$sbin/slaves.sh" cd "$SPARK_HOME" \; "$sbin/start-slave.sh"
"spark://$SPARK_MASTER_IP:$SPARK_MASTER_PORT"
./sbin/start-slaves.sh
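
These excerpts are where the launch scripts read SPARK_MASTER_IP: start-master.sh forwards it to the master as --ip, and start-slaves.sh uses it to build the spark://host:port URL the workers connect to, which is consistent with it not turning up in a search of the Scala sources. A rough sketch of the usual flow (the grep paths assume a standard Spark checkout, and the hostname below is a made-up placeholder, not a value from this thread):

  # Where the variable is actually consumed (shell scripts, not Scala code):
  grep -rn SPARK_MASTER_IP sbin/ conf/

  # Typical standalone setup: export the address in conf/spark-env.sh
  # (copied from conf/spark-env.sh.template), then launch.
  echo 'export SPARK_MASTER_IP=ec2-54-210-xx-xx.compute-1.amazonaws.com' >> conf/spark-env.sh

  # start-master.sh passes the value as --ip; start-slaves.sh points each
  # worker at spark://$SPARK_MASTER_IP:$SPARK_MASTER_PORT.
  ./sbin/start-master.sh
  ./sbin/start-slaves.sh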

On Fri, Oct 16, 2015 at 9:01 AM, Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> I'd look into tracing a possible bug here, but I'm not sure where to look.
> Searching the codebase for `SPARK_MASTER_IP`, amazingly, does not show it
> being used in any place directly by Spark
> <https://github.com/apache/spark/search?utf8=%E2%9C%93&q=SPARK_MASTER_IP>.
>
> Clearly, Spark is using this environment variable (otherwise I wouldn't
> see the behavior described in my first email), but I can't see where.
>
> Can someone give me a pointer?
>
> Nick
>
> On Thu, Oct 15, 2015 at 12:37 AM Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Some old bits:
>>
>>
>> http://stackoverflow.com/questions/28162991/cant-run-spark-1-2-in-standalone-mode-on-mac
>> http://stackoverflow.com/questions/29412157/passing-hostname-to-netty
>>
>> FYI
>>
>> On Wed, Oct 14, 2015 at 7:10 PM, Nicholas Chammas <
>> nicholas.cham...@gmail.com> wrote:
>>
>>> I’m setting the Spark master address via the SPARK_MASTER_IP
>>> environment variable in spark-env.sh, like spark-ec2 does
>>> <https://github.com/amplab/spark-ec2/blob/a990752575cd8b0ab25731d7820a55c714798ec3/templates/root/spark/conf/spark-env.sh#L13>
>>> .
>>>
>>> The funny thing is that Spark seems to accept this only if the value of
>>> SPARK_MASTER_IP is a DNS name and not an IP address.
>>>
>>> When I provide an IP address, I get errors in the log when starting the
>>> master:
>>>
>>> 15/10/15 01:47:31 ERROR NettyTransport: failed to bind to 
>>> /54.210.XX.XX:7077, shutting down Netty transport
>>>
>>> (XX is my redaction of the full IP address.)
>>>
>>> Am I misunderstanding something about how to use this environment
>>> variable?
>>>
>>> The spark-env.sh template indicates that either an IP address or a
>>> hostname should work
>>> <https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49>,
>>> but my testing shows that only hostnames work.
>>>
>>> Nick
>>>
>>
>>
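
On the NettyTransport bind error quoted above: a minimal diagnostic sketch, assuming a Linux host with iproute2, is to check whether the address in SPARK_MASTER_IP is actually assigned to a local interface. An EC2 public IP usually is not (it is NATed to the instance), which would line up with the DNS name working while the raw IP does not.

  # Run with SPARK_MASTER_IP exported (e.g. via conf/spark-env.sh).
  # Print any local interface carrying the address; if nothing matches,
  # binding a listener to $SPARK_MASTER_IP:7077 will fail as in the error above.
  ip -4 addr show | grep -F "$SPARK_MASTER_IP" \
    || echo "$SPARK_MASTER_IP is not assigned to a local interface"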
