Passing --host localhost solved the issue, thanks!
Warm regards
Arko
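
For the archives, a sketch of the two workarounds discussed in this
thread (Windows cmd syntax to match the start command quoted below;
hostnames are illustrative):

```shell
:: Option 1: pass an explicit host when starting the master.
:: The master then binds to whatever "localhost" resolves to (127.0.0.1),
:: and the driver can connect with the stable URL spark://localhost:7077.
.\spark-class.cmd org.apache.spark.deploy.master.Master --host localhost

:: Option 2: pin the bind address via environment variable instead,
:: then start the master as before:
:: set SPARK_LOCAL_IP=127.0.0.1
:: .\spark-class.cmd org.apache.spark.deploy.master.Master
```

On Linux the equivalent would be `export SPARK_LOCAL_IP=127.0.0.1` and
`./spark-class` without the `.cmd` suffix.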

On Mon, Feb 22, 2016 at 5:44 PM, Jakob Odersky <[email protected]> wrote:
> Spark master by default binds to whatever ip address your current host
> resolves to. You have a few options to change that:
> - override the ip by setting the environment variable SPARK_LOCAL_IP
> - change the ip in your local "hosts" file (/etc/hosts on linux, not
> sure on windows)
> - specify a different hostname such as "localhost" when starting spark
> master by passing the "--host HOSTNAME" command-line parameter (the ip
> address will be resolved from the supplied HOSTNAME)
>
> best,
> --Jakob
>
> On Mon, Feb 22, 2016 at 5:09 PM, Arko Provo Mukherjee
> <[email protected]> wrote:
>> Hello,
>>
>> I am running Spark on Windows.
>>
>> I start up master as follows:
>> .\spark-class.cmd org.apache.spark.deploy.master.Master
>>
>> I see that the Spark master doesn't start on 127.0.0.1 but on my
>> "actual" IP. This is troublesome for me, as I use the address in my
>> code and need to change it every time I restart.
>>
>> Is there a way to make SparkMaster listen to 127.0.0.1:7077?
>>
>> Thanks much in advance!
>> Warm regards
>> Arko
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>>
