Yes, I believe setting an ENV variable can achieve the same purpose. The
benefit of setting this in code is that it is more stable and unaffected by
the environment, since, as mentioned above, "GitHub updates virtual
environments from time to time".


Regards!

Aron Tao


Alessandro Solimando <[email protected]> wrote on Fri, Apr 16, 2021 at 8:55 PM:

> Hi all,
> locally on MacOS I have to set the SPARK_LOCAL_IP environment variable to
> overcome that very same problem (something along the lines of export
> SPARK_LOCAL_IP="127.0.0.1").
>
> I wonder if it wouldn't be less intrusive to set an environment variable
> rather than statically setting a SparkConf option in the code as Aron is
> suggesting.
>
> Best regards,
> Alessandro
>
> On Fri, 16 Apr 2021 at 12:26, JiaTao Tao <[email protected]> wrote:
>
> > Hi,
> > The problem may be a wrong hostname in "/etc/hosts". One way to solve
> > this problem is to set "spark.driver.bindAddress" explicitly; I've tested
> > this and it works.
> > I've created a JIRA (CALCITE-4587) and attached the PR (
> > https://github.com/apache/calcite/pull/2404/files).
> >
> > Regards!
> >
> > Aron Tao
> >
> >
> > Vladimir Sitnikov <[email protected]> wrote on Thu, Apr 15, 2021 at 2:48 AM:
> >
> > > > java.net.BindException: Cannot assign requested address: Service
> > > > 'sparkDriver' failed after 16 retries (on a random free port)!
> > > > Consider explicitly setting the appropriate binding address for the
> > > > service 'sparkDriver' (for example spark.driver.bindAddress for
> > > > SparkDriver) to the correct binding address.
> > >
> > > GitHub updates virtual environments from time to time, so the failure
> > > might be related to a new environment having a different set of network
> > > interfaces.
> > >
> > > The failure looks like a true bug rather than a CI glitch.
> > >
> > > Vladimir
> > >
> >
>
