For each of your jobs, you can pass spark.ui.port to bind the UI to a different port.
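For example, a minimal sketch (assuming PySpark and that you build the SparkConf yourself before creating the SparkContext; the app name and port value are just placeholders):

from pyspark import SparkConf, SparkContext

# Give each instance of the generic job its own UI port.
conf = (SparkConf()
        .setAppName("generic-job")          # placeholder name
        .set("spark.ui.port", "4041"))      # any free port, one per instance
sc = SparkContext(conf=conf)

The same property can also be passed on the command line, e.g. spark-submit --conf spark.ui.port=4041 your_script.py.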
Thanks
Best Regards

On Fri, Jul 24, 2015 at 7:49 PM, Joji John <jj...@ebates.com> wrote:

> Thanks Ajay.
>
> The way we wrote our spark application is that we have a generic python
> code, multiple instances of which can be called using different parameters.
> Does Spark offer any function to bind it to an available port?
>
> I guess the other option is to define a function to find an open port and
> use that.
>
> Thanks
> Joji John
>
> ------------------------------
> *From:* Ajay Singal <asinga...@gmail.com>
> *Sent:* Friday, July 24, 2015 6:59 AM
> *To:* Joji John
> *Cc:* user@spark.apache.org
> *Subject:* Re: ERROR SparkUI: Failed to bind SparkUI
> java.net.BindException: Address already in use: Service 'SparkUI' failed
> after 16 retries!
>
> Hi Joji,
>
> I guess there is no hard limit on the number of Spark applications running
> in parallel. However, you need to ensure that you do not use the same
> (e.g., default) port numbers for each application.
>
> In your specific case, for example, if you try using the default SparkUI
> port "4040" for more than one Spark application, the first application you
> start will bind to port "4040". So, this port becomes unavailable (at the
> moment). Therefore, all subsequent applications you start will get a
> SparkUI BindException.
>
> To solve this issue, simply use non-competing port numbers, e.g., 4040,
> 4041, 4042...
>
> Thanks,
> Ajay
>
> On Fri, Jul 24, 2015 at 6:21 AM, Joji John <jj...@ebates.com> wrote:
>
>> Hi,
>>
>> I am getting this error for some of my spark applications. I have
>> multiple spark applications running in parallel. Is there a limit on the
>> number of spark applications that I can run in parallel?
>>
>> ERROR SparkUI: Failed to bind SparkUI
>> java.net.BindException: Address already in use: Service 'SparkUI' failed
>> after 16 retries!
>>
>> Thanks
>> Joji John
>>
>
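For the "find an open port" idea mentioned above, a rough sketch in Python (binding to port 0 lets the OS pick a free port; note there is a small window between closing the probe socket and Spark binding, so this is best-effort):

import socket
from pyspark import SparkConf, SparkContext

def find_free_port():
    # Bind to port 0 so the OS assigns an unused ephemeral port,
    # then release it and hand the number to Spark.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("", 0))
    port = s.getsockname()[1]
    s.close()
    return port

conf = SparkConf().set("spark.ui.port", str(find_free_port()))
sc = SparkContext(conf=conf)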