Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/5722#issuecomment-137447179
So it seems that the documentation is not very clear on this either:

    spark.port.maxRetries    16    Default maximum number of retries when binding to a port before giving up.

When I read that, it doesn't say that it increments the port from the starting point on each retry; I read it as retrying the same port 16 times.
That can be fixed in a separate JIRA, though.
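For illustration, here is a minimal Python sketch of the behavior being discussed (this is not Spark's actual implementation, just the increment-on-retry semantics the docs should describe): each failed bind moves to the next port rather than retrying the same one.

```python
import socket

def bind_with_retries(start_port, max_retries=16):
    """Illustrative sketch only, not Spark's code: try start_port,
    then start_port+1, ..., making up to max_retries additional
    attempts, each on the next port rather than the same one."""
    for offset in range(max_retries + 1):
        port = start_port + offset
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            sock.bind(("127.0.0.1", port))
            return sock, port  # bound successfully on this port
        except OSError:
            sock.close()  # port taken; fall through to port+1
    raise OSError(
        f"Could not bind to any port in [{start_port}, {start_port + max_retries}]"
    )
```

Under this reading, "16 retries" means up to 16 *different* ports beyond the starting one are tried, which is the behavior the current doc wording fails to convey.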