Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/5537#discussion_r28622548
--- Diff: core/src/main/scala/org/apache/spark/network/netty/SparkTransportConf.scala ---
@@ -43,8 +43,12 @@ object SparkTransportConf {
    * @param numUsableCores if nonzero, this will restrict the server and client threads to only
    *                       use the given number of cores, rather than all of the machine's cores.
    *                       This restriction will only occur if these properties are not already set.
+   * @param disablePortRetry if true, the server will not retry its port. A long-running server
+   *                         should disable retries, since the server and client have already
+   *                         agreed on the specific port.
    */
-  def fromSparkConf(_conf: SparkConf, numUsableCores: Int = 0): TransportConf = {
+  def fromSparkConf(_conf: SparkConf, numUsableCores: Int = 0,
--- End diff --
Also, this change broke the MiMA checks. It doesn't feel like these classes should be public (so maybe an exclusion is fine here), but you can also work around it by declaring an overloaded method instead.
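
If we go the exclusion route, it would be a one-line entry in project/MimaExcludes.scala, roughly like the sketch below. The problem type is an assumption on my part (MissingMethodProblem is what MiMA usually reports when a method's descriptor changes); check the actual MiMA output before copying it.

```scala
// Hypothetical MimaExcludes.scala entry; MissingMethodProblem is assumed,
// use whatever problem type the MiMA report actually names.
import com.typesafe.tools.mima.core._

ProblemFilters.exclude[MissingMethodProblem](
  "org.apache.spark.network.netty.SparkTransportConf.fromSparkConf")
```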
@aarondav any comments about whether these classes are really meant to be
public?
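
For the overload approach, here is a minimal, self-contained sketch of what I mean (SparkConf and TransportConf are replaced by stand-in case classes, so this is not the actual Spark code): the existing two-argument signature stays exactly as it is and simply forwards to a new three-argument overload, so previously compiled callers still link and MiMA sees no binary break.

```scala
// Sketch of the binary-compatible overload workaround; types are stand-ins.
object SparkTransportConfSketch {
  case class SparkConf(settings: Map[String, String] = Map.empty)
  case class TransportConf(numUsableCores: Int, disablePortRetry: Boolean)

  // Pre-existing signature, left untouched: old call sites keep linking
  // against this exact method, so MiMA reports no incompatibility.
  def fromSparkConf(_conf: SparkConf, numUsableCores: Int = 0): TransportConf =
    fromSparkConf(_conf, numUsableCores, disablePortRetry = false)

  // New overload carrying the extra flag; only new call sites need it.
  def fromSparkConf(
      _conf: SparkConf,
      numUsableCores: Int,
      disablePortRetry: Boolean): TransportConf =
    TransportConf(numUsableCores, disablePortRetry)

  def main(args: Array[String]): Unit = {
    println(fromSparkConf(SparkConf()))                             // old call sites unchanged
    println(fromSparkConf(SparkConf(), 4, disablePortRetry = true)) // new call sites opt in
  }
}
```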