GitHub user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/5575#discussion_r28740458
--- Diff: core/src/test/scala/org/apache/spark/ExternalShuffleServiceSuite.scala ---
@@ -43,6 +43,9 @@ class ExternalShuffleServiceSuite extends ShuffleSuite with BeforeAndAfterAll {
     conf.set("spark.shuffle.manager", "sort")
     conf.set("spark.shuffle.service.enabled", "true")
     conf.set("spark.shuffle.service.port", server.getPort.toString)
+
+    // local-cluster mode starts a Worker which would start its own shuffle service without this:
+    conf.set("spark.worker.shouldHostShuffleServiceIfEnabled", "false")
--- End diff ---
Do we actually ever want to start an external shuffle service in a local
cluster? If not, I think it makes more sense to just set
`spark.shuffle.service.enabled` to false in `LocalSparkCluster` (we already do
this for the REST submission server for the Master).
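
For reference, a minimal sketch of what that alternative might look like inside
`LocalSparkCluster.start()`, assuming it sits next to where the Master's REST
server is already disabled (the exact placement and surrounding code here are
assumptions, not code from this PR):

```scala
// Sketch only: force the external shuffle service off for local-cluster mode,
// mirroring how the Master's REST submission server is already disabled there.
val _conf = conf.clone()
  .setIfMissing("spark.master.rest.enabled", "false")   // existing behavior referenced above
  .set("spark.shuffle.service.enabled", "false")        // proposed: Workers never host a shuffle service here
```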