Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16426#discussion_r94326423
--- Diff: core/src/test/scala/org/apache/spark/SparkContextSuite.scala ---
@@ -455,16 +455,14 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext {
   test("register and deregister Spark listener from SparkContext") {
     sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
-    try {
-      val sparkListener1 = new SparkListener { }
-      val sparkListener2 = new SparkListener { }
-      sc.addSparkListener(sparkListener1)
-      sc.addSparkListener(sparkListener2)
-      assert(sc.listenerBus.listeners.contains(sparkListener1))
-      assert(sc.listenerBus.listeners.contains(sparkListener2))
-      sc.removeSparkListener(sparkListener1)
-      assert(!sc.listenerBus.listeners.contains(sparkListener1))
-      assert(sc.listenerBus.listeners.contains(sparkListener2))
-    }
+    val sparkListener1 = new SparkListener { }
--- End diff ---
Actually, I'm the one that's still confused here. The test harness does not
make a new `sc` object by itself. So all of these tests are fine. They make a
`SparkContext`, and the framework stops it. There is no problem.
It's OK to remove this pointless try block while you're here, sure.
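
For readers wondering where that cleanup happens: the suite mixes in `LocalSparkContext` (visible in the diff header), and a mixin of that kind stops whatever `sc` the test assigned once the test finishes. A minimal sketch of the pattern follows; names and details here are illustrative, not the exact Spark source.

```scala
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterEach, Suite}

// Sketch of a LocalSparkContext-style mixin: the test body assigns `sc`,
// and afterEach() stops it, so the test itself needs no try/finally.
trait LocalSparkContextSketch extends BeforeAndAfterEach { self: Suite =>
  @transient var sc: SparkContext = _

  override def afterEach(): Unit = {
    try {
      if (sc != null) {
        sc.stop()  // framework-driven cleanup, not the test's responsibility
      }
      sc = null
    } finally {
      super.afterEach()
    }
  }
}
```

With that in place, the try wrapper removed in the diff above adds nothing (it has no catch or finally), which is why dropping it and re-indenting the body is safe.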