Github user ksakellis commented on a diff in the pull request:
https://github.com/apache/spark/pull/3711#discussion_r22680915
--- Diff: core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala ---
@@ -160,7 +160,7 @@ class EventLoggingListenerSuite extends FunSuite with BeforeAndAfter with Logging
*/
private def testApplicationEventLogging(compressionCodec: Option[String]
= None) {
val conf = getLoggingConf(testDirPath, compressionCodec)
- val sc = new SparkContext("local", "test", conf)
+ val sc = new SparkContext("local-cluster[2,2,512]", "test", conf)
--- End diff --
Yes, we could, but then the test would not really be testing much. The point of
this test is to verify that we receive the onExecutorAdded event when an
executor is added to the cluster. Modifying LocalBackend to publish these
events would (in my opinion) greatly reduce the usefulness of the test: we'd
only be testing some of the SparkListener plumbing, not the actual creation of
the events.
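To illustrate the distinction, here is a minimal sketch of the kind of listener-based check this test relies on. The collector class below is illustrative and not part of the PR; it assumes Spark's `SparkListener`/`SparkListenerExecutorAdded` API and a test harness that waits for the listener bus to drain:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

import scala.collection.mutable.ArrayBuffer

// Hypothetical helper for illustration: records the id of every
// executor the listener bus reports as added.
class ExecutorAddedCollector extends SparkListener {
  val executorIds = new ArrayBuffer[String]()
  override def onExecutorAdded(event: SparkListenerExecutorAdded): Unit =
    executorIds += event.executorId
}

// "local-cluster[2,2,512]" launches 2 real executor JVMs (2 cores,
// 512 MB each), so the SparkListenerExecutorAdded events come from
// genuine executor registration rather than being synthesized by the
// local backend.
val sc = new SparkContext("local-cluster[2,2,512]", "test", new SparkConf())
val collector = new ExecutorAddedCollector
sc.addSparkListener(collector)
// ... run a job, wait for the listener bus to drain, then assert that
// collector.executorIds has one entry per launched executor ...
sc.stop()
```

With plain `"local"` mode there is no executor registration at all, so a test like this could only pass if the backend fabricated the events, which is exactly the plumbing-only scenario the comment argues against.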
---