jiangxb1987 commented on a change in pull request #29228:
URL: https://github.com/apache/spark/pull/29228#discussion_r467539295
##########
File path:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala
##########
@@ -295,7 +298,20 @@ class DAGSchedulerSuite extends SparkFunSuite with LocalSparkContext with TimeLi
override def beforeEach(): Unit = {
super.beforeEach()
- init(new SparkConf())
+ }
+
+ override protected def test(testName: String, testTags: Tag*)(testFun: => Any)
+ (implicit pos: Position): Unit = {
+ testWithSparkConf(testName, testTags: _*)()(testFun)(pos)
+ }
+
+ private def testWithSparkConf(testName: String, testTags: Tag*)
Review comment:
If I understand correctly, you are doing this hack because you need to modify
the SparkConf within `beforeEach` (between `super.beforeEach()` and `init()`).
In other words, you wouldn't need to call `testWithSparkConf` instead of `test`
if there were no extra initialization at the `beforeEach()` stage. So this
change is only useful for the test suites you listed, right?
A more straightforward idea might be to add a `withConfig(pairs:
(String, String)*)` method that creates a new SparkConf with the specified
configuration values. It wouldn't simplify `DAGSchedulerSuite` that much,
because you still need to stop the SparkContext first and then call `init()`
with your own SparkConf, but at least it's no worse than the current approach,
and it's easier to understand and to reuse.
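
A minimal sketch of the `withConfig(pairs: (String, String)*)` idea suggested
above. The `Conf` class here is a self-contained stand-in for Spark's
`SparkConf` (so the snippet runs without Spark on the classpath); the helper
name and shape follow the reviewer's suggestion, and everything else is
illustrative only:

```scala
import scala.collection.mutable

// Stand-in for SparkConf: a mutable key-value store of config settings.
class Conf(defaults: Map[String, String] = Map.empty) {
  private val settings = mutable.Map[String, String]() ++= defaults
  def set(key: String, value: String): Conf = { settings(key) = value; this }
  def get(key: String): Option[String] = settings.get(key)
}

object WithConfigSketch {
  // Build a fresh Conf carrying only the given overrides, mirroring the
  // proposed `withConfig(pairs: (String, String)*)` helper.
  def withConfig(pairs: (String, String)*): Conf = {
    val conf = new Conf()
    pairs.foreach { case (k, v) => conf.set(k, v) }
    conf
  }

  def main(args: Array[String]): Unit = {
    // A test suite would stop the running SparkContext, then re-init
    // with the conf returned here; this just shows the conf construction.
    val conf = withConfig("spark.scheduler.mode" -> "FAIR")
    println(conf.get("spark.scheduler.mode").getOrElse("unset"))
  }
}
```

In a real suite, the caller would still need to stop the existing SparkContext
and call `init()` with the returned conf, as the comment notes.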
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]